Market Voice 19 May 2025 - 2 min read

Why failing media experiments can be the best thing you do this year

By Tom Sheppard - General Manager, Data & Technology, Atomic 212° | Partner Content

If you’ve sat in a marketing meeting lately, chances are you’ve heard someone drop the word “experimentation.” Maybe it was paired with “incrementality,” “test and learn,” or the ever-trendy “triangulation.” The truth is, with traditional media measurement wobbling under the weight of privacy changes and platform walled gardens, experimentation has quickly gone from nice-to-have to must-do. But while running experiments sounds easy on a slide, the real work, and the real value, come in what you do when they don’t go to plan. Spoiler: that might actually be when you learn the most.

Why experiments are suddenly everywhere

There’s a good reason experiments are back in vogue.

Media mix modelling (MMM) needs years of clean spend data. Attribution is patchy. Brand tracking takes time. And navigating consent requirements is a minefield. In contrast, a well-structured experiment lets you isolate what actually drove a change without needing perfect historical data.

Platforms like Google and Meta have made it easier than ever, offering free tools to set up geo tests and other experiments. It’s now possible to run robust tests quickly and cheaply.

More importantly, experiments help answer the most valuable question in marketing: what would’ve happened if we’d done nothing?

That’s the heart of incrementality. And for teams still getting comfortable with the concept, experiments are the most straightforward place to start.
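The counterfactual framing above can be made concrete with simple arithmetic. Here is a minimal sketch of reading out a geo holdout test; all figures are invented for illustration, not drawn from any real campaign:

```python
# Hypothetical geo holdout: one region sees the campaign (test),
# a matched region does not (control). All numbers are invented.

test_conversions = 1_150      # conversions in the exposed geo
control_conversions = 1_000   # conversions in the matched holdout geo
media_spend = 20_000.0        # campaign cost in the test geo
value_per_conversion = 50.0   # assumed average conversion value

# Incrementality: what happened, minus what would've happened anyway.
incremental_conversions = test_conversions - control_conversions
incremental_value = incremental_conversions * value_per_conversion
iroas = incremental_value / media_spend  # incremental return on ad spend

print(incremental_conversions)  # 150
print(iroas)                    # 0.375
```

The control geo stands in for “doing nothing,” which is exactly the comparison attribution tools struggle to give you.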

But let’s not pretend it’s always smooth sailing

Experiments might sound clean in theory, but real-world execution gets messy fast.

Sometimes your test doesn’t show a statistically significant lift – but not because the media didn’t work. The test may simply have been too small, or too short, to detect the effect.
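To see how “no significant lift” can coexist with real impact, here is a minimal two-proportion z-test sketch. This is a generic textbook test, not the method any particular platform tool uses, and all conversion counts are invented:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# A real 10% relative lift (2.2% vs 2.0% conversion rate) that a
# sample of 5,000 per cell is too small to detect:
z, p = two_proportion_z(110, 5_000, 100, 5_000)
print(round(p, 3))  # well above 0.05: "no significant lift"
```

A pre-test power calculation tells you how big the cells need to be before you can trust a null result, which is part of the planning work mentioned below.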

There’s also the time factor. Setting up a clean experiment takes planning. And while they’re quicker than MMM, they’re still not instant. Then there’s the cultural challenge: when a test “fails” or doesn’t “go as intended,” it can feel like a step backwards, especially in performance-hungry environments.

The real value is in what didn’t work

When a test validates your hypothesis, it gives you confidence to scale. The case study writes itself.

But when it doesn’t? That’s where the gold is.

Failed experiments force teams to pause and ask better questions. Why didn’t that channel drive uplift? Was the creative wrong? Did we misread the audience? Did we measure the right thing? What did we get wrong?

This kind of critical thinking is where true growth lives. It’s uncomfortable, sure. But it’s also what separates surface-level marketing from teams that genuinely get sharper over time.

In fact, the most useful conversations I’ve had around media effectiveness came after something didn’t perform as expected. That’s when people are engaged. That’s when assumptions get challenged. That’s when the learning sticks. And when you do strike gold, the payoff outweighs the cost of every failed test along the way, which is why experimentation rewards the resilient.

So should you test everything?

No, and that’s an important distinction.

Testing without a clear purpose is just noise. Every experiment should be grounded in a real business question, with a clear hypothesis behind it. What are we trying to learn? What decision will this help us make? What’s the opportunity cost if we do nothing?

It’s also worth knowing when an experiment isn’t the right tool. If you need quick directional feedback, you might get there faster with surveys or qualitative research. Experiments are powerful, but they’re not always the fastest route to insight.

Final thoughts

In a world where the clean-cut ROI number is harder to pin down than ever, structured experimentation is one of the few ways to get clear, directional truth. And while it’s tempting to chase only the success stories, the real wins often come from what doesn’t work.

You can’t optimise what you don’t question. And you won’t grow if you only play it safe.

So fail, but fail with purpose. Then learn, refine, and go again. That’s how you build media strategies that last.
