Industry Contributor, 16 Sep 2022

Bad science will derail good science and no one will get to Rome: Clients don't trust awards data – Byron Sharp isn't wrong

By Waqas Tahir - Head of Brand Strategy, Loud

We should not accept an analysis based on advertising awards as good science just because it is directionally similar to sound scientific analysis, says Loud's Waqas Tahir, as the debate sparked by Byron Sharp's skewering of the data behind Binet & Field's 60:40 rule rumbles on. Tahir used to include charts from The Long and the Short of It in presentations, until a client told him to take them out because the data was not credible enough to base decisions on.

Peeing in the pool

You might remember watching Al Gore’s An Inconvenient Truth and being bowled over by its seemingly undeniable evidence and logic. It was released in 2006 to wide acclaim, but it is now widely understood that its impact rested largely on bad extrapolations and excitable line graphs.

The documentary has tainted climate science and, at least in the eyes of climate sceptics, it stands as evidence that environmentalists will champion bad science when it serves their cause.

It doesn’t matter that there’s plenty of good science making the same points more cautiously: once someone has peed in the pool, no one is going near it, even though it’s mainly clean water.

To be clear, the documentary isn’t the sole reason for climate scepticism, or even a major contributor to it, but it has certainly become the poster child for bad environmental science, despite being directionally similar to good environmental science.

Myth versus reality

Similarly, in advertising, should we accept an analysis based on a biased dataset of awards and success stories because it is directionally similar to sound scientific analysis? More generally, should we accept the work of social and behavioural psychology because it feels directionally true, even when its findings suffer from a replication crisis? Or, for that matter, should we accept any other work that comes out of the Ehrenberg-Bass Institute just because it is from an institution that has produced sound work in the past?

The short answer is no. The people and their intent are of no significance to the scientific process. There are no allies in science, and it’s a treacherous business. If I can’t show up with good evidence from legitimate studies vetted by peers in the scientific community, then I have no right to demand that my opinions be branded as scientific truths.

The emphasis on the scientific community is particularly crucial. We are not the scientific community. We are advertising practitioners who have minimal expertise in determining scientific truths and who are, in fact, responsible for spawning some of the ugliest myths in marketing.

Don’t get me wrong. I love my work as a brand strategist, and I’ve dealt with a fair bit of data and created many charts over the years, but none have ever been interrogated by a scientist or even grounded in particularly scientific principles. Most of the industry’s first principles have been an endless list of bad heuristics: ‘winning hearts and minds’, ‘turning the category on its head’, ‘kickstarting a movement’ and countless other hollow ideas from an awards-obsessed industry that would champion any ideal as long as it helped win some metal.

Byron vs Binet & Field

As a strategist I’ve always had a desperate need for sound principles based on good science, but for a very long time I had very little to work with. The work of David Ogilvy, with its focus on evidence-based rules, was exceptional, but it was from a different time. Randall Rothenberg’s Where the Suckers Moon took an objective view, alerting me to the fallacies rife in our industry’s discourse. There were many other good texts, but there never was a coherent set of scientific truths until How Brands Grow came along. It had the necessary rigour and the parched, dry style characteristic of substantial work, and I’ve held on to it since I first read it. I’ve asked every marketer I’ve met to read it at least once, and its key findings are an essential component of every presentation I write.

Do I ever mix it up with the findings from The Long and the Short of It? I used to. Particularly the famous long-and-short chart, but I had to stop when a client asked me to exclude it from my presentation because the pattern was too neat to convince anyone. I was also constantly reminded of how notorious awards data is when my audience lost interest upon discovering the source of those charts. It tainted and derailed the whole presentation, and it didn’t matter that the conclusions were consistent with good science.

What I’ve experienced in my presentations is a microcosm of what happens to good science when it’s mixed with questionable data. The Long and the Short of It might be right in its conclusions, but we can’t champion it just because it’s consistent with Sharp’s work. That’s not what replicability means. How you get to those conclusions is the entire point of science.

