Cannes 2016: Does copy testing kill – or create – great work?


By Brent Choi

Great work in advertising is as subjective as which movie should win the Oscar for Best Picture (Birdman? Really?).

For the purpose of this article (and it being Cannes week), I’m equating great work to winning a Lion at the festival. As we approach the final night, when the Film Lions are announced, is there any chance a few Canadian spots will win? I hope so. Any that went through copy testing? Let’s look back over the past couple of years to see if we can get any hints.

Last year, Canada picked up five Film Lions for work for Always, Moms Demand Action for Gun Sense in America, Société de l’assurance automobile du Québec, PFLAG and Skittles.

In 2014, there were four: John St.’s “Exfeariential” film, Krispy Kernels, longer-form work for Volkswagen and WestJet’s “Christmas Miracle.”

Guess how many were pre-tested? Three? Five? After a bit of digging, I discovered that none of those winning spots went through traditional copy testing, and only one went through any type of pre-testing, and it was for virality. That said, some were longer format, a few were PSAs/pro bono/self-promo and others were true brand ads that rarely get tested (more on that later).

So this brings up two important questions.

First, where are the great ads that went through copy testing? I think we all know the answer to this. Copy testing tends to push work into best-practices territory, which fundamentally conflicts with Cannes’ desire for convention-defying work.

Second, why aren’t we testing our “award-winning” work? This is the more interesting discussion. Do we believe research can make “award-winning” ads better? And perhaps more importantly, do we believe “award-winning” ads will drive sales results?

There is a ton of research (yes, research – I’m such a hypocrite) – including the IPA’s “The Long and the Short of It,” which analyzed some 1,000 campaigns over 30 years – showing a clear correlation: award-winning work is up to 10 times more effective than ads that don’t win awards.

Here’s the catch and the crux. Not every attempt at great creative ends up winning awards. In fact, most don’t. And following best practices instead should net you some positive results, or at least defensible ones.

So the safe thing to do is to, well, play it safe. But is it the best thing to do?

A Millward Brown article (“Advertising: How to maximize the long term effects”) says that “Advertisers tend to focus on the short term effects of their ad campaign, since they are easiest to measure.” However, as the article’s title points out, focusing on advertising that delivers long term is key:  “…it is essential for marketers to understand the ways in which advertising delivers long-term sales growth … on average (as high as five times) greater than the short-term return.”

And yes, I am conveniently grouping long-term results with award-winning/brand preference vs. short-term product promotion.

So let me get this straight. Millward Brown has pre-testing measures available to identify long-term results but marketers prefer to only use the short-term ones? And they also have evidence that long-term has much greater return?

Wait. What? Have I been upset with the wrong people all these years?

In April, Hillary Clinton spoke about big corporations not investing in ideas that will benefit them five years out because shareholders would be up in arms if it negatively impacted their share price that quarter. My reaction was, “That’s such typical corporate America BS.”

I guess, by association, I am guilty of that very same thing.

In a strategy article in 2013, Leo Burnett Toronto’s CEO and CCO Judy John – a Film judge that year – said, “In Cannes it is called Film, it’s not called TV commercials. It’s about entertaining and (John) Hegarty always says it is about persuasion first and then promotion. I find in Canada we are promoting and sometimes not even trying to persuade. We aren’t inviting the viewer to come in and be entertained.”

Most media plans now, for production efficiency, also run the TV spot as skippable pre-roll, which makes this saying gospel: “Don’t interrupt what I’m watching; be something I want to watch.”

Do we, should we, ask this question in testing: “Were you entertained by this?”

Going back to the IPA’s analysis, if we agree that award-winning work creates great results, how can we best test for work that will win?

That brings us to another point. We test with consumers, but award-winning work is “judged” by our industry’s top creatives. So wouldn’t the best testing be with 10 or so of the top creative directors? After all, they’re the ones on the jury, not consumers.

So, as we approach the final night of our industry’s most prestigious awards festival, perhaps the question we should be asking isn’t “Should we test our work?” but rather “How should we be testing it and with whom?”

Brent Choi is chief creative officer of J. Walter Thompson Canada and New York.

Featured image via the Cannes Lions International Festival of Creativity Facebook page.