![](https://marketingweek.imgix.net/content/uploads/2024/04/23154850/testing.jpg?auto=compress,format&q=60&w=736&h=406)
Sky’s team was already thinking about Christmas in the summer of last year. Specifically, it was thinking about the Christmas ads it would run for its subscription channels during the crucial festive season. Top of the list was Sky Cinema, its premium subscription movie channel. Six different Christmas creative ideas were on the table. Each had internal sponsors and its own merits. Each took a different path.
In the past, such a line-up of options would have triggered a major political battle, with the different factions within the agency and client teams arguing for their favourite.
Sky’s team of marketers is better than that. All six ideas were pre-tested: each was turned into an animatic and scored by System1.
The winning proposal was not the flashiest or most complicated of the six, even though the team had expected to want something more impressive and eye-catching. But according to Ben Case, Sky’s managing director of consumer strategy, Sky was so satisfied with its score and the quality of the idea that it immediately discarded the other five options. Work on the Christmas 2023 ad started within days of the research.
Pre-testing advertising is a longstanding marketing practice that divides marketers. Roughly half the CMOs I’ve worked with will say it is unnecessary. They use research only to establish their strategy. Then they brief their agencies, trust those agencies, and depend on them for the creative leap. Research cycles, they argue, would compromise or shorten that leap. To them, it makes no sense to apply data so late in the creative process to something as volatile as advertising.
The other half sit at the exact opposite end of the spectrum. They are market-oriented and research-driven. Why not test the largest marketing investment you make each year? Why would you think advertising is any different? If consumer research helps with product and pricing development, why not advertising too?
To this camp, rejecting pre-testing betrays a complete lack of market knowledge.
Then there is, of course, the agency perspective, which is, like agencies themselves, a bit outdated, a bit precious, and prone to missing the big picture. Research, they argue, reduces the big idea. It blunts disruptive power. In general, it puts restrictions on work that is best left alone. This was beautifully illustrated in this week’s episode of On Strategy, in which host Fergus O’Carroll interviewed Sofia Colucci, CMO of Molson Coors. The two agreed on the dangers of following predetermined approaches and pre-testing at the cost of being distinctive.
In his introduction to the show, O’Carroll warns that we must be careful not to buy into this: our job is to grow the business, not beat the test. “If you follow a pattern or formula, it will lead to lack of creativity.”
Until a few short years ago, I’d have agreed with O’Carroll: do the research up front, then trust your agencies with the final creative product. But pre-testing is no longer what it was. It has changed, and it’s time to accept that.
And the best move a marketer can make is to change with it.
Evolution of testing
So what exactly has changed? Four things. First, the thing that gets tested. Second, the process of pre-testing itself. Third, pre-testing’s predictive ability. Fourth, the time it all takes.
Everyone agrees that old-school pre-testing was a step away from pointless. A storyboard would be drawn up by a guy in flares with a handlebar moustache. Someone in a tank-top smoking a cigar would then act it out in front of a handful of target customers in a room. Pre-testing was rendered nearly pointless because the thing being tested bore little resemblance to the final ad.
Things have changed. It’s now possible to quickly create a scene-for-scene animatic without spending a lot of money. It isn’t the finished ad, but it is vastly closer to the finished article than any storyboard ever was.
Watch the Carlsberg animatic for an example.
Then compare it with the finished advertisement:
In the old days, once the storyboard had been created, traditional pre-testing would assess it using what could only be described as a poor man’s focus group. After watching the badly acted script, consumers were asked whether they liked the potential ad. What could be improved? Would it work? It was basic to the point of pointless. An agency could use its favourite in-house tester and steer the results of pre-testing in any direction it wanted. Most current criticisms of pre-testing are outdated, referring back to this earlier era when testing really was crap.
Today’s pre-testing methods rely on pre-selected representative panels, digitally delivered animatics and far more accurate metrics. System1, which has established itself as the leader in pre-testing within a remarkably short time, offers consistency and simplicity to the large number of clients using its services to test and refine their advertising.
Any draft ad can now be pre-tested within hours, not days. The client receives predicted scores for the ad: a Spike score, which predicts the campaign’s short-term impact on sales, and a Star score, which predicts its longer-term impact on brand building.
The small cost of pre-testing is a bargain when set against the amount of money spent on the advertising itself.
The billion-dollar question, of course, is whether these assessments can accurately predict a campaign’s impact. Many marketers bristle when System1 and Kantar publish early assessments of Christmas advertisements or upcoming Super Bowl ads. Don’t you have to wait, they ask, until the ads actually hit the market? Can these tests really predict the future?
Sky asked exactly that question when it considered using System1 to pre-test its advertising. Sky is one of Britain’s most successful advertisers and also one of its smartest. It was attracted by the promise that System1 could tell it which ads would work and which would not. But Sky’s data-driven team worried that System1’s seemingly simplistic approach might leave room for error. Team members were sceptical that System1 could really pre-test ads with this kind of predictive power.
So Sky did what every good marketer would do: it tested the testers. Sky already possessed advanced econometric data on past campaigns stretching back more than four years, which it had used to assess the effectiveness of each of its ads and their impact on the business. System1 was asked to go through the archive and score the short- and long-term impact of all of Sky’s ads. Sky then compared System1’s predictions with what each ad had actually delivered.
The results were remarkable. System1’s predictions were strikingly accurate for both short-term sales impact and long-term brand building. When Sky multiplied each System1 score by the media budget it had spent on that advertisement, the predictions correlated almost perfectly with its econometric data.
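The mechanics of the validation exercise described above are simple enough to sketch in a few lines of Python. All figures below are invented for illustration (the real scores, budgets and econometric estimates are Sky’s and System1’s); the point is only the arithmetic: multiply each predicted score by the media spend behind the ad, then correlate the result with the measured impact.

```python
# Hypothetical sketch of "testing the testers". All data is made up.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical pre-test scores for six past ads (higher = more effective)
scores = [4.2, 2.1, 3.5, 1.8, 4.8, 2.9]
# Hypothetical media budget behind each ad, in GBP millions
spend = [5.0, 3.0, 6.5, 2.0, 7.0, 4.0]
# Hypothetical econometric estimate of each ad's actual business impact
measured_effect = [21.5, 6.0, 23.1, 3.9, 33.0, 11.2]

# The key step: scale each prediction by the media weight behind the ad
predicted = [sc * sp for sc, sp in zip(scores, spend)]

r = pearson(predicted, measured_effect)
print(round(r, 3))
```

With budget factored in, the correlation between prediction and measured effect approaches 1.0; on raw scores alone it would be weaker, since a great ad with tiny media spend still moves little product.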
Sky now pre-tests 400 ads a year. The process produces better, more consistent ads. It also reduces the painful internal tension over which execution is best, speeds everything up, and lets the company concentrate its attention on the winner, as with the Christmas advert. As Sky’s Ben Case puts it: “Spending thousands of dollars on pre-testing to ensure that the millions we spend on media works harder, pays for itself multiple times.”
Pre-testing is not only about picking a winner. It is also about improving that winner and increasing its effectiveness. Sky, like many System1 clients, uses the tool to enhance its ads during development: which distinctive brand assets to feature, what audio to use, when to cut away, when to let the shot hang.
What is advertising for?
Every agency creative reading this is wringing their hands in rage. You can bet they are yelling about disruption, creativity, the need to stand out, and the fallacy of testing. So be it. Advertising will never be free of a precious, artistic perspective that interferes with its primary purpose: to benefit the business. Yes, occasionally a “black swan” ad performs poorly in testing and then proves incredibly successful.
But for every black swan there are 999 ads that tested badly for good reason, and many more that were improved because pre-testing caught their weaknesses. Clients are not in business to take creative risks for creativity’s sake. At least, not those who don’t work at an advertising agency.
Remember that ads featured in Campaign’s ‘Turkey of the Week’ usually outperform the average ad, while winners of Cannes’s various awards for creative excellence usually perform worse than the average. It’s not that pre-testing is ineffective (it isn’t); it’s that agencies claim a built-in radar for creative success they simply do not have. Like everyone else in marketing, they need to understand that they see the market from the producer’s side, not the consumer’s, and that this paradigmatic divide can only be closed through research.
Jon Evans of System1 once told me a horror story. A client was about to make a major change to its brand’s campaign. Evans came to the meeting with data on both the existing campaign and the proposed new one, and showed the team that the old campaign was superior to the one they planned to launch. The client eventually accepted his argument and stuck with the existing work. And yet neither client nor agency was happy, despite the savings and increased effectiveness.
Why? Because everyone wanted to make a new ad. To shoot it. To get excited about it. To launch it. To create a good film. The point had been lost: the job is not to create ads, but to drive brand and sales.
Pre-testing has evolved. It is now a no-brainer for marketers. Do it. Your agencies might push back. But it’s not their money. Perhaps if it were, they would think differently.