The Difference Between a Test and an Experiment
CREATIVE: TEST VS EXPERIMENT
affinity.ad
The difference between 시험 (test) and 실험 (experiment): the distinction between these two words hinges on whether an 'action' is involved. 시험 (test, 試驗) mainly attaches before nouns denoting an action, when trying something out on a trial basis (test flight, test drive). 실험 (experiment, 實驗) attaches before nouns that do not denote an action, when investigating a phenomenon or trying out a new method (laboratory animal, experimental novel). Take the RFA example of North Korea's nuclear test (launch) versus North Korea's (nuclear) experiment: here 핵 (nuclear) simply names the thing, not an action, while 발사 (launch) denotes an action, so 시험 (test) is the word that should precede it. Ki Cheol Hwang, conpaper editor, curator
edited by kcontents
Here we go again. Yes, thanks to the game changer that is digital, all that you’ve known and loved as a marketer is morphing before your eyes. The way you research creative isn’t standing still either.
Smart creative agencies are pushing advertisers to ‘experiment’ instead of ‘test’.
If you don’t think there’s a difference, read on.
Most of you will be fairly familiar with the tried and trusted creative test.
For decades the test approach has made many a decision for advertisers. Take a baseline group of Australians gathered in a room, with cameras, brand managers and agency folk behind a one-way mirror. Show them your creative mock-ups and carefully scrutinise their reactions. Based on how those reactions are interpreted, the winning idea is selected, your multi-channel campaign is pushed out, and fingers are crossed that it works.
Aside from fundamentally disagreeing with this methodology, we believe that if you're forced to test, testing strategy, not concepts, yields better insight. Any agency could reel off a bunch of potentially great ideas effectively killed or diluted to blandness by this delegation of choice. If you haven't read Alan Hedges' Testing to Destruction, read it: a seminal and prophetic critique of the misuse of research in advertising. A post written 11 years ago by Andy Nairn summarises its thesis elegantly.
So, 41 years on from Hedges' predictions, we have many more options, and traditional creative testing is far less relevant to today's advertiser.
In a world of smaller budgets, fit-for-purpose low-fi creative can be deployed more strategically and with more agility through more targeted media channels. So much so that going to the expense of testing your ad the old-fashioned way might well cost more than you spend on getting your message to market. Not only that, by experimenting you're going to get a lot more value, and faster too.
If you had two strong contenders you wanted to test, rather than choosing one over the other you could experiment with both instead. Using like-for-like digital executions you can test real creative performance in situ, in the real world, instead of in vitro like a lab test. For example, one might give you a 30% higher engagement rate, while the other delivers a 20% better cost per acquisition.
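To make those two metrics concrete, here is a minimal Python sketch of how engagement rate and cost per acquisition fall out of raw campaign counts. The figures are hypothetical, picked only to reproduce the 30% and 20% gaps above.

```python
# Hypothetical results for two like-for-like executions; every figure
# below is illustrative, not a real benchmark.
creative_a = {"impressions": 100_000, "engagements": 3_900, "conversions": 120, "spend": 6_000.0}
creative_b = {"impressions": 100_000, "engagements": 3_000, "conversions": 150, "spend": 6_000.0}

def engagement_rate(c):
    return c["engagements"] / c["impressions"]

def cost_per_acquisition(c):
    return c["spend"] / c["conversions"]

for name, c in [("A", creative_a), ("B", creative_b)]:
    print(f"Creative {name}: engagement {engagement_rate(c):.1%}, "
          f"CPA ${cost_per_acquisition(c):.2f}")
```

Run as written, creative A shows a 3.9% engagement rate against B's 3.0% (30% higher), while B converts at $40 per acquisition against A's $50 (20% better): two different winners, depending on which behaviour you value.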
By comparison, your researcher simply can't give you hard metrics on how your message will impact actual behaviour. So for the investment in your panels or groups, you need to question what actionable insight you have actually received.
SO WHAT’S THE SOLUTION?
Experiment. Get both ideas out there and see what works.
And here’s the kicker. Your experiment will be growing your brand and business whilst you gain insight.
Regardless of the size of your campaign, money spent on testing creative can’t earn you a single sale. Experimenting in-market will.
Even if one of your messages performs worse than the other, it's still out there getting you clicks, turning your investment in experimental market research into a measurable ROI, with performance insights as a healthy dividend.
Better still, instead of measuring stated intended behaviour, you’re measuring real-world actual behaviour. What’s more important? What the most dominant people in a group say they think about an ad, or what people actually do when they experience the message in context?
SO HOW DO YOU DO IT?
Regardless of whether you are running in traditional or digital channels, here are some tips for you to consider.
TEST & CONTROL
Dust off your 7th grade science textbook and follow the scientific method.
Split off three regions, use the first two to test your different creative executions, and the third as your control group. Minimise all variables except for the different creative itself. Run your activity for a week and monitor the response.
Depending on your product this should give you all the data you need to select the most effective messaging, as well as providing you with benchmarks to apply to the broader launch of your campaign.
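By way of illustration, here is a minimal Python sketch of reading the results of that regional split, assuming you have logged audience reached and responses per region for the week. The region names and all figures are hypothetical, and the significance check is a standard two-proportion z-test, one reasonable choice rather than anything prescribed here.

```python
from statistics import NormalDist

# Hypothetical week-one results; region names and figures are illustrative.
regions = {
    "region_a (creative A)": (50_000, 900),   # (audience reached, responses)
    "region_b (creative B)": (50_000, 1_150),
    "control (no activity)": (50_000, 400),
}

def two_proportion_z(n1, x1, n2, x2):
    """Two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = (pooled * (1 - pooled) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p1 - p2) / se
    return z, 2 * (1 - NormalDist().cdf(abs(z)))

ctrl_n, ctrl_x = regions["control (no activity)"]
ctrl_rate = ctrl_x / ctrl_n

for name, (n, x) in regions.items():
    if name.startswith("control"):
        continue  # the control region is the baseline, not a contender
    rate = x / n
    z, p = two_proportion_z(n, x, ctrl_n, ctrl_x)
    print(f"{name}: response {rate:.2%}, "
          f"lift vs control {rate - ctrl_rate:+.2%}, p={p:.4f}")
```

The lift over the control region, rather than the raw response rate, is the benchmark to carry into the broader launch, since it strips out whatever responses you would have received anyway.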
MULTIVARIATE TESTING
If digital channels are your core, you've got it even easier. You can split up your audience in-platform into as many groups as you like and run as many concepts as you have.
Got six great concepts? Great, run them all and optimise your campaigns to feature the best two.
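As a sketch of that optimisation step, assuming your platform reports impressions and conversions per concept, you might rank the six and keep the best two like this (concept names and all figures are made up):

```python
# Hypothetical in-platform results for six concepts run against equal
# audience splits; concept names and figures are illustrative.
results = {
    "concept_1": (20_000, 180),  # (impressions, conversions)
    "concept_2": (20_000, 240),
    "concept_3": (20_000, 150),
    "concept_4": (20_000, 300),
    "concept_5": (20_000, 210),
    "concept_6": (20_000, 165),
}

def conversion_rate(impressions, conversions):
    return conversions / impressions

# Rank all six by conversion rate, then optimise the campaign around
# the top two and pause the rest.
ranked = sorted(results, key=lambda k: conversion_rate(*results[k]), reverse=True)
print("Scale up:", ranked[:2])
print("Pause:", ranked[2:])
```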
SEQUENTIAL PATH TESTING
At the end of the day, very few people are kicked into action from seeing a message once.
In reality you might have a range of messages to reach them as they progress closer to making a purchase (or whatever your desired behavioural outcome is).
Whether you’re identifying where they sit in the path to purchase by the searches they make or the websites they visit, your messaging needs to be relevant and timely to get the response you want.
If you tailor messages to specific sites, you’re much more likely to build personalised connections. But the story you have to tell may need more nuance and be told over a longer period of time. This is when you should consider using a sequence of messages, which of course ups the ante in terms of complexity, but can still be the subject of experimentation.
Just as with multivariate testing of a single message, you can, and should, multivariate-test a series of sequences to find the most effective order of conversation between you and your customers.
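To illustrate, here is a minimal Python sketch assuming each ordering of a three-message sequence gets its own audience split and you log entries and conversions per sequence; the message names and all figures are hypothetical.

```python
from itertools import permutations

# Three messages mapped to stages of the path to purchase; names and
# all conversion figures below are hypothetical.
messages = ("awareness", "consideration", "offer")
all_orderings = list(permutations(messages))  # six candidate sequences

# Imagine each ordering served to its own audience split, logging
# (people who entered the sequence, people who converted).
observed = {
    ("awareness", "consideration", "offer"): (10_000, 320),
    ("awareness", "offer", "consideration"): (10_000, 240),
    ("consideration", "awareness", "offer"): (10_000, 280),
    ("consideration", "offer", "awareness"): (10_000, 150),
    ("offer", "awareness", "consideration"): (10_000, 180),
    ("offer", "consideration", "awareness"): (10_000, 130),
}
assert set(observed) == set(all_orderings)  # every ordering got a split

def conversion_rate(entered, converted):
    return converted / entered

# Rank the sequences by end-to-end conversion and surface the winner.
for seq, (n, x) in sorted(observed.items(),
                          key=lambda kv: conversion_rate(*kv[1]),
                          reverse=True):
    print(" -> ".join(seq), f"{conversion_rate(n, x):.2%}")
```

With more than three messages the number of orderings grows factorially, so in practice you would prune to the handful of sequences your path-to-purchase logic makes plausible before splitting your audience.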
SUMMING UP
As budgets come under more scrutiny, and with so much more control over targeting your media spend, investing in testing creative is an unnecessary cost with little chance of achieving ROI.
Save smart qualitative research for up-front strategy, language and other competitive market insights: not creative.
Regardless of what channels you advertise in, establishing in-market performance benchmarks is a much more valuable aid to creative decision making than feedback from a panel or focus group.
https://www.affinity.ad/blog/creative-test-vs-experiment
kcontents