Is A/B Testing Dead?
The last few years have seen a huge industry trend toward machine learning, with Google consistently developing its algorithms to work independently across a broader range of tasks in Google Ads. Do we really trust them to act in a way that's best for both campaign performance and a luxury brand image? In this article, we discuss Google's latest efforts to modernise A/B testing and, in particular, how manual ad copy testing could become a thing of the past.
We've always been fans of A/B tests when running ad campaigns online – it's how we create order from chaos when trying to achieve the best results for a budget. In the search for the most effective copy, the most relevant landing page, and the perfect targeting, simple tests need to be run along the way.
There is always something to try when running paid ad campaigns, but certain things must be taken into consideration. Examples include:
- Which element of my campaigns do I test first?
- How long do I run a test before drawing a conclusion?
- How do I trust the data, and know that unrelated factors aren't driving the difference in performance?
It is difficult to know the answers to these questions, so we follow some simple rules that reduce the risk of oversight or of being misled by the data. We might, for example, choose to test just one element of ad copy at a time. If the only difference between three ad copy variations running simultaneously against the same set of keywords is the description line, we can safely assume any significant differences in click-through rate are down to the effectiveness of this text.
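One way to decide whether a test has run long enough is a standard two-proportion z-test on click-through rate. The sketch below is illustrative only – the click and impression figures are made up, and real campaigns may need to account for seasonality and other factors the test ignores:

```python
from math import sqrt, erf

def ctr_significance(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test: is the CTR difference between two
    ad variations likely to be real, or just noise?"""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    # Pooled CTR under the null hypothesis of no real difference
    p = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(p * (1 - p) * (1 / imps_a + 1 / imps_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

# Hypothetical figures: variant B's 4% CTR vs variant A's 3%
p_a, p_b, p_value = ctr_significance(120, 4000, 160, 4000)
print(f"CTR A: {p_a:.1%}, CTR B: {p_b:.1%}, p-value: {p_value:.3f}")
```

If the p-value is below your chosen threshold (commonly 0.05), the CTR difference is unlikely to be chance and the test can be concluded; otherwise, keep it running.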
As marketers we like to maintain a certain level of control, though as the platforms we use become more sophisticated, we must decide whether to relinquish some of this control for the chance to achieve better results, faster. Below, we introduce two of Google's latest machine-learning techniques that may lead to the death of manual A/B testing as we know it.
Responsive Search Ads
The ad copy you choose can have a huge effect on performance, but a brand will always have considerations beyond this. Written copy is a representation of a brand, matching the tone of voice and character you might expect in-store staff to demonstrate. If the product is a diamond engagement ring, there's a chance that offering a promo alongside a hard call-to-action may win more clicks. For a fine jeweller, however, romantic messaging that attracts a high-net-worth customer to visit a physical store may be best for business.
We can make these assumptions, though in practice we can often be surprised at the results. With ads on Google currently, our best course of action is to test these against one another and let the data speak for itself. For the above example, our two ads might look as follows:
These ads may run against each other for a period, at which point we analyse which generated better results in line with a set goal. From that winner, we generate a new test, perhaps this time with a series of headlines.
Responsive Search Ads take this kind of copy testing a step further, allowing you to provide the copy elements you'd like to test in bulk and letting machine learning trial combinations and determine the winner without your intervention.
There's an incentive to try this, too – instead of our usual pair of 30-character headlines and single 80-character description line, Google will show a significantly larger ad made up of the copy you've provided. In total, three 30-character headlines can show simultaneously, as well as two 90-character description lines. This is almost double the copy space available to promote a service or product on the search results page:
To achieve this, up to 15 headlines and 4 description lines are given to the platform for testing, with Google trying different combinations of these and optimising on a continuous basis. There's some security here – you can choose to pin up to two headlines in the first, second or third position. You might wish to do this because you want the brand name to always be present, or because your industry requires you to state certain terms to comply with the law.
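Google doesn't publish exactly how it assembles these combinations, but a quick back-of-the-envelope calculation shows why testing them manually isn't realistic. Treating headline and description order as meaningful, the search space from the limits above looks roughly like this:

```python
from math import perm

headlines, descriptions = 15, 4          # assets you can supply
shown_headlines, shown_descriptions = 3, 2  # assets shown per ad

# Ordered arrangements of shown assets -- a rough upper bound,
# since Google's actual serving logic is not public.
combos = perm(headlines, shown_headlines) * perm(descriptions, shown_descriptions)
print(combos)  # tens of thousands of possible ads
```

Even this rough estimate yields over 30,000 possible ads from one set of assets – far more than any manual A/B schedule could ever work through.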
Sounds great, right? Take your time to consider the pros and cons here. On one hand is the potential to find top-performing copy in the shortest amount of time and with minimal effort. On the other is a lack of control over which elements of copy appear together, and to whom. Does this make sense for your brand message? We think it's worth a test.
One step further than this is Google's auto-enrolment of Ad Variations: new copy variations, written entirely through machine learning and automatically put live in your account. Don't panic! This option can be turned off – though it is on by default.
How comfortable do you feel with a piece of copy being shown to customers without any human input in the writing process? Most luxury brands will say this is too risky, and could easily turn into a brand disaster with just one rogue screenshot on Twitter.
Consider, though, that every element within this new piece of copy has been hand-selected by a sophisticated algorithm, based on performance. It may even surface some smart wording or phrases that never crossed your mind, as if a copywriter with all the data in the world had gotten hold of your AdWords account. These variations should act to improve the quality score of your ads and generate more clicks at a lower cost – what could be better?
In this instance, you must weigh up your priorities – do you value short-term performance over long-term brand perception? Is this suited to a brand of your size, or do you have the capacity to keep up with copy refreshes? The jury is still out on this kind of algorithm, and it remains to be seen if Ad Variations can capture the essence of a brand without using the entire character limit on crowd-pleasers.
What should I do?
The trend in the industry is clear: automation is here to stay, and it's becoming more and more prevalent. What is also clear, however, is that these algorithms are not faultless. A sharp eye is necessary when leaving Google to run these kinds of tests for you, to ensure nothing is going on that a brand manager wouldn't approve of.
We needn't assume that automation can outthink a human in every case, but we should understand that algorithms have become necessary to keep up with the pace and precision of testing in 2018. Google can begin to optimise the above A/B tests from launch, while a manual approach may require many days or weeks of data before we're confident in one change. By that stage, Google may well have shortlisted a handful of ad variations generating twice the CTR of your original batch.
The key to using automated A/B tests to your advantage lies in one point: provide the best possible content for the algorithm to work with from the start. If every piece of copy submitted is on brand, and keywords and landing pages are relevant and up-to-date, there's no reason for Google to do anything but help you. Try some of these methods out, monitor them frequently, and see how it works out for you. Just don't wait too long, or your competitors will beat you to a 180-character description before you know it!
If you need advice with testing your performance ads, please let us know. We would love to help!