At my first post-college job, I was lucky to be surrounded by marketers who tested everything: emails, landing pages, site design, and more. These were true data-driven marketers. For them, gone were the days of relying on gut feelings to create and optimize marketing campaigns. I wanted to be just like them and let data do the talking.
What is A/B testing? (aka split testing)
Smashing Magazine summed this up well in their Ultimate Guide to A/B Testing:
“You have two versions of an element (A and B) and a metric that defines success. To determine which version is better, you subject both versions to experimentation simultaneously. In the end, you measure which version was more successful and select that version for real-world use.”
If you are a visual person, here is a lovely sketch from my notebook. It shows what a basic subject line test might look like. Please forgive my (lack of) drawing skills:
Also, please note that A/B testing is not the same thing as multivariate testing. A/B testing involves changing one variable at a time, while multivariate testing entails changing several variables at once. A multivariate test can be great too, but you need far more traffic for the results to be statistically significant. We’ll keep things simple for now.
Many marketing campaigns are based on a hunch
Should our call-to-action button be orange or green? What is the best day of the week to send a newsletter? Is it a good idea to use “FREE” in the subject line? Should the subject line describe the contents of the email, or be fun and quirky? Should the email sender be an individual team member, or a generic company email? There are endless opportunities for testing, so here are several marketing tests and results to inspire you.
When it comes to marketing decisions, it’s best to rely on data, not assumptions. You may have a hunch that the prettiest landing page will get the most leads (and dollars) but you could very well be wrong. Have multiple variations in mind for a campaign? (ahem: of course you do). Now, A/B test it!
How to get started with A/B testing
First, check out this A/B Testing for Marketing Optimization Guide by HubSpot. Their crew is one of the best when it comes to content marketing and they put out helpful learning tools with actionable information.
Once you are ready to get started, email subject line testing is a good way to get your feet wet. It sure was in my case, at least. There are various subject line testing scenarios, but this was the extent of my split tests for the first month or so:
- Finalize copy for an email.
- Come up with three different subject lines (every other variable stays exactly the same).
- Send the three variations to a sample of 30% of subscribers: 10% gets subject A, 10% gets subject B, and 10% gets subject C.
- After 16 hours, pick a winner based on whatever defines “success.” This could be open rate, click-through rate, conversions, or dollars. It depends on what you are trying to achieve with that specific email.
- Winning email goes out to the remaining 70% of subscribers.
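The steps above can be sketched in code. The 30% sample and three variants come straight from my process; the function names and the shape of the `results` dictionary are just illustrative, not from any particular email tool:

```python
import random

def split_for_test(subscribers, test_fraction=0.30, n_variants=3):
    """Split a subscriber list for a subject line test.

    Returns (groups, holdout): each test group gets one subject line
    variant; the holdout receives the winning email afterward.
    """
    shuffled = subscribers[:]      # copy so we don't mutate the caller's list
    random.shuffle(shuffled)       # randomize to avoid ordering bias
    test_size = int(len(shuffled) * test_fraction)
    test_pool, holdout = shuffled[:test_size], shuffled[test_size:]
    group_size = test_size // n_variants
    groups = [test_pool[i * group_size:(i + 1) * group_size]
              for i in range(n_variants)]
    return groups, holdout

def pick_winner(results):
    """Pick the variant with the best success metric.

    `results` maps each variant to its metric: open rate,
    click-through rate, conversions, or dollars.
    """
    return max(results, key=results.get)

# With 1,000 subscribers: three test groups of 100 each, 700 held out.
groups, holdout = split_for_test([f"user{i}@example.com" for i in range(1000)])
winner = pick_winner({"A": 0.21, "B": 0.24, "C": 0.19})  # "B" wins on opens
```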
A/B testing your next email
I have first-hand experience sending emails and conducting split tests through Marketo, MailChimp, and SendGrid. Luckily, all three automated the testing process, sending the “winning” email (based on my criteria) once the test run was complete. Whatever email marketing tool you use, check whether A/B testing is a built-in feature. If not, it may be time to re-evaluate the way you manage email marketing.
Test entire site elements
Once you get a little more comfortable with A/B testing the small stuff, you can experiment with testing entire site elements. Instead of changing up button colors or subject lines, you can test two radically different pages against one another. In this case, the two designs are the variables. This kind of testing can yield big improvements, so before you get too caught up in small tweaks, it’s worth looking into split testing tools that can handle these kinds of experiments.
Now that you are convinced…
ResolutionMedia has a great step-by-step guide to split testing. Many A/B testing tools automate the process, but it’s still important to understand how to pick a sample size and acceptable confidence level, how long to run a test, what the criteria for picking a winner will be (and why), and whether the results are statistically significant or not.
Go forth, my fellow data-driven marketers, and get testing. Not testing means losing out on leads, and ultimately cash.