The Importance of A/B Testing Sales Emails

Published: January 11, 2019

By Pavel Dmitriev

We all want to make things better. But how do you know if your efforts really make a difference? How do you know if the changes you make in a system have a positive impact?

Acting on a hunch, basing a solution on guesswork, or leaving success to luck might work out in the end. But it can just as easily lead nowhere, or even cause everything to come crashing down in total failure. Would you take that chance when something very important is on the line?

Of course not. Besides, there’s a far better way.

Data — lots of it — is the answer. And science — not speculation — will deliver the outcomes you desire. In particular, you need to perform A/B testing to rule out the guesswork and conclusively verify what works and what doesn’t.


A/B testing (or split-run testing) is a randomized experiment that determines which of two variants is better. You can use it to compare a wide range of stuff — from website navigation to mobile ads. For emails, you can use A/B testing to compare subject lines, opening sentences, email body content, message length, ask phrasing and other elements.
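To make the "randomized" part concrete, here is a minimal Python sketch of assigning prospects to two variants purely by chance rather than by a rep's judgment. The subject lines and addresses are made-up placeholders.

```python
import random

# Hypothetical subject-line variants; any single element (opening line,
# body copy, ask phrasing) could be tested the same way.
VARIANT_A = "Quick question about your Q1 goals"
VARIANT_B = "A two-minute idea for your Q1 goals"

def assign_variants(recipients, seed=42):
    """Randomly split a recipient list 50/50 between variant A and variant B."""
    rng = random.Random(seed)      # fixed seed only so the sketch is reproducible
    shuffled = list(recipients)
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return {"A": shuffled[:midpoint], "B": shuffled[midpoint:]}

groups = assign_variants(["ann@example.com", "bo@example.com",
                          "cy@example.com", "di@example.com"])
print(groups)   # e.g. {'A': [two addresses], 'B': [two addresses]}
```

In practice, a sequencing tool handles this assignment automatically, which is what removes human bias from the split.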

What makes A/B testing awesome is that it allows businesses to make decisions based on customers’ real behavioral data, instead of well-entrenched and widely-held market assumptions that may be true, false, or simply outdated.

As a data scientist, I think business leaders — especially those in sales and marketing — should recognize and leverage the game-changing potential of this technique. Many of the most successful technology companies — such as Google, Microsoft and Netflix — have been doing so for years. In fact, in fields such as software development and behavioral economics, A/B testing has become the preferred method for evaluating nearly every idea or product feature.

How A/B Testing Impacts Email Marketing Performance

Given the prevalence and wide acceptance of video, would including one in an email result in an uptick in reply rates?

Common sense and predisposition might nudge you to say yes. After all, shiny objects certainly turn heads and make you want to take a second look.

Imagine this scenario: a tech-savvy marketer started using a catchy video in her emails, got a few responses and told her friends about it. Soon, everyone on the team and their peers over in sales development also began using videos. Spicing up your email communications might be fun, but if you have no idea whether it actually improves your numbers, aren't you just wasting time and effort that could go toward something more valuable?

We explored this idea by A/B testing two follow-up email templates. With everything else being nearly identical, one template had a video link and the other didn't. We used Outreach's automated sequence feature to ensure randomness and eliminate human bias. Guess what we discovered: in terms of response rate, the template without a video performed nearly 100% better than the one everyone thought would be the winner. Statistical analysis verified that the result was significant at the 99% confidence level. In other words, in this specific scenario, contrary to everyone's gut feeling, the email without a video performed better.
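For readers curious about the statistics behind a claim like "99% confidence," here is a minimal sketch of a two-proportion z-test in Python. The send and reply counts below are hypothetical placeholders, not the figures from the experiment described above.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(replies_a, sent_a, replies_b, sent_b):
    """Test whether two email variants have different reply rates."""
    p_a = replies_a / sent_a
    p_b = replies_b / sent_b
    p_pool = (replies_a + replies_b) / (sent_a + sent_b)        # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))                # two-sided p-value
    return p_a, p_b, p_value

# Hypothetical counts: variant A (no video) vs. variant B (with video).
p_a, p_b, p_value = two_proportion_z_test(replies_a=90, sent_a=1000,
                                           replies_b=45, sent_b=1000)
print(f"no video: {p_a:.1%}, video: {p_b:.1%}, p-value: {p_value:.4f}")
# A p-value below 0.01 corresponds to significance at the 99% confidence level.
```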

Here, however, is where many sales leaders may fall into a trap. Does this result mean sales teams should ditch using videos in all their emails? Definitely not!

Caution: Generalizing A/B Test Results

The result of an A/B test applies only to the specific video, email template and recipient pool/type used in the experiment. There's always the temptation to make sweeping generalizations, but that's a no-no in A/B testing. Never extend the implications of outcomes to other contexts or situations unless you also have the right data backing your claim.

What you could do instead is use the result of the test to generate more hypotheses, e.g. that perhaps videos used in your other email templates do not work either, and then conduct new A/B tests to verify whether these hypotheses hold true. By running such tests, you will form a comprehensive understanding of when videos work, when they do not and how they need to be formatted, developing a much more effective way of utilizing videos in your email campaigns.

How To Do A/B Tests The Right Way

To generate precise insights from A/B tests, here are a few things to consider:

  • Be hyper-specific when setting up the test parameters.
  • Identify the specific variable you need to test. Avoid testing more than one variable or element at the same time.
  • Decide on the specific metric (e.g., open rate or response rate) with which to assess the success of the test.
  • Determine the period required to run the test.
  • Ensure that the two variants being compared are identical in every respect except for the specific element, feature, or characteristic being measured.
  • Conduct the test on a sufficient volume of data. Remember, the smaller the sample gets, the less accurate and conclusive your findings become. You can’t make a confident decision based on a sampling of just 20 or so emails (a rough sample-size sketch follows this list).
  • Ensure the prospects are assigned to each variant randomly. Automated randomization is recommended to avoid human bias.
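On the sample-size bullet in particular, here is a rough Python sketch of the textbook two-proportion sample-size formula. The 5% baseline reply rate and 2-percentage-point minimum detectable lift are illustrative assumptions, not recommendations.

```python
from statistics import NormalDist

def sample_size_per_variant(baseline_rate, min_detectable_lift,
                            alpha=0.05, power=0.80):
    """Rough per-variant sample size for comparing two reply rates.

    baseline_rate:       expected reply rate of the control, e.g. 0.05 for 5%
    min_detectable_lift: smallest absolute improvement worth detecting, e.g. 0.02
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance level
    z_power = NormalDist().inv_cdf(power)           # desired statistical power
    p1, p2 = baseline_rate, baseline_rate + min_detectable_lift
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_power * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p2 - p1) ** 2) + 1

# Hypothetical planning numbers: 5% baseline reply rate, hoping to detect +2 points.
print(sample_size_per_variant(0.05, 0.02))   # roughly 2,200 emails per variant
```

The point is not the exact formula; it is that a few dozen emails are rarely enough, and the required volume grows quickly as the effect you want to detect gets smaller.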

The most successful organizations do not just innovate. They correctly identify which innovations actually work for their audiences. A/B testing is among the most powerful methods to do so. For sales and marketing teams, it’s a must-use tool to optimize performance.
