The Complete Guide to A/B Testing with Email

By Kevin Payne · August 24, 2020

With the right strategies and tactics, email marketing can deliver an average return on investment of 3,800%. That means for every $1 you spend, you can earn $38. And one of those “right strategies” is A/B testing your emails so that each campaign converts better than the last.

But how do you start A/B testing your email campaigns? What are the best practices? And if you’ve already dipped your toes into the process, how do you improve your campaigns so that A/B testing remains an essential part of your marketing strategy?

In this post, we’ll show you the exact steps to create an A/B test campaign for your email marketing strategies. And soon, you’ll be able to create better emails and improve your overall email marketing efforts.

Steps to Run Your First Email A/B Test

Ready to run your first email A/B test the right way?

Here are the steps you should follow.

1. Have a clear goal

A/B testing is, at its core, an experiment: you compare different ways of optimizing your emails and measure which one gets the best results for your campaign’s goal. You can have several goals for your email marketing program, so it follows that you’ll run several A/B tests, each with its own goal.

If you’re new to email marketing, for example, the goals you might want to focus on are open rates and conversion rates. Subscribers can’t convert from your emails if they don’t open them in the first place, so these two metrics are an excellent place to start. Later on, you might want to start looking at ways to improve engagement.

Setting your goals at the beginning is one of the most critical steps in your A/B test campaign. Without a clear goal in mind, you won’t have anything to measure your results against.

To set your A/B test goals, consider targeting a specific percentage for open and conversion rates – say, a 25% open rate and a 2% conversion rate.

You can also ask yourself these questions to help you set better goals for the A/B test:

  • Why do we want to test these specific variables or elements?
  • What are the insights we want to glean from this test?
  • How does the variable we want to test affect the performance of this email campaign?

2. Identify your benchmarks

You’ll need to determine your test benchmarks to get useful data from each campaign. The best place to start is by looking at your average email performance. What are your current numbers from previous emails?

Be sure to pay attention to both the highest and lowest numbers, as these can also help you set essential benchmarks and develop your hypothesis, as we’ll see in the next section.

You’ll also want to review the industry averages in your niche. Depending on the kind of business you run, your average open and conversion rates may vary. For example, some industries see average open rates around 30%, while others sit much lower or much higher.

When you’ve determined your benchmarks, you’re also able to double-check your goals. Ask yourself whether you’re setting realistic targets based on your existing campaign performance, and whether you’ll be testing the right variables to hit them.
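If your email platform lets you export past campaign stats, a short script can pull these benchmarks out for you. Here’s a minimal sketch in Python; the file name and column names (past_campaigns.csv, open_rate, conversion_rate) are assumptions about what such an export might look like, not a real export format.

    import pandas as pd

    # Hypothetical export of past campaign stats; the column names are assumptions.
    campaigns = pd.read_csv("past_campaigns.csv")  # e.g. campaign, open_rate, conversion_rate

    # The average, worst, and best past performance become your benchmarks.
    print(campaigns[["open_rate", "conversion_rate"]].agg(["mean", "min", "max"]))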

3. Set your hypothesis

As a best practice, you’ll want to test one variable at a time. When you know what variables are possible to test (and which variables you ought to test), you can create a hypothesis to go back to at the end of the experiment.

Say you want to test your email subject lines for more opens. Your hypothesis could be “Subject lines phrased as questions get more opens” or “Subject lines with more than one word capitalized get more opens.”

Remember that the point of setting a hypothesis isn’t to be proven right. It’s to find out whether your initial assumption holds.

4. Determine your sample size

Aim to conduct A/B tests with as big a sample size as possible. A larger sample gives you more data, and more data means more reliable insights.

The bare minimum would be a sample size of 100 people, with a 50-50 split between the two variations. You don’t want to test on your entire email list: the test group should be big enough to yield insightful data, but small enough that you can still send the winning campaign to the rest of the people on your list.
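If you want a more formal way to size your test, a statistical power calculation estimates how many recipients each variation needs before a given lift becomes reliably detectable. Here’s a minimal sketch in Python using statsmodels; the 25% baseline open rate and the 30% target are placeholder assumptions, not recommendations.

    from statsmodels.stats.power import NormalIndPower
    from statsmodels.stats.proportion import proportion_effectsize

    # Assumed numbers: a 25% open rate today, and we want to detect a lift to 30%.
    effect_size = proportion_effectsize(0.25, 0.30)

    # Common defaults: 5% significance level, 80% power, equal group sizes.
    n_per_variation = NormalIndPower().solve_power(
        effect_size=effect_size, alpha=0.05, power=0.8, ratio=1.0, alternative="two-sided"
    )
    print(round(n_per_variation))  # recipients needed in each variation

If the number that comes back is larger than the sample you can spare, you can test for a bigger expected lift or accept a less certain result.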

5. Make sure that you have the right tools in place

Your chosen A/B testing tools can make or break your campaign, so be sure you’re investing in the right ones.

Even some popular email marketing services lack robust A/B testing features. A few may claim to support A/B testing campaigns, but you may find you can only test one or two variables, like a subject line or pre-header.

As a best practice in your overall marketing strategy, you also ought to A/B test different variables and assets on your website or ecommerce store. Together with a robust email marketing A/B test campaign, you can get even better results and improve on all future campaigns from any touchpoint.

Convert Experiences integrates with most email marketing tools — is the one you use on the list?

6. Choose which variable to test

We mentioned before that it’s best to test one variable at a time. This way, you can easily go back to your hypothesis and check whether your assumption was right.

Here’s a list of things you can test in your emails:

  • Subject lines. For example, Mailchimp found that best-performing subject lines had more than one word capitalized, among other findings.
  • Templates and email length.
  • Calls to action. Links and buttons that are easier to spot, for example, are more likely to get clicks.
  • Design elements.
  • Time and day of sending.

Select one variable to test. In this example, the test is for the email hero image. (Image source: Weebly)

7. Create your variations

Now you can proceed to create your email variations to test. Using your hypothesis, create two variations of your chosen variable. Your first variation will be considered your control, while your second variation is your test.

So say you were testing different email lengths. You might want to experiment with short copy versus longer, more detailed copy.

Example of a short email copy variation. (Image source: Blue Tree)
Example of a longer email copy variation. (Image source: Blue Tree)

In the example above, our hypothesis might have been that shorter emails get more replies. The test email in the second image checks whether a longer, more detailed message appeals more to recipients and earns more replies than our control email.

8. Run your A/B test

Once you’re satisfied with your variations, it’s time to run and send your A/B test. Because there’s often a delay between when you send your emails and when people open them, give the test at least a few hours before drawing any conclusions.

Zapier found that email effectiveness tends to dwindle after four to five days, so there’s little point in waiting much longer than that. Many email providers also have a built-in time frame for A/B testing your emails.
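Most email providers handle the 50-50 split for you, but if yours doesn’t (or you want to do it yourself), a random split of your test sample is straightforward. A minimal sketch, assuming you have a plain list of subscriber addresses; the addresses below are placeholders.

    import random

    # Placeholder addresses standing in for the test sample pulled from your list.
    subscribers = [f"user{i}@example.com" for i in range(1000)]

    random.seed(42)  # fixed seed so the split is reproducible
    random.shuffle(subscribers)

    half = len(subscribers) // 2
    variation_a = subscribers[:half]  # receives the control email
    variation_b = subscribers[half:]  # receives the test email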

9. Analyze against your hypothesis

After you’ve determined the winning variation, it’s time to glean insights about the campaign. Analyze against your initial hypothesis and see whether your assumption was correct.

Pay attention to which metrics improved and which elements didn’t work. Also review previous email campaigns for patterns that point to what makes emails perform better (or worse).
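To judge whether the gap between your two variations is real rather than random noise, you can run a simple two-proportion significance test on the raw counts. Here’s a minimal sketch in Python using statsmodels; the open and send counts are made-up numbers for illustration.

    import numpy as np
    from statsmodels.stats.proportion import proportions_ztest

    # Made-up results: opens and delivered emails for variation A (control) and B (test).
    opens = np.array([130, 158])
    sends = np.array([500, 500])

    stat, p_value = proportions_ztest(count=opens, nobs=sends)
    print(f"z = {stat:.2f}, p = {p_value:.3f}")
    # A small p-value (commonly below 0.05) suggests the difference in open rates
    # is unlikely to be due to chance alone.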

10. Repeat for the other elements of your email

As a best practice, you’ll want to keep testing new variables whenever possible. Experiment with a different variable in each campaign to gather as much data as you can about your audience’s preferences.

You can then use this data to create more informed campaigns, later testing variables you haven’t experimented with yet. So if you’ve found a general subject line format or length that performs best for your business, for example, you can move on to testing email headers or CTAs.

Key Takeaways

A/B testing requires putting on your scientist cap and experimenting based on a given assumption. Use this guide to help you A/B test your email campaigns to get better conversions – whether your goal is more opens, clicks, or replies.

Originally published August 24, 2020 - Updated November 10, 2022


Authors
Kevin Payne

Kevin Payne is a content marketing consultant who helps software companies build marketing funnels.

Editors
Carmen Apostu

In her role as Head of Content at Convert, Carmen is dedicated to delivering top-notch content that people can’t help but read through. Connect with Carmen on LinkedIn for any inquiries or requests.
