Sunday, February 25, 2007

A/B Testing E-Marketing Tips

Does the word "test" still make you tense up and think of your high school math teacher with the bad comb-over? When it comes to e-marketing, testing takes on a whole new meaning. If you read Anne Holland's weekly email newsletter, you'll typically hear her preaching to e-marketers to TEST, TEST, TEST... at least a few hundred times. OK, that's a bit of an exaggeration, but if you read her stuff, you know what I mean. Just for fun I went to the home page and saw the article: How to Test Email Landing Pages (More Easily). :)

Testing in e-marketing is definitely important in terms of getting the most bang for your buck. You want to make sure that your message resonates with those who see it and that it has been fully optimized. This can mean testing a new landing page to see what the overall conversion rate is, or testing a webinar invite email to determine the open rate. There are some A/B testing mistakes, though, that you need to watch out for. Matt Belkin over at the Omniture blog points out a few of these in his recent post: The Dark Side of A/B Testing: Don't Make These Two Mistakes!.

A/B Testing Mistake #1: Testing More Than One Element at a Time

If you're sending out an email, it's easy to say: "Let's test a different subject line and a different opening paragraph" as compared to the original email. This is a no-no. A/B testing can only give you accurate results if you test one element at a time. As Matt explains it, "Changing more than one element in an A/B test makes it impossible to determine which change drove better performance and which did not." If you want to test multiple elements at the same time, you need to conduct multivariate (multivariable) testing. While this can be more difficult, it can save you time in pinpointing the key element that will help you reach your e-marketing goals.
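To see why isolating a single element matters, here's a minimal sketch of how you might compare open rates between two email versions that differ only in the subject line. It's written in Python (my choice, not anything from Matt's post), and the counts are made-up numbers for illustration:

```python
from math import sqrt, erf

def two_proportion_z_test(opens_a, sent_a, opens_b, sent_b):
    """Compare open rates of variants A and B with a two-proportion z-test."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # Pooled open rate under the null hypothesis (no real difference)
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Made-up numbers: version A is the control, B changes ONLY the subject line.
p_a, p_b, z, p = two_proportion_z_test(opens_a=180, sent_a=1000,
                                        opens_b=225, sent_b=1000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p:.3f}")
```

Because only one element changed, a significant lift can be credited to the subject line. Change two things at once and the same numbers can't tell you which one did the work.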

A/B Testing Mistake #2: Not Going Deep Enough

Matt then goes on to explain that you may not be delving deep enough into your web stats to determine how well your tests are doing (completely understandable advice coming from an Omniture blog). He is correct that you really need to dive deeper into who is responding to your email newsletter or whitepaper download. Ask yourself the following question when determining the success or failure of an A/B test: Who is actually responding to the test email or landing page? Once you've asked that, follow up with questions like these (geared to B2B):
  • Is it someone in your marketing database that you already know?
  • What type of industry are they in?
  • Which business groups are they in? In B2B this could be engineering, accounting, etc.
  • What business titles are responding? C-level executives? Managers? Joe Schmo with a Hotmail account looking for a free iPod?
  • Are the people who are responding more qualified?
In some cases, the response rate may be better with one test group but the quality of responses may be better in another, in terms of bringing in more qualified leads that are in your sweet spot. To determine the true success of an A/B test, it's best to look beyond your open rates and pageviews; a rough way to slice responders is sketched below. And if you're just getting started in A/B testing, slow down a bit and read on after that.
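As an illustration of going deeper than the raw response rate, here's a sketch that tallies how many responders per variant look like qualified B2B leads. The field names (`variant`, `email`, `title`) and the qualification rules are my own assumptions; your marketing database will look different:

```python
FREE_DOMAINS = {"hotmail.com", "gmail.com", "yahoo.com"}
QUALIFIED_TITLES = ("ceo", "cfo", "cto", "vp", "director", "manager")

def is_qualified(responder):
    """Very rough B2B qualification: corporate email + a senior-sounding title."""
    domain = responder["email"].split("@")[-1].lower()
    title = responder["title"].lower()
    corporate = domain not in FREE_DOMAINS
    senior = any(t in title for t in QUALIFIED_TITLES)
    return corporate and senior

# Hypothetical responders, tagged with which test variant they received.
responders = [
    {"variant": "A", "email": "jane@acme-engineering.com", "title": "VP Engineering"},
    {"variant": "A", "email": "joe.schmo@hotmail.com",     "title": "iPod enthusiast"},
    {"variant": "B", "email": "pat@bigco.com",             "title": "Accounting Manager"},
]

for variant in ("A", "B"):
    group = [r for r in responders if r["variant"] == variant]
    qualified = sum(is_qualified(r) for r in group)
    print(f"Variant {variant}: {qualified}/{len(group)} qualified responders")
```

One variant might win on raw responses while the other wins on qualified leads; that's exactly the difference the open rate alone hides.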

A/B Testing: Getting Started

If you're just getting your feet wet, what I recommend for testing emails is to first test the subject line. The subject line, along with the "from" address, is what determines whether your email gets opened. You can test a longer vs. shorter subject line, test using your company name, add personalization, etc. Typically you test on a small slice of your email list (say 10%), so you can measure the results first and then send the most successful email out to the majority of the group.
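Here's one way that 10% split might look in practice. This is a sketch under my own assumptions; the list, the fraction, and the seed are all invented for illustration:

```python
import random

def split_for_subject_test(email_list, test_fraction=0.10, seed=42):
    """Carve out a test slice (split between A and B) and a holdout for the winner."""
    shuffled = email_list[:]
    random.Random(seed).shuffle(shuffled)   # seeded so the split is reproducible
    n_test = int(len(shuffled) * test_fraction)
    group_a = shuffled[: n_test // 2]         # gets subject line A
    group_b = shuffled[n_test // 2 : n_test]  # gets subject line B
    holdout = shuffled[n_test:]               # gets whichever subject wins
    return group_a, group_b, holdout

emails = [f"subscriber{i}@example.com" for i in range(5000)]
a, b, rest = split_for_subject_test(emails)
print(len(a), len(b), len(rest))  # 250 250 4500
```

Seeding the shuffle keeps the split reproducible if you ever need to re-run the assignment.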

For landing pages, the offer and the number of form fields are typically the elements I would test. For example, if you're running a PPC campaign, find out which whitepapers or case studies are getting the higher conversion rates.
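Comparing offers comes down to conversion rate per offer (conversions divided by clicks). A quick sketch with invented numbers:

```python
# Hypothetical PPC results: clicks that reached each offer's landing page,
# and how many of those visitors filled out the form.
offers = {
    "Whitepaper: 10 Email Tests": {"clicks": 800, "conversions": 96},
    "Case Study: Acme Corp":      {"clicks": 650, "conversions": 52},
}

for name, stats in sorted(offers.items(),
                          key=lambda kv: -kv[1]["conversions"] / kv[1]["clicks"]):
    rate = stats["conversions"] / stats["clicks"]
    print(f"{name}: {rate:.1%} conversion ({stats['conversions']}/{stats['clicks']})")
```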

While Anne Holland is all over the need for testing, we all know that this can be time-consuming. It's best to start slowly, look for small wins, and then convince others that testing really does improve performance and is worth the resources.

Chad

PS - Do you have any testing stories? Nightmares? Successes?
