Why You Should Question Conversion Rate Optimization Best Practices

Neil Patel

I wrote this article out of a serious concern for conversion optimizers all around the world. There’s no doubt in my mind that the conversion optimization industry is growing. As it does, I want to combat the myriad myths swirling around it.

One of the most toxic myths is that of CRO best practices. I contend that there is no such thing as one-size-fits-all best practices in conversion rate optimization.

Try out our free marketing tool that helps you test out various types of popup CTAs and gives you information about your site visitors. 

To show you what I mean, let me take you on a short journey. We’re going to look over the shoulders of two conversion optimizers at work and see what they’re doing.

A True Tale of Two Conversion Optimizers

Conversion Optimizer #1: I need to add trust signals!

Let’s walk into the office of “Fred,” a very capable optimizer. Fred wants to increase signups on this page:

[Image: version A of the signup form, secured with a simple padlock icon]

He thinks that the padlock symbol is good, but he wants to make the trust emblem even more trustworthy.

Fred is very interested in conversion optimization best practices. He tries to learn all he can about the best ways to optimize his website. Fred once read online that he should add trust signals like badges to forms. What’s more, he read that the VeriSign badge has the highest level of trust.

Fred runs a test between version A (above) and version B (below). In keeping with best practices, he chooses the VeriSign symbol to test on version B. He puts it on the signup form in place of the padlock.

Here is Fred’s new signup form -- version B.

[Image: version B of the signup form, with the VeriSign seal in place of the padlock]

So, what happens? As Fred hypothesized, version B won. Adding the seal caused a 42% increase in signups.

Fred sat back in his comfortable office chair and smiled. “Conversion optimization best practices are really great,” he said with satisfaction.

The end.

Conversion Optimizer #2: I need to add trust signals, too!

"Adam" is our second conversion optimizer. Like Fred, he loves to follow conversion optimization best practices. He read about Fred’s success and decided to test two versions of a signup form.

He put a TRUSTe badge on Version A, and left it out on Version B.

Adam sat back in his chair, daydreaming about the uptick in signups. He thought about the pay raise, the promotion. Yes, life was good.

But when the test results came in, Version B -- the one without the badge -- converted better. Adding the trust badge caused signups to decline!

It’s a good thing he tested it.

Adam, in disbelief, spent the remainder of the day on BuzzFeed trying to cope with the disappointment.

The end.

The Moral of the Story

I’ve written this brief account in order to make a single point: Conversion optimization best practices don’t work.

I see this kind of confusion frequently. Someone reads about a great A/B test, gets excited, and rushes out to do just the same thing.

To make matters worse, many site owners don’t even test. They read a “best practice” and go out and do likewise. They hope that indiscriminate site improvements will magically improve their conversion rates. But behold, rather than improving their conversions, the “best practices” bring poor results.

Reading about best practices, tests that won, and A/B testing success stories is great. Testing stuff on your site is great. However, blindly following best practices or believing that you’ll experience the same results can do more harm than good.

Why People Assume Best Practices Are Best

Why are we so easily deluded by “best practices”?

  • Because they are called “best practices.” One reason we fall for the best practice delusion is the name itself: “best practices.” Just because someone calls something “best” doesn’t mean it always is.
  • Because we’re lazy. Testing is hard, confusing, and time-consuming. Instead of doing the hard work of split testing our websites, we rely on the tests of other people -- successful people, mind you -- and hope that we’ll get the same results. Fingers crossed.
  • Because it’s been tested and proven! A test is a test! A statistic is a statistic, right? What could go wrong? Numbers don’t lie. Right? Right?! Ah, but this misses the point entirely. Whose numbers are you talking about? Your numbers or some other random numbers from some other random test?

All the best practices statistics, percentages, and upticks blind us to a very simple fact: That’s not your website. Just because something worked on one site does not mean it will produce the same results on yours. Everything is different -- the industry, the season, the customer, the niche, the conversion action, the product, the website, the design, the color scheme, the price, the offer, the headline, the pitch, the ... everything!

You can’t simply transfer best practices from one website, industry, or test into your own website. Sorry.

Pulling Back the Curtain on Best Practices

What I’m going to do next is an exposé of conversion optimization best practices. You’ve read the best practices. You’ve heard the success stories. You’ve oohed and ahhed at the tests -- but just because the results look exciting doesn't mean the tests were run properly.

Conversion optimizers often pour on the statistics, but what many readers don't realize is that their tests might be broken. Here are common mistakes that even CRO experts make.

  • The test has no baseline data. Often, an optimizer leaps into a frenzy of A/B testing without doing any A/A testing -- running the same page against itself to confirm that the testing tool and traffic don’t produce phantom “winners.” For more information on A/A testing, please see “Lesson 2” in my story about conversion optimization.
  • The test runs during a bad testing period. Every business has peak seasons and not-so-peak seasons. If you test during the wrong period, then you’ll gain data that is unreliable.
  • The test fails to account for existing data and customer information. The very best conversion optimizers are good at what they do because they understand their customers. Knowing the underlying customer journey is a powerful way to choose the best things to test. In addition, when you understand your customers’ journey, you are better able to determine a helpful starting hypothesis.
  • The test measures too many variables. Overeager CROs often make several changes to a page at once, then run a single test to see what happens. This is not an accurate way to do split testing, because you can’t tell which variable is responsible for the change in conversions. (A deliberately designed multivariate test is a different matter, but it needs far more traffic to reach reliable conclusions.)
  • The test has no underlying hypothesis. Michael Aagaard discovered the hard way that you need to test hypotheses. His one-line statement sums it up: “Your A/B test is only as good as your hypothesis.” Unfortunately, many optimizers run tests simply for the sake of testing. When you test without a hypothesis, you fail to surface actionable results.
  • The test may not have statistical significance. Just because someone tested something doesn’t mean their result is reliable. If a test doesn’t reach statistical significance, you can’t trust its results. (A minimal significance check is sketched just after this list.)
  • The test may not be valid. A test often shows an imaginary lift because it was stopped the moment it appeared to reach statistical significance, instead of running for its planned duration and sample size.
  • The test does not account for segmentation. If a split test doesn’t factor in the impact of segmentation, the results could be totally off. Segmentation takes into consideration the variations among a group of users -- geography, demographics, socioeconomics, behavior, etc. If you test generically, you’ll get generic results.

The list above is just a sample of all the things that could screw up a test.
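Two of the mistakes above -- missing statistical significance and stopping too early -- are easy to sanity-check with a little arithmetic. Here is a minimal sketch of a two-proportion z-test in Python; the function name and the visitor and conversion counts are my own made-up illustration, not output from any particular testing tool.

    # Hypothetical example: a two-proportion z-test for an A/B test result.
    # The visitor and conversion counts below are made up for illustration.
    import math

    def ab_test_significance(conv_a, visits_a, conv_b, visits_b):
        """Return the z-statistic and two-sided p-value for the difference
        between two observed conversion rates."""
        p_a = conv_a / visits_a
        p_b = conv_b / visits_b
        # Pooled conversion rate under the null hypothesis of no difference.
        p_pool = (conv_a + conv_b) / (visits_a + visits_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the standard normal CDF.
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Made-up numbers: 10,000 visitors per variant.
    z, p = ab_test_significance(conv_a=230, visits_a=10_000,
                                conv_b=275, visits_b=10_000)
    print(f"z = {z:.2f}, p = {p:.4f}")  # trust the lift only if p is below your alpha

If the p-value isn’t below the threshold you committed to in advance (0.05 is common), the “lift” you’re celebrating could easily be noise.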

As a result, this scenario plays out:

  • Someone runs a screwed up test, committing one of the errors listed above.
  • They blog about it, point to the fact that they got a “498% increase in conversion rate!” because of the change, and call it a “best practice.”
  • You desire a 498% increase in conversion rates, too.
  • You follow the example of the best practice.
  • Your conversion rate nosedives.

Many of the best practices aren’t best practices after all. Here are a few that real-world tests have contradicted:

  • Best practice: Social proof always works. Truth: Not always.
  • Best practice: Short landing pages are better. Truth: Not always.
  • Best practice: Follow the three-click rule. Truth: Not always.
  • Best practice: Testimonials will improve your conversions. Truth: Not always.
  • Best practice: Coupon codes will increase conversions. Truth: Not always.

For every so-called “best practice” there are probably a handful of people who will raise their hand to say that it’s not true. The bottom line is this: You can’t take best practices at face value.

So, What Should You Do?

All the conversion optimization best practices are crumbling before your very eyes. What do you have to hold on to? Here are five things that you can do to replace the inaccurate information that you’ve been following.

  1. Learn your own CRO best practices. Once you realize that there are no universal best practices, you discover that there are only best practices for your site. Even then, you have to be careful. A “best practice” that holds true one year may be overturned the next.
  2. Test the right stuff first. If you can identify the most important tests to run, you’ll save yourself a lot of grief. Plus, you’ll be able to move the needle on conversion rate improvements a lot faster.
  3. Test everything. The more types of tests you run, the better you’ll be able to gain actionable insights. Test everything possible.
  4. Test against your assumptions. Successful optimizers learn to second-guess their assumptions, then test them to see whether they hold up. Often, the very things we think are correct are actually roadblocks to success.
  5. Test consistently. Create a roadmap for your split testing. If you can sketch out a split testing map, it will keep you from missing obvious stuff. (A rough sample-size sketch follows this list to help you estimate how long each test needs to run.)
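To make that roadmap concrete, it helps to estimate up front how much traffic each test needs before its result can mean anything. Here is a rough sample-size sketch using the standard normal approximation; the baseline rate and hoped-for lift are assumptions you would replace with your own numbers.

    # Hypothetical sketch: estimate the traffic needed per variant before an
    # A/B test can reliably detect a given lift. All numbers here are
    # planning assumptions, not real data.
    import math

    def sample_size_per_variant(baseline_rate, relative_lift):
        """Rough per-variant sample size for a two-proportion test at
        roughly 95% confidence and 80% power (normal approximation)."""
        p1 = baseline_rate
        p2 = baseline_rate * (1 + relative_lift)
        z_alpha = 1.96    # two-sided test at alpha = 0.05
        z_power = 0.8416  # 80% power
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        n = ((z_alpha + z_power) ** 2) * variance / (p2 - p1) ** 2
        return math.ceil(n)

    # Example: a 2% baseline conversion rate, hoping to detect a 20% relative lift.
    n = sample_size_per_variant(baseline_rate=0.02, relative_lift=0.20)
    print(f"~{n:,} visitors per variant")  # roughly 21,000 in this example

Divide the result by a variant’s daily traffic to estimate how many days the test should run -- then commit to that duration before you start, so you aren’t tempted to stop the moment a lift appears.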

Conclusion

Once you realize that there aren’t any best practices, it frees you up to try your own thing, to test intentionally, and to make changes that are best for your customers. Nothing -- not even the most widely held best practices -- can substitute for deep knowledge of your users and accurate testing of their behavior.

What conversion “best practices” have you debunked?
