It happens all too often. A redesign or update is held up by indecision over which version will get you the positive results you’re looking for.
Should the chatbot appear on the left or the right? Do we want the CTA to be sticky or not? Will people like the hover effect on the button?
What if we told you that you can play around with how your webpages or apps appear, with limited risk? That’s exactly what A/B testing lets you do.
Run your tests correctly, and you get a glimpse of the future before you commit to a change.
What Is A/B Testing?
A/B testing, aka split testing, is the practice of comparing two versions of a webpage or app against one another to determine which performs better.
Think of it like a science experiment:
- Variables: Typically one to two webpage elements
- Hypothesis: How you think they’ll perform
- The actual experiment: Showing the different variants to users at random (see the sketch after this list)
- Analysis: Using metrics and data to determine which performed better
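To make the experiment step concrete, here’s a minimal sketch of how random-but-stable variant assignment often works in practice. Everything here is hypothetical and for illustration only: the variant labels, the experiment name, and the 50/50 split are assumptions, not part of any particular tool.

```python
import hashlib

# Hypothetical variant labels -- two versions, split roughly 50/50.
VARIANTS = ("A", "B")

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into variant A or B.

    Hashing the user ID together with the experiment name means each
    visitor always sees the same version on every visit, while overall
    traffic still splits roughly evenly between the two variants.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# The same visitor always lands in the same bucket:
print(assign_variant("visitor-123", "homepage-cta"))  # same output every call
```

Hashing, rather than flipping a coin on every page load, keeps the experience consistent for returning visitors, which makes the analysis step much cleaner.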
Running A/B tests can help take the guessing out of website optimization and give you real insight into changes you’re looking to make. Not only can you test variations, but you can collect actual data to back up your decisions.
It’s much easier to feel confident in adapting aspects of your business when you know it’s going to work out for you ahead of time.
A/B Testing Tips and Best Practices
So what are some ways to really get the best results? There’s a difference between a third-grade science experiment and one run by the CDC, after all.
We’ve compiled a list of A/B best practices for you to follow to make sure you’re getting accurate and actionable results.
Be Picky About What You Test.
It’s important to be selective about what you test. Target the pages and elements that make a real difference in generating and converting leads, like About Us pages and resource landing pages.
These bottom-line items are the things that create opportunities for your sales team on your website, so be sure to test the ones that truly matter.
Test One Element at a Time.
Or at the very most two.
This can be a pretty hard rule to follow when you have two very different ideas and want to roll them out in their completed forms. But be wary of throwing too many variables into the mix at once.
First off, it can muddle your analytics. With a lot of data rolling in at one time, it’s easy to get confused about what’s working, what isn’t, and why.
Secondly, it’s harder to tell which element is actually making a difference in conversions when you have a lot of different features at play. What’s truly bringing in leads on the page? You’ll never know if you have six different options or CTAs live at once.
Pay Attention to Scale.
Sample size is the number of visitors who will see each of your two versions.
If you don’t run your A/B test on enough people, you won’t be able to collect reliable data or draw accurate conclusions. To avoid that risk, work out a minimum sample size before you launch so you can reasonably estimate conversions (one common way to calculate it is sketched below).
Remember to keep it to a reasonable scale, too. Set an extreme goal and you may keep falling short, dragging the test out and muddying your results.
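If you want to put a number on “enough people,” the standard two-proportion power calculation is one common way to estimate it. This is a generic statistical sketch, not a formula from any particular tool, and the baseline and target conversion rates below are made-up examples:

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p1: float, p2: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Minimum visitors per variant to detect a lift from rate p1 to p2.

    Uses the standard two-proportion z-test formula: alpha is the
    two-sided false-positive rate, power the chance of detecting a
    real effect of this size.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# Hypothetical example: detecting a lift from a 4% to a 5% conversion rate.
print(sample_size_per_variant(0.04, 0.05))  # about 6,745 visitors per variant
```

Note how quickly the number grows as the expected lift shrinks: detecting a 4% to 4.5% change would need roughly four times as many visitors.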
Never Make Changes Mid-Test.
In the same way that you’d ruin the integrity and tarnish the results of your school science experiment (yes, we’re still going with that analogy), you’ll do the same to your A/B test if you start tampering with things and making changes before it’s over.
If you interrupt the test, none of the data you collect will be reliable. That’s a lot of time, talent, and resources spent on results you can no longer trust.
Be patient and get to the finish line before you start another race. You can always test your new idea after you’ve finished the first.
Monitor Your Data Consistently.
Results can have a way of surprising us with sudden changes that we aren’t expecting. It’s all a part of the process.
Having unexpected results isn’t the problem. The problem appears when you can’t identify why the test ran the way it did. And, usually, that’s because you weren’t watching closely enough.
It’s never a good idea to start a test and then walk away from it. When you come back to it hours or days later, you’ll have no idea how you got from Point A to Point B.
Monitor your data closely and regularly. We’re not recommending that you stare at your screen nonstop, watching every incremental change, but we do advise regular check-ins so that you always know what’s going on, can catch sudden changes when they happen, and can work out why (a quick significance check like the one sketched below can help you separate real shifts from noise).
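When you do check in on the numbers, a simple significance test can help you tell a genuine difference from random variation. Here’s a minimal two-proportion z-test sketch; the conversion counts are hypothetical, chosen only to illustrate the calculation:

```python
from math import sqrt
from statistics import NormalDist

def z_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates.

    A small p-value (commonly below 0.05) suggests the gap between the
    variants is unlikely to be pure chance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical counts: 300 conversions from 7,000 visitors on A,
# 360 from 7,000 on B.
print(round(z_test_p_value(300, 7000, 360, 7000), 3))  # ~0.017, below 0.05
```

One caveat worth knowing: stopping a test the moment a p-value dips below 0.05 inflates false positives, which is another reason to fix your sample size up front and, as the previous tip says, never make changes mid-test.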
A/B Testing Examples
A/B testing has helped companies accomplish amazing things and provide even better experiences for their customers.
The proof is in the pudding when it comes to split testing. Here are some great examples of A/B testing done right:
Netflix
The streaming giant runs a thorough A/B testing program to deliver a truly personalized experience for its customers. Its teams split test practically everything.
The program is primarily used to personalize homepages. Users have embraced a Netflix homepage that’s tailored to their viewing habits and interests.
Netflix tinkers with how many rows go on a homepage, which movies or shows go into those rows, which thumbnails are shown, and so on, all based on streaming history and preferences. It tests the different elements on different users and pulls results based on activity and engagement.
That’s why your homepage might look pretty different from your friend’s.
Secret Escapes
Secret Escapes is a member-exclusive travel company that pulls most of its customers in with the promise of heavily discounted hotels and vacation packages.
They used A/B testing to compare different variations of their mobile sign-up pages, rolling out different form fields, button placements, and CTAs to customers to figure out which version performed best.
As a result, they were able to double their conversion rate and increase customer lifetime value.
Booking.com
If you want to talk about a company that truly believes in the power of A/B testing, we have to mention Booking.com.
The travel company has hundreds of tests running on their website at any given time, especially when it comes to their copy.
Booking.com employees are encouraged to run tests on ideas that they feel can help the business grow and bring more value to customers.
When they partnered with Outbrain in 2017, they rolled out three different versions of their landing pages to figure out why customers were dropping off during sign-up. With variations in copy, social proof, and awards, they saw a 25 percent increase in user registrations.
A/B Testing Software
The actual process of A/B testing can feel a bit daunting, or even overwhelming, if you’re not a web developer.
Thankfully, there are several tools out there that you can use to help you along.
When it comes to tools and software, there isn’t a one-size-fits-all in digital marketing. But it is possible to get close to a perfect match if you know where to look.
Different tools may offer different results, so be sure to do your research to find the A/B testing software that’s going to work well for you.
A/B testing is one of the best ways to optimize your website and get the most out of your design efforts. A lot of the time, it’s trial and error and trying again.
But don’t let that discourage you. It’s all part of the game, and it can actually be pretty fun. Who wouldn’t like to see their latest idea backed by data and numbers, proving they’re a genius?