A/B Testing Subscription Forms and Landing Pages

by Patrick Cole, June 14, 2016



I have a strict policy that some of my colleagues believe is more stringent than it needs to be: I never take a change to a website fully live without significant A/B testing. My feelings on this are especially strong when it comes to changes to subscription forms and landing pages. And no, it makes no difference to me if the change seems insignificant.


Why am I so insistent about A/B testing? Because personal experience has shown me, over and over again, that nothing works all of the time and nothing is doomed to fail. What works spectacularly on a dozen landing pages might cause the thirteenth landing page to tank. The trick you believe is sure to get more people to fill out your subscription forms might actually make your numbers drop.


The point is that rules in web design are great guidelines, but following them strictly without testing can lead to bad results. Here are a few cases I’ve been involved with that led to unexpected results and interesting insights.


Call to Action Buttons on Landing Pages – Green Doesn’t Always Mean Go


I don’t use green for every CTA button, but I do use it quite frequently, because people associate words like go, yes, positive, affirmative, and, best of all, money with the color green. I rarely use red for call to action buttons unless I am giving the user the ability to opt in or opt out, in which case I make the opt-in button green and the opt-out button red. That pairing communicates the options visually. Red was definitely not a color I would have chosen for a single opt-in button on a landing page; even tests on my own blog showed green outperforming red as a call-to-action color.




However, a few years ago I was working on revamping a poorly performing website for an online store that dealt in refurbished cameras and accessories. One of the elements I immediately recognized as wrong was the call to action button: it was placed off to the left of the screen, the color was wrong, and even the text was confusing. So, I moved the button and got great results from that test.


Then, I tried a few different things with the text. It was hit and miss, but I finally landed on something that earned a small bump in conversions. My next move was to experiment with the color. I started with green, and the results weren’t good at all. Looking back, I can see why: it just didn’t work with the colors and the overall design of the landing page.


At this point, I realized something. There was quite a bit of red in the logo, and if I were thinking only about aesthetics, a red call to action button would look really nice. So, I went against experience and advice and tested the page with a red CTA button. A few days later, once I had collected enough data to be useful, I found that red had increased conversions by 3 percent on the first day, and that number remained stable in the days that followed.
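This wasn’t the exact code I used, but as a minimal sketch of how a color test like this can be wired up on the client in TypeScript: visitors are split between a green and a red button, the assignment is remembered in localStorage so returning visitors see the same color, and views and clicks are reported through a track callback that stands in for whatever analytics call your stack actually provides.

```typescript
// Minimal client-side A/B test sketch: split visitors between a green and a
// red CTA button, keep the assignment stable across visits, and report which
// variant is seen and clicked. The `track` callback is a placeholder.

type Variant = "green" | "red";

function getCtaVariant(): Variant {
  const stored = window.localStorage.getItem("ctaVariant");
  if (stored === "green" || stored === "red") {
    return stored;
  }
  // 50/50 random split on the first visit, remembered afterwards.
  const variant: Variant = Math.random() < 0.5 ? "green" : "red";
  window.localStorage.setItem("ctaVariant", variant);
  return variant;
}

function applyCtaVariant(
  button: HTMLButtonElement,
  track: (event: string, data: object) => void
): void {
  const variant = getCtaVariant();
  button.style.backgroundColor = variant === "green" ? "#2e8b57" : "#c0392b";
  track("cta_viewed", { variant });
  button.addEventListener("click", () => track("cta_converted", { variant }));
}
```

With the assignment and the events in place, comparing click-through per variant is just a matter of reading the numbers out of whichever analytics tool receives the events.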


Takeaway: Don’t dismiss the role of aesthetics in getting customers to answer your calls to action.


An Unintended Consequence of Building Trust on Forms


A lot has been written on the topic of trust as it applies to web design. The idea is that building trust is more likely to produce conversions than simply boosting traffic to your landing pages. It is a valid concept, one I follow closely in my own design practice, and for the most part I recommend that you do the same.


On the other hand, this article is all about the ways A/B testing reveals things we don’t expect. In this instance, I was building a page on a website that contained a simple email subscription form for premium content. The required fields were email address, first and last name, and ZIP/postal code (there was some regional content involved). Of course, the marketing team I was working with wanted many other mandatory fields for data collection and future campaigns. Thankfully, our first round of testing proved that making all of those extra fields mandatory was a real mistake. My suggested version performed better than the other form, but it still wasn’t where we wanted it to be.
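For illustration only, here is a rough TypeScript sketch of how the two variants differed as field configurations. The extra fields (phone, company) are hypothetical stand-ins for the marketing team’s wish list, not the client’s actual form.

```typescript
// Two hypothetical field configurations for the same subscription form:
// variant A makes everything mandatory, variant B only requires the fields
// the premium content actually depends on.

interface FormField {
  name: string;
  label: string;
  required: boolean;
}

const variantA: FormField[] = [
  { name: "email", label: "Email address", required: true },
  { name: "firstName", label: "First name", required: true },
  { name: "lastName", label: "Last name", required: true },
  { name: "postalCode", label: "ZIP / postal code", required: true },
  { name: "phone", label: "Phone number", required: true },   // illustrative extra field
  { name: "company", label: "Company", required: true },      // illustrative extra field
];

// Variant B keeps the same fields but only requires the essential ones.
const essential = new Set(["email", "firstName", "lastName", "postalCode"]);
const variantB: FormField[] = variantA.map((field) => ({
  ...field,
  required: essential.has(field.name),
}));
```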


This is when I remembered seeing a couple of Trustmark badges on their checkout pages. One was specific to the third-party payment provider, but another was a privacy-guarantee badge from a security auditing firm. Thinking that a Trustmark about privacy might increase participation, I suggested adding it to the page. I was hoping conversions on the page would increase, and that people would also be more likely to fill in the fields that were no longer mandatory.


We tested my original page against the same page with a Trustmark added. The results weren’t even close. The page with the Trustmark tanked. So, off it went. Fortunately, we were able to apply a few small changes to get the results that we wanted.


What was so off-putting about the Trustmark? I’m not completely certain, but I do have a theory. This particular form lived in a section accessed only by customers who had made purchases in the past, so they would have already seen that Trustmark on the checkout page. My thought is that it may have made them believe they would be charged for joining the list, because that was what their memories associated the badge with. It may be worth experimenting with trust badges on forms in the future.


Takeaway: Good A/B testing is key when it comes to providing data that goes beyond our assumptions.


Being Schooled By The Double Opt-in


I’ve been doing this for many years, but I have come to understand that the best teachers are sometimes those with less experience than I have. A few years ago, I took on an intern from a local community college. She was a talented young woman who wanted to focus on graphic design but asked to be placed in a web development/design internship to build up some strengths and gain experience in this niche.


This was around the time that the double opt-in became popular for email subscription lists. Basically, you send a confirmation email after somebody subscribes to make sure they really intended to sign up, and they only join the list once they confirm. To be honest, I wasn’t sold on the idea at all. Why give somebody a chance to jump out of the funnel before they see even one marketing email? Isn’t that what the unsubscribe option is for?
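To make the flow concrete, here is a rough server-side sketch of a double opt-in in TypeScript. The sendEmail and addToList callbacks, the in-memory map, and the confirmation URL are hypothetical stand-ins for whatever email service and storage you actually use.

```typescript
// Rough double opt-in sketch: record the address as pending with a random
// token, email a confirmation link, and only add the address to the list
// when the link is clicked.
import { randomBytes } from "crypto";

interface PendingSubscriber {
  email: string;
  confirmed: boolean;
}

const pending = new Map<string, PendingSubscriber>(); // token -> subscriber

// Step 1: the form handler stores the address as unconfirmed and sends a link.
function handleSubscribe(
  email: string,
  sendEmail: (to: string, body: string) => void
): void {
  const token = randomBytes(16).toString("hex");
  pending.set(token, { email, confirmed: false });
  sendEmail(
    email,
    `Please confirm your subscription: https://example.com/confirm?token=${token}`
  );
}

// Step 2: only a clicked confirmation link puts the address on the real list.
function handleConfirm(token: string, addToList: (email: string) => void): boolean {
  const entry = pending.get(token);
  if (!entry || entry.confirmed) return false;
  entry.confirmed = true;
  addToList(entry.email);
  return true;
}
```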




Anyway, that was my logic back then. Apparently my intern felt differently, because when I gave her the assignment of putting together a quick email subscription form for my personal website, she used the double opt-in. Since it wasn’t mission-critical, I told her my thoughts on the double opt-in, but I let her design stay in place.


Just as I suspected, a full 23 percent of the people who answered the CTA never confirmed the email we sent them. She insisted that we would see better results in the long term, so I threw together a second version without the double opt-in, and we started testing and collecting data. Six months later, it turned out she was right: my form collected more subscribers, but hers collected many more subscribers who truly engaged with us and who actually made purchases.


Takeaway: The double opt-in results in a smaller, yet clearly more useful subscription list.


Social Proof – A Surprising Negative in Some Cases


This one is the most recent surprising A/B test result I have seen. I was working on a fairly complex landing page for a large property management firm. Customers would enter criteria for where they wanted to rent a home or an apartment, how big a space they needed, how many bedrooms, and so on. Then, after hitting submit, they would arrive on the landing page I created, which took their input and returned a grid of properties meeting their criteria, including address, phone number, monthly rent, and other details.
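Stripped of the real data model, the grid logic amounted to filtering and ordering the listings. Here is a simplified TypeScript sketch with illustrative field names, not the firm’s actual schema.

```typescript
// Illustrative sketch of assembling the results grid: filter the property
// list against the visitor's criteria and list the client firm's own
// properties first so they can be styled differently.

interface Property {
  address: string;
  phone: string;
  monthlyRent: number;
  bedrooms: number;
  squareFeet: number;
  city: string;
  ownedByClient: boolean;
}

interface SearchCriteria {
  city: string;
  minBedrooms: number;
  minSquareFeet: number;
  maxRent: number;
}

function matchingProperties(all: Property[], criteria: SearchCriteria): Property[] {
  return all
    .filter(
      (p) =>
        p.city.toLowerCase() === criteria.city.toLowerCase() &&
        p.bedrooms >= criteria.minBedrooms &&
        p.squareFeet >= criteria.minSquareFeet &&
        p.monthlyRent <= criteria.maxRent
    )
    // Client-owned listings sort to the top of the grid.
    .sort((a, b) => Number(b.ownedByClient) - Number(a.ownedByClient));
}
```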


Not all of the properties were owned by the management company I was working with. Their own properties were grouped together, with all of the information shown in a different color than the other listings, and below that I placed a call to action button offering a discount on the first month’s rent for clicking through to fill out an application. I then decided to add a blurb to each of the firm’s properties stating that the company was a member of the local commerce and growth association. After all, I thought, a bit of social proof couldn’t hurt.


Fortunately, because the page was so complex, I had put a plan in place to A/B test every single change. Within a day, conversions on the new version were down more than 15 percent. The downward trend continued until the B variant was down 19 percent and showing signs of declining further, while the A variant remained almost completely stable. So, we pulled the social proof.
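The comparison we were watching each day boils down to simple arithmetic: the conversion rate of each variant and the relative change of B against A. Here is a small TypeScript sketch; the visitor and conversion counts below are made-up placeholders, not the actual test data.

```typescript
// Compare two variants: conversion rate per variant and B's relative change
// versus A. Counts are placeholder numbers for illustration only.

interface VariantStats {
  visitors: number;
  conversions: number;
}

function conversionRate(stats: VariantStats): number {
  return stats.visitors === 0 ? 0 : stats.conversions / stats.visitors;
}

function relativeChange(a: VariantStats, b: VariantStats): number {
  const rateA = conversionRate(a);
  const rateB = conversionRate(b);
  return rateA === 0 ? 0 : (rateB - rateA) / rateA;
}

// Placeholder example: B converting roughly 19 percent worse than A.
const a: VariantStats = { visitors: 2000, conversions: 100 }; // 5.00%
const b: VariantStats = { visitors: 2000, conversions: 81 };  // 4.05%
console.log(`B vs A: ${(relativeChange(a, b) * 100).toFixed(1)}%`); // ≈ -19.0%
```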


In retrospect, I think my attempt to add social proof may have actually reduced trust. Even though it was obvious that the page was intended to get people to apply for leases with one particular property company, visitors appreciated that we provided information for all of the properties. By adding social proof only to that company’s listings, I think the page came across as less informative and more like a hard sell.


Takeaway: Social proof is great unless it makes customers skeptical about your motives.




Testing Tools


If you’re interested in taking your A/B testing to the next level, there are plenty of great tools to make that a possibility for you. The following are just a few of the tools that I have used and been impressed by over the years.



  • VWO – This is the tool I recommend for people who are just getting started with serious A/B testing. The tool makes it quick and easy to create multiple versions of a single web page by changing headlines, call to action buttons, and other elements. Then, as you are testing, the tool provides you with all of the analytical data that you need.
  • Kissmetrics – If you are already using Kissmetrics for your analytics collection, you can also take advantage of its A/B testing functionality. The tool randomly assigns each visitor to one of the variations you are testing, remembers who that visitor is, and makes sure they see the same variation on subsequent visits, so your tests stay consistent and return clean data. (There is a rough sketch of this kind of sticky assignment after this list.)
  • 5 Second Test – This one is not strictly an A/B testing tool, but it is definitely useful for testing changes to your landing pages. This test lets you know what people remember after looking at your landing page for just 5 seconds.
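As promised above, here is a conceptual TypeScript sketch of “sticky” variant assignment like the Kissmetrics bullet describes: hash a stable visitor ID so the same person always lands in the same variation. This illustrates the idea only; it is not Kissmetrics’ API.

```typescript
// Deterministic ("sticky") variant assignment: the same visitor ID always
// maps to the same variant, so returning visitors see a consistent page.

function assignVariant(visitorId: string, variants: string[]): string {
  // Simple FNV-1a style string hash; same input, same output.
  let hash = 2166136261;
  for (let i = 0; i < visitorId.length; i++) {
    hash ^= visitorId.charCodeAt(i);
    hash = Math.imul(hash, 16777619);
  }
  return variants[Math.abs(hash) % variants.length];
}

// The visitor ID could come from a first-party cookie or your analytics tool;
// as long as it is stable, the assignment stays consistent across visits.
console.log(assignVariant("visitor-12345", ["A", "B"]));
```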

Conclusion


Does all of this mean that web designers and developers should just throw the rules out the window? Of course not. In most cases, the best results will come from following your instincts and sticking with the rules of design; after all, they were established for good reason, and overall they are based on solid science. The point, however, is that nothing should be considered above testing, and that we should never allow our assumptions, or the rules, to get in the way of solid research, testing, and, most importantly, data.
