This is part 4 of a 4-part series on conversion optimization. For part 3 on Testing, go here. To begin the series with part 1 on Targeting, click here.

 

The Results Are In! This Is Why You Should Be Testing


By now you're feeling like a true CRO rockstar. You've clearly identified the actions and goals you want completed on the site. You've tightened up your marketing and worked methodically through your conversion research and findings. You've developed a testing roadmap based on pages you've scored, and from that you've come up with dozens of tests to run. With confidence you jump in, create your first hypothesis and A/B test, and even build additional variants to test against your original page. You're ready to pull the trigger and start, but wait... something scary crosses your mind. What if I fail?

Don't worry. This is a reasonable fear to have when you do this kind of work. The odds of getting it right the first time are against you, so be prepared to run several rounds of tests for each page. Remember, NOBODY knows for certain which test is going to work, and testing is truly about learning, not winning or losing. Ultimately, some tests will win and produce a nice lift, while the majority will perform the same as, or even worse than, the original. So what do you do, just give up? Of course not!

A study by ConversionXL found that only 1 in 7 tests produces a winning variation. That's just a 14 percent win rate. Horrible, right? Wrong! What matters is that you learn from each test, update your hypothesis, pick yourself up, and get back in the ring to test again. With experience and time you'll start to craft better hypotheses and ultimately increase your winning percentage.

Losing a test also doesn't mean you did something wrong, which is why you need to make sure everyone understands from the beginning that testing is about learning. No one ever increased performance by sitting on their hands and doing nothing. Rather, pat yourself on the back for getting out there and trying to make a difference ;). The bottom line is that you can always learn a thing or two from failures. Results, good or bad, will always provide insights for further testing.

Testing in Action With Experience Columbus


After working with Simpleview to create and launch a multi-award-winning website, experiencecolumbus.com, the Greater Columbus Convention & Visitors Bureau decided to focus their efforts on increasing eNewsletter sign-ups. To do so, they again partnered with Simpleview, this time on a CRO engagement.

After some initial analysis of the Experience Columbus site, our CRO team noted that while the homepage provided a call-to-action in the footer to sign up for the eNewsletter, other internal pages on the site did not. So we threw out the preconceived notion that “no one scrolls to the bottom of the page” and decided to test a footer change. Our team created and implemented a footer variation for the interior pages of the site that served as a consistent call-to-action, giving visitors an opportunity to sign up for the eNewsletter.

The Results

After testing this additional footer option for 15 days across approximately 50,000 visitors, our team reached statistical significance sufficient to support making the change. Results showed a 513 percent increase in conversions and a 518 percent increase in conversion rate (average conversions per session).
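If you want to sanity-check "statistical significance" on your own tests, here's a minimal sketch of a two-proportion z-test in Python, one common way to compare conversion rates between a control and a variation. The visitor and conversion counts below are hypothetical placeholders for illustration only, not the actual Experience Columbus data, and most testing tools will run this kind of calculation for you.

```python
# Minimal sketch: two-proportion z-test for an A/B test.
# All numbers are hypothetical placeholders, not real campaign data.
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) comparing conversion rates of A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-sided p-value from the normal CDF
    return z, p_value

# Hypothetical 50/50 split: ~25,000 sessions per variation
z, p = two_proportion_z_test(conv_a=300, n_a=25000, conv_b=360, n_b=25000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a p-value below 0.05 would support calling a winner
```

With these made-up numbers the test comes out significant at the conventional 95 percent confidence level; with smaller samples or a smaller lift, the same difference could easily be noise, which is why running a test long enough matters.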

At the end of the day, there are no secret recipes, formulas, or magic tools that get you these kinds of results. You'll win some and you'll certainly lose some, but ultimately it boils down to hard work, tons of research, and a solid process to get the most out of your tests.

Editor's Note: This post was originally published in May 2016 and has been updated for accuracy and comprehensiveness.