Insound was one of the very first online music stores. Always a niche business with a great brand and loyal following, the site sells vinyl, turntables, t-shirts and silk-screened posters to the nerdiest of music hipsters. Though the company has outlasted all of its competition, the business still carries razor-thin margins and the Insound team is a small one. Without the luxury of big customer-acquisition budgets or the breathing room to make big mistakes, they rely heavily on iterative conversion optimization to drive growth.
In late 2012, Insound launched a disruptive new checkout flow that looked great – especially on tablets and phones – and had very bold form field validation. Successful preliminary user testing had emboldened them to commit to the new visual treatment. But the experience was unlike any checkout their users had seen before, and the team soon found that conversion rate was actually underperforming after the launch of the new design. Insound asked us at Clearhead to dig into these challenges and provide solutions…in a hurry. Examining the checkout funnel, we believed that performance would stabilize in the long term, but hypothesized that, in the near term, the length of the checkout process and the vague wording throughout were eroding conversion rates.
Finding a quick solution to mend the checkout funnel was key for Insound, since they had holiday marketing promotions set to launch in the near future. But they didn’t want to spend any marketing dollars on promos that would send visitors through a leaky funnel.
So Insound came to us to implement testing software, create a test plan, and drive results in only 2.5 weeks – short order by anyone’s standards. Both we and the Insound team suspected that repeatedly using the “Continue” button in the funnel was causing confusion and leading to abandonment on the Bill/Ship details page of the checkout flow. We believed a more specific or enticing call-to-action button would improve conversion rates.
While a button test is a relatively simple one, in this case we hypothesized that it could have an outsized impact on conversion and get the funnel optimized before the targeted mid-November promotion launch. Using Optimizely, we tested four button variations: “Submit,” “Almost Done,” “Review Order” and the original, “Continue.” We let the test run for nearly two weeks. Between us, we put our money on the punchier variation, “Almost Done,” as the winner.
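Under the hood, a test like this comes down to two things: bucketing each visitor into exactly one variant (and keeping that assignment stable across page loads) and tallying conversions per variant. Optimizely handles all of this for you; purely as an illustration of the mechanics, here is a minimal sketch in Python. The function names, experiment key, and visitor IDs are our own hypothetical choices, not Optimizely's API.

```python
import hashlib

# The four button labels from the experiment described above.
VARIANTS = ["Continue", "Submit", "Almost Done", "Review Order"]

def assign_variant(visitor_id: str, experiment: str = "checkout-button") -> str:
    """Deterministically bucket a visitor into one of the variants.

    Hashing (experiment, visitor_id) keeps the assignment stable, so a
    returning visitor always sees the same button during the test.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]

def conversion_rate(conversions: int, visitors: int) -> float:
    """Share of visitors on the page who continued to the next step."""
    return conversions / visitors if visitors else 0.0
```

The key design point is determinism: because assignment is a pure function of the visitor ID, no per-visitor state needs to be stored, yet the split across variants is roughly uniform.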
But even digital optimization strategists cannot predict the habits of thousands of website visitors. Not only did “Review Order” increase conversion on the page by 8%, it beat all other variations by well over 30%. With the “Review Order” button, this page now has a 54% conversion rate – more than half of visitors move on to the next page, where they actually review the order and check out. Annualizing the resulting increase in conversion rate produced a nice revenue lift, and because this revenue didn’t come from additional sales and marketing spend, it’s very margin-rich.
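Before declaring a winner on numbers like these, it's worth checking that the observed lift isn't noise. A standard tool is the two-proportion z-test, which Optimizely's stats engine applies (in a more sophisticated form) behind the scenes. The sketch below uses only Python's standard library; the visitor and conversion counts in the test are hypothetical, not Insound's actual traffic.

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test comparing conversion rates of variants A and B.

    Returns (z, p_value): z is the standardized difference in rates
    under the pooled null hypothesis; p_value uses the normal CDF.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided tail probability.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

The practical takeaway is sample size: a 4-point lift on a ~50% baseline needs on the order of a couple thousand visitors per variant to reach conventional significance, which is why the test above was left running for nearly two weeks.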
We believe this experiment provides four key lessons for testers:
Don’t rely on user testing alone. User testing is great for quick and light validation, but it does not guarantee success. Ultimately, A/B testing yields much greater confidence.
Tablet and desktop are not the same. You should not assume that disruptive experiences that look (and perform) great on tablets or phones will translate well to the desktop, where the majority of shoppers still are. Again, use A/B testing to validate even the simplest hypotheses.
Be precise in the checkout funnel. Customers want to know exactly where they are in the process and how many steps remain until the end.
Testing works on a tight timeline. Don’t worry! Even in a pinch, it’s possible to find conversion increases. Together with Optimizely, we were able to deliver material improvements on a tight timeline for Insound.com. You can, too.
When have you been able to drive meaningful change through testing on such short order?