Sean Ellis is a startup marketer with experience growing early-stage products into household brands (his work includes Dropbox, Eventbrite, Lookout, Xobni, and more). Sean is also the CEO of Qualaroo, a qualitative insights survey platform. Follow him @SeanEllis, on his blog, or as an active contributor on GrowthHackers.com.
As one of the formative voices behind the growth hacking movement in Silicon Valley and beyond, Sean spends a great deal of time discussing the benefits of conversion rate optimization and A/B testing. We recently asked him to share tips for running better experiments and his outlook on the optimization industry.
Optimizely: What do you think is the most common misconception about Conversion Rate Optimization and A/B testing?
Sean: The number one misconception is that A/B testing is simply about running a test here and there and hoping for improved results. Without an optimization process that focuses on continual improvement, A/B testing often fails to live up to its promise. Without the organizational rigor to make A/B testing a priority, companies give up when they fail to see early wins, which ultimately costs them the long-term gains that come from a systematic approach to optimization.
Do you think that most companies online are effectively communicating with their customers? Why or why not?
Most companies lack a process for regularly collecting and then taking action on user feedback. Users provide feedback in numerous ways, from bouncing off web pages, to taking surveys, leaving reviews, filling out customer support tickets, and posting on social media. With all of this feedback coming in, you’d expect companies to be constantly processing it and using it to improve the visitor experience, and ultimately, their business. But more often, feedback is triaged to manage customer complaints rather than used for true learning and business improvement. The most successful companies have a process for collecting, parsing, and using customer feedback to improve their business.
What is the best way to get actionable data from a website survey?
The single best way to get actionable qualitative data from a website survey is to use it to understand specific user behavior. For example, running a website survey on pages that have high bounce rates, or on pages within your conversion funnel that have high drop-off rates, can give you insights straight from the customer that help you understand why they’re leaving. When you understand why a behavior is happening, it’s much easier to take action on the feedback and try to change the behavior.
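The targeting step above can be sketched in code. This is a minimal, hypothetical example (the page paths, bounce rates, and threshold are invented for illustration, not from the interview): given per-page bounce rates from your analytics tool, flag the pages worth surveying, worst first.

```python
# Hypothetical sketch: pick survey target pages from analytics data.
# All paths, rates, and the 0.6 threshold are illustrative assumptions.

def survey_candidates(page_stats, bounce_threshold=0.6):
    """Return (path, bounce_rate) pairs above the threshold,
    sorted worst first -- the pages where a "why are you leaving?"
    survey is most likely to explain real drop-off."""
    flagged = [(path, rate) for path, rate in page_stats.items()
               if rate > bounce_threshold]
    return sorted(flagged, key=lambda pair: pair[1], reverse=True)

stats = {"/pricing": 0.72, "/features": 0.35, "/signup/step-2": 0.81}
print(survey_candidates(stats))
# -> [('/signup/step-2', 0.81), ('/pricing', 0.72)]
```

In practice the survey tool itself (Qualaroo, for instance) handles page targeting; the point of the sketch is simply that survey placement should be driven by behavioral data, not chosen arbitrarily.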
Do you have any tips on how to choose which feedback should be incorporated into your testing pipeline?
Feedback from qualified people is the most important. Potential customers who aren’t converting are the ones you want to focus on. Ignore the people who were never qualified or interested in what you’re offering in the first place. For example, asking people who converted a question such as “What almost stopped you from signing up?” or “What made you decide to sign up?” helps you understand the needs of qualified visitors. This qualified feedback will help you sort through the data from exit surveys that include both qualified and unqualified responses.
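The sorting step Sean describes can be sketched as a simple grouping pass. This is a hypothetical example (the response records and the `converted` flag are invented for illustration): split free-text survey answers by whether the respondent converted, so answers from qualified visitors can anchor your reading of the mixed exit-survey pool.

```python
# Hypothetical sketch: separate qualified feedback from the rest
# before feeding it into a testing pipeline. The records below and
# the 'converted' flag are illustrative assumptions.

def split_feedback(responses):
    """Group free-text answers by conversion status. Answers from
    converted visitors ("What almost stopped you from signing up?")
    are the qualified signal; the rest is triaged separately."""
    qualified, other = [], []
    for response in responses:
        bucket = qualified if response.get("converted") else other
        bucket.append(response["answer"])
    return {"qualified": qualified, "other": other}

responses = [
    {"converted": True,  "answer": "Pricing page was confusing"},
    {"converted": False, "answer": "Just browsing"},
    {"converted": True,  "answer": "Unsure about the trial terms"},
]
print(split_feedback(responses))
# -> {'qualified': ['Pricing page was confusing',
#                   'Unsure about the trial terms'],
#     'other': ['Just browsing']}
```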
What would you tell someone who is looking for help creating strong hypotheses for their tests?
There are two ways to create a hypothesis. In the first case, you can look at the data and then spend hours or days with your team trying to interpret what it means and what you should test next, building hypotheses from your interpretation and your team’s opinions. Or you can ask visitors to that page what problem they’re actually encountering. No need to interpret data; just ask visitors and get immediate feedback. I believe that asking visitors what issues they’re encountering on your site, in other words conducting user research, is the best way to formulate strong hypotheses that make for valuable tests.
(For more tips on how to incorporate data into your experiment hypotheses, download the guide to Building your Company’s Data DNA.)
How do you think companies should approach staying focused when it comes to optimization? How do you measure the progress of your CRO program?
Having the organizational rigor to stay focused on conversion optimization is the hardest part of this process. A/B testing is not a one-off project—rather it is a continuous process of improvement that needs to be in motion at all times. eConsultancy reports that 87% of companies doing A/B testing run between 1 and 5 tests each month. The best companies run many times that number.
Companies need to commit to A/B testing as a core part of their digital marketing program and invest in it accordingly. Stick to the process and eventually it will become an addictive habit, with the organization constantly trying to outdo its previous test. Systematic, ongoing A/B testing is going from a competitive advantage to a competitive necessity. It’s a key requirement for online success today.