December 2, 2015

How to run more effective A/B tests: An Optichat recap

Cara Harshman


Good ideas are the foundation of our work as optimizers. The source of an idea, whether it comes from our heads, a user testing session, or our website analytics, matters less than its quality. Instead of one-off ideas, we need to equip ourselves with scalable strategies for brainstorming and running effective A/B tests with solid hypotheses. Hiten Shah, founder of CrazyEgg, co-founder of KISSmetrics, curator of SaaS Weekly, and author of hitenism.com, joined #optichat to discuss tactics for running effective A/B tests.

Here’s a recap of the questions, along with answers from Hiten and the #optichat community.

What are the attributes of a good A/B test idea?

From Hiten:

  • When A/B testing, make sure you’re documenting the experiments before and after. This enables faster experimentation and growth.
  • An A/B test hypothesis is an educated guess informed by data from user behavior, research and past experiments.
  • When A/B testing, doing research helps make sure you are focused on the right things.

From the community:

  • If you ask “what should we test next”, you have no idea what you’re doing. Do conversion research!  –@peeplaja
  • Research based hypothesis. Every test should answer a question.  –@MikeStLaurentWF
  • Lay the foundation for a few deeply-researched A/B tests. Testing begets testing, so often the strategy becomes domino effect –@KMRivard

See all the answers to question 1 here.

Why is it important to spend time researching A/B tests?

From Hiten:

  • You’ll severely limit your long-term growth by not doing research when coming up with A/B test ideas
  • Research helps you focus on creating better experiments that are more likely to improve your funnel & reduce user friction

From the community:

  • It’s important to research your A/B tests because it increases chances of success. Too much failure kills org support. –@seanellis
  • I keep a log of all tests – what we tested, what we learned, how long the tests ran, why we felt it worked/didn’t work, etc. –@Beymour
  • Research helps me from re-inventing the wheel and focusing on experiments that will move the needle. –@bl_bennett12
  • Research helps you find the global maximum and get over the local maximum. –@JenPwns
  • Research takes your optimization program away from opinion and toward empathy. –@avramescu

See all the answers to question 2 here.

What are the main sources of data you use in A/B test research?

From Hiten:

  • Quantitative data mostly comes from marketing channels, funnels, retention cohorts and product usage data
  • Qualitative data can come from surveys, interviews, user research and any other ways to understand why users do what they do
  • If you want to get growth by A/B testing, you have to get the knowledge out of people’s heads and start documenting it

From the community:

  • We get a good amount of testing ideas from talking to the non-web/tech ppl in the office. What are their UX pain points? –@lmaaa
  • Voice of customer, analytics, heuristics. eye tracking, click tracking, time on site, etc. -@FormosaChris
  • industry standards – blog ideas, industry standard conversion rates, competitors’ sites – can round out your data sources too –@herartofseeds
  • Also don’t forget results of A/B test can and SHOULD influence what you test next –@BryantGarvin

See all the answers to question 3 here.
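
As a concrete illustration of the retention cohorts Hiten mentions, here is a minimal sketch in plain Python that buckets users into signup-week cohorts and computes week-1 retention. The event data is hypothetical; a real analysis would run against your actual event store.

    from collections import defaultdict
    from datetime import date

    # Hypothetical event log: (user_id, event_date) pairs; a user's first
    # event is treated as their signup.
    events = [
        ("u1", date(2015, 11, 2)), ("u1", date(2015, 11, 10)),
        ("u2", date(2015, 11, 3)), ("u2", date(2015, 11, 4)),
        ("u3", date(2015, 11, 9)),
    ]

    signup = {}                      # user_id -> signup date
    active_weeks = defaultdict(set)  # user_id -> week offsets with activity

    for user, day in sorted(events, key=lambda e: e[1]):
        signup.setdefault(user, day)
        active_weeks[user].add((day - signup[user]).days // 7)

    # Cohort = ISO week of signup; week-1 retention = share of the cohort
    # that came back during the week after signing up.
    cohorts = defaultdict(list)
    for user, day in signup.items():
        cohorts[day.isocalendar()[1]].append(user)

    for week_num, users in sorted(cohorts.items()):
        retained = sum(1 for u in users if 1 in active_weeks[u]) / len(users)
        print(f"signup week {week_num}: {len(users)} users, "
              f"week-1 retention {retained:.0%}")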

Besides being pretty, what are heatmaps good for?

From Hiten:

  • Heatmaps help improve your website by informing you as to what page elements you can add, remove and move when A/B testing
  • Heatmaps help with A/B testing because they show you exactly where people are clicking and where they are not
  • Heatmaps can be a very useful tool for understanding what people are doing on your website

From the community:

  • When using heatmap for your research, remember to look at not only your desktop heatmap, but also mobile and tablet –@idowebid
  • Heat maps highlight the gap between what you’ve designed and what users are doing. –@MRSallee
  • What elements of your page are users paying attention to? Are you guiding them along the journey or are they getting lost? –@milwaukeePPC
  • Treat heatmaps as an analytics source. You can view each page as a step in the funnel to ID friction points and bottlenecks. –@avramescu

See all of the answers to question 4 here.
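
Under the hood, a click heatmap is just recorded click coordinates binned into a grid, which is why treating heatmaps as an analytics source, as @avramescu suggests, works so well. A minimal sketch with hypothetical click data:

    from collections import Counter

    # Hypothetical recorded clicks as (x, y) page coordinates in pixels.
    clicks = [(102, 40), (110, 44), (980, 38), (105, 41), (510, 600)]

    BIN = 50  # bucket size in pixels; smaller bins give a finer-grained map

    # Bin each click into a 50x50px cell and count clicks per cell.
    cells = Counter((x // BIN, y // BIN) for x, y in clicks)

    # The hottest cells are the regions drawing the most attention; cold
    # cells that contain calls to action are candidates for A/B tests.
    for (cx, cy), n in cells.most_common(3):
        print(f"cell x={cx * BIN}-{(cx + 1) * BIN}px, "
              f"y={cy * BIN}-{(cy + 1) * BIN}px: {n} clicks")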

What are some great ways to prioritize and keep track of your test ideas?

From Hiten:

  • Whatever you do, if you just start documenting your experiments, prioritization becomes much more straightforward
  • This growth canvas tool is specifically designed to help in prioritizing & tracking A/B tests

From the community:

  • We always use mind maps at our office. First you brainstorm, then you prioritize and group related ideas. –@egaal
  • There’s a lot of Excel in my life… –@SamanthaSawyer
  • I really like as a tool for prioritizing and tracking test ideas and archiving results. –@granttilus
  • The folks at are working on a great project management tool for A/B tests –@takeshiyoung

See all of the answers to question 5 here.
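
Hiten’s advice boils down to: document first, and prioritization gets easier. One minimal way to sketch that is a list of idea records scored with the common impact x confidence x ease (ICE) scheme. The ideas and scores below are hypothetical, and ICE is just one illustrative method, not necessarily what the growth canvas tool uses.

    # A minimal experiment log as plain records, scored with ICE
    # (impact x confidence x ease, each rated 1-10).
    ideas = [
        {"hypothesis": "Shorter signup form will lift completions",
         "evidence": "40% drop-off on the form step in analytics",
         "impact": 8, "confidence": 7, "ease": 9},
        {"hypothesis": "Testimonials above the fold will lift trial starts",
         "evidence": "Survey respondents cite trust concerns",
         "impact": 6, "confidence": 5, "ease": 6},
    ]

    def ice(idea):
        return idea["impact"] * idea["confidence"] * idea["ease"]

    # Highest-scoring, best-evidenced ideas go to the top of the queue.
    for idea in sorted(ideas, key=ice, reverse=True):
        print(f"{ice(idea):>4}  {idea['hypothesis']}  ({idea['evidence']})")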

How do you think about win rate? Is this a metric A/B testing teams should try to improve?

From Hiten:

  • If you have a very low win rate (1 test out of 10 wins), you should start informing your experiments with more research
  • Whether a test wins or loses, make sure you learn and keep testing. That’s how you’ll get a higher win rate in the long run

From the community:

  • Be careful with incentivizing a/b test win rates, we tend to forget about metrics that also shouldn’t go down –@davidjbland
  • NOO! It’s about insights. You take what you learn, modify your approach and get going again. Growth doesn’t happen on day 1! –@AswinKumar
  • Losers can also save you from going down costly paths. “Aren’t you glad you tested this first?” –@avramescu
  • It’s hard to maintain a high win rate, but my team averaging around 50% wins on our tests. Maybe we’re too conservative! –@seanellis

See all of the answers to question 6 here.
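
If you keep the kind of experiment log discussed above, win rate falls straight out of it, and Hiten’s rule of thumb (roughly 1 win in 10 means you need more research) can be checked automatically. A minimal sketch with hypothetical outcomes:

    # Hypothetical outcomes pulled from an experiment log.
    results = ["win", "loss", "inconclusive", "win", "loss",
               "loss", "win", "inconclusive", "loss", "loss"]

    win_rate = results.count("win") / len(results)
    print(f"{len(results)} tests run, win rate {win_rate:.0%}")

    if win_rate <= 0.10:  # roughly 1 win in 10, per Hiten's rule of thumb
        print("Low win rate: inform the next experiments with more research.")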

Which A/B testing cliche would you debunk?

From Hiten:

  • Testing isn’t just about all the tactics you read. You have to actually test them for yourself!
  • All those tactics you read about can’t get you growth without a system for experimenting and learning from your own users

From the community:

  • Testing for testing sake. Just because you can turn something on/off doesn’t mean you will get results. –@altos
  • “I don’t have enough time to test.”  –@lmaaa
  • The Big Orange Button is always a winner –@ohdubz
  • Debunk “everyone should test and always be testing”. Only if you have the volume –@CherryManrao

See all of the answers to question 7 here.
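
@CherryManrao’s caveat about volume can be made concrete: whether you have enough traffic to test depends on your baseline conversion rate and the smallest lift you care about detecting. The standard two-proportion sample-size approximation below (a Python sketch with hypothetical numbers; NormalDist needs Python 3.8+) shows why low-traffic sites struggle:

    from statistics import NormalDist

    def sample_size_per_variant(baseline, lift, alpha=0.05, power=0.80):
        """Approximate visitors needed per variant for a two-proportion
        z-test to detect a relative `lift` over a `baseline` rate."""
        p1 = baseline
        p2 = baseline * (1 + lift)
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_beta = NormalDist().inv_cdf(power)
        p_bar = (p1 + p2) / 2
        n = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
             + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
        return int(n / (p2 - p1) ** 2) + 1

    # Detecting a 10% relative lift on a 3% baseline conversion rate
    # takes tens of thousands of visitors per variant:
    print(sample_size_per_variant(0.03, 0.10))  # ~53,000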

How can you convince your company to be more experimental in 2016?

From Hiten:

  • In a world full of competition, your best advantage comes from understanding your customers better than anyone else
  • Look around your company, and if people are making decisions that seem like guesses, it’s time to start experimenting
  • Continuous testing helps you learn how to solve customer problems as they come up instead of as competitors create solutions

From the community:

  • Keep telling people that “it’s just an experiment,” not a decision to go with one thing or another –@caraharshman
  • If you are not testing you are essentially gambling business performance on gut feeling –@bhas
  • Document a rigorous process, begin to show small wins, build a culture of test ideation –@JoshRodriguez
  • Answer every single comment with “You Should Test That” Nobody knows the impact of anything without testing –@MikeStLaurentWF
  • I like to start with a list of things that need to be fixed – with supporting evidence. This starts momentum –@seanellis

See all of the answers to question 8 here.
