A/B testing (also known as split testing or bucket testing) is a method of comparing two versions of a webpage or app against each other to determine which one performs better. By creating an A and a B variant and testing them against each other, you can use data and statistics to validate new design changes and improve your conversion rates.
Running an A/B test that directly compares a variation against the current experience lets you ask focused questions about changes to your website or app, and then collect data about the impact of those changes. Testing takes the guesswork out of website optimization and enables data-informed decisions that shift business conversations from "we think" to "we know." By measuring the impact each change has on your metrics, you can verify that a change actually improves results before rolling it out to everyone.
In an A/B test, you take a webpage or app screen and modify it to create a second version of the same page. The change can be as simple as a new headline or button, or as extensive as a complete redesign. Half of your traffic is then shown the original version of the page (known as the control) and half is shown the modified version (the variation).
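The traffic split is typically implemented by bucketing each visitor deterministically, so a returning user always sees the same experience. As an illustrative sketch (the function and experiment names here are hypothetical, not from any particular testing tool), you might hash a user ID together with an experiment name:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into 'control' or 'variation'.

    Hashing the user ID together with the experiment name gives each
    user a stable assignment for the life of the experiment, while
    different experiments bucket the same user independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Map the first 8 hex digits to a float in [0, 1).
    bucket = int(digest[:8], 16) / 0x100000000
    return "control" if bucket < split else "variation"

# A given user always lands in the same bucket for a given experiment.
print(assign_variant("user-42", "homepage-headline"))
```

Because the assignment is a pure function of the user and experiment IDs, no per-user state needs to be stored, and the split ratio can be tuned by changing the `split` threshold.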
As visitors are served either the control or variation, their engagement with each experience is measured and collected in an analytics dashboard and analyzed through a statistical engine. You can then determine whether changing the experience had a positive, negative, or no effect on visitor behavior.
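The statistical engine's core job can be illustrated with a small example. Assuming conversions are simply counted per variant, a two-proportion z-test (one common choice; real platforms may use different tests or sequential methods) indicates whether an observed difference is larger than chance alone would explain:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: control converts 200/4000, variation 260/4000.
z, p = two_proportion_z_test(200, 4000, 260, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p well below 0.05: unlikely to be chance
```

A low p-value (conventionally below 0.05) suggests the variation's lift is real rather than random noise, though the threshold and required sample size should be decided before the test starts.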
A/B testing allows you to make data-driven decisions instead of hunches and guesses. Rather than launching features to your website or app and then hoping for the best, testing allows you to confirm or disprove your hypotheses before committing to changes.
Testing allows you to optimize your site or app experience to improve conversion rates in a methodical way. A higher conversion rate means getting more value from your existing users instead of having to pay more on traffic acquisition.
Aside from business results, testing can help transform your workplace culture and create a more data-driven environment. Rather than simply deferring to the Highest Paid Person's Opinion (aka HiPPO), you can use data and facts to determine the direction of your product.
Whether you are a marketer, designer, or developer, A/B testing is a simple way to use the power of data and statistics to reduce risk, improve results, and become more data-driven in your work.
The following is an A/B testing framework you can use to start running tests:
1. Collect data: use analytics to find high-traffic pages and the points in your funnel where visitors drop off.
2. Identify goals: choose the conversion metric that will determine whether the variation beats the original.
3. Generate a hypothesis: state why you believe a specific change will outperform the current version.
4. Create a variation: make the desired change to your site or app.
5. Run the experiment: serve the control and variation to randomized visitors until you have enough data.
6. Analyze the results: check whether the difference is statistically significant and act on the outcome.
If your variation is a winner, congratulations! See if you can apply what you learned to other pages of your site, and keep iterating on the experiment to improve your results. If your experiment produces a negative result or no result, don't fret: treat it as a learning experience and generate new hypotheses to test.
Whatever your experiment's outcome, use your experience to inform future tests and continually iterate on optimizing your app or site's experience.
The following is a list of ideas to get you started with testing. A/B testing best practices for what to test can vary by industry, so the ideas are broken up by vertical:

Media
A media company might want to increase readership, increase the amount of time readers spend on their site, and amplify their articles with social sharing. To achieve these goals, they might test variations on:
Travel

A travel company may want to increase the number of successful bookings completed on their website or mobile app, or increase revenue from ancillary purchases. To improve these metrics, they may test variations of:
E-commerce

An e-commerce company might want to increase the number of completed checkouts, raise the average order value, or boost holiday sales. To accomplish this, they may A/B test:
B2B

A B2B company might want to increase the number of high-quality leads for their sales team, increase the number of free-trial users, or attract a specific type of buyer. They might test: