A/B testing (also known as split testing) is a method of comparing two versions of a webpage against each other to determine which one performs better. By creating an A and a B version of your page, you can validate new design changes, test hypotheses, and improve your website's conversion rate.
Testing takes the guesswork out of website optimization and enables data-backed decisions that shift business conversations from "we think" to "we know." By measuring the impact that changes have on your metrics, such as sign-ups, downloads, purchases, or whatever else your goals may be, you can verify that a change produces positive results before rolling it out.
Quantitative data speaks for itself. You and your coworkers may have hunches about how site visitors will respond to certain design elements; A/B testing allows you to show visitors two versions of the same page and let them determine the winner. Continually testing and optimizing your page can increase revenue, donations, leads, registrations, downloads, and user-generated content, while providing teams with valuable insight about their visitors.
An A/B test involves testing two versions of a web page — an A version (the control) and a B version (the variation) — with live traffic and measuring the effect each version has on your conversion rate. Start an A/B test by identifying a goal for your company then determine which pages on your site contribute to the successful completion of that goal. This differs from multivariate testing, which tests out multiple variations of a page at the same time.
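The core measurement in an A/B test is the conversion rate of each version. As a minimal sketch (the visitor and conversion counts below are hypothetical, and `conversion_rate` is an illustrative helper, not part of any real tool's API):

```python
def conversion_rate(conversions: int, visitors: int) -> float:
    """Fraction of visitors who completed the goal (e.g. a purchase)."""
    return conversions / visitors if visitors else 0.0

# Hypothetical results: A is the control, B is the variation.
control = {"visitors": 5000, "conversions": 50}
variation = {"visitors": 5000, "conversions": 225}

rate_a = conversion_rate(control["conversions"], control["visitors"])
rate_b = conversion_rate(variation["conversions"], variation["visitors"])
print(f"A (control): {rate_a:.1%}  B (variation): {rate_b:.1%}")
```

Whichever version drives a higher rate of visitors toward the goal is the candidate winner, pending a significance check.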
Imagine a company, Acme Widgets, that operates a web store selling widgets. The company's ultimate goal is to sell more widgets and increase its yearly revenue, so the checkout funnel is the first place Acme's head of marketing will focus optimization efforts.
The "Buy" button on each product page is the first element visitors interact with at the start of the checkout process. The team hypothesizes that making the button more prominent on the page will lead to more clicks and therefore more purchases. Using Optimizely's A/B testing software, the team makes the button red in the variation and leaves it grey in the original, then quickly sets up an A/B test that pits the two versions against each other.
As the test runs, all visitors to the Acme Widgets site are bucketed into a variation and divided equally between the red-button page and the original page. Optimizely provides an A/B testing framework that measures the number of visitors who saw each version of the button and then clicked it. It also measures the number of visitors who completed the purchase funnel and landed on the "Thank You" confirmation page.
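One common way tools implement this kind of bucketing is to hash a stable visitor ID, so each visitor is assigned a variation deterministically (they see the same version on every page load) while traffic splits roughly evenly. A minimal sketch of that idea (the variation names and `bucket` function are hypothetical, not Optimizely's actual implementation):

```python
import hashlib

def bucket(visitor_id: str, variations=("original", "red_button")) -> str:
    """Deterministically assign a visitor to one variation.

    Hashing the visitor ID keeps the assignment stable across page loads
    and spreads traffic approximately evenly over the variations.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return variations[int(digest, 16) % len(variations)]
```

Because the assignment depends only on the ID, no per-visitor state needs to be stored to keep the experience consistent.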
Once enough visitors have run through the test and Optimizely indicates that the results are statistically significant, the Acme team ends the test and declares a winner. The results show that 4.5% of visitors clicked the red "Buy" button, compared with 1% for the original. The red button led to a significant uplift in conversion rate, so Acme redesigns its product pages accordingly. In subsequent A/B tests, Acme will apply the insight that red buttons convert better than grey buttons on its site.
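The significance check behind a result like this is often a two-proportion z-test on the conversion counts. A self-contained sketch using only the standard library, with hypothetical visitor counts chosen to match the 1% vs. 4.5% rates in the example (this is an illustration of the statistics, not Optimizely's exact methodology):

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 50/5000 (1%) for the control, 225/5000 (4.5%) for red
z, p = two_proportion_z_test(conv_a=50, n_a=5000, conv_b=225, n_b=5000)
print(f"z = {z:.2f}, p = {p:.3g}")
```

A p-value below a pre-chosen threshold (commonly 0.05) is what justifies calling the red button the winner rather than attributing the difference to chance.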
What assumptions are you making about your site right now? With Optimizely's A/B testing software you can easily start running experiments, even without a developer. Load your site into the Optimizely editor and click "Test it Out" to see how simple it is to set up an A/B test, or visit our Resource Library for A/B testing examples, best practices, and more.