What is A/B/n testing?

A/B/n testing is a type of website testing in which multiple versions of a web page are compared against each other to determine which has the highest conversion rate. In this type of test, traffic is split randomly and evenly among the different versions of the page to determine which variation performs best.
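To illustrate how an even, random split can work in practice, here is a minimal Python sketch of one common approach: hashing a visitor ID to assign each visitor a stable variation. The variation names and hashing scheme are assumptions for the example, not any particular testing platform's implementation.

```python
import hashlib

def assign_variation(user_id: str, variations: list[str]) -> str:
    """Deterministically bucket a visitor into one of n variations.

    Hashing the user ID gives each visitor a stable assignment while
    spreading traffic roughly evenly across all variations.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variations)
    return variations[bucket]

# Example: an A/B/n test with four versions of a landing page
versions = ["original", "variation_1", "variation_2", "variation_3"]
print(assign_variation("visitor-42", versions))  # the same visitor always sees the same version
```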

A/B/n testing is an extension of A/B testing, in which two versions of a page (a version A and a version B) are tested against each other. With an A/B/n test, however, more than two versions of a page are compared at once. “N” refers to the number of versions being tested, anywhere from two to the “nth” version.

A/B/n testing can also be contrasted with multivariate testing. A multivariate test also compares multiple versions of a page, but it does so by testing every possible combination of element-level variations at once. Multivariate testing is more comprehensive than A/B/n testing and is used to test changes to specific elements on a page, whereas A/B/n testing can be used to test completely different versions of a page against each other.
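To make the difference concrete, the sketch below contrasts the two approaches; the element names and page designs are invented purely for illustration.

```python
from itertools import product

# Multivariate: every combination of element-level changes becomes a variation.
headlines = ["headline_A", "headline_B"]
hero_images = ["image_A", "image_B", "image_C"]
cta_buttons = ["cta_A", "cta_B"]

combinations = list(product(headlines, hero_images, cta_buttons))
print(len(combinations))  # 2 * 3 * 2 = 12 combinations to split traffic across

# A/B/n: a handful of complete, distinct page designs are compared instead.
page_versions = ["design_1", "design_2", "design_3"]
print(len(page_versions))  # traffic is split across just 3 versions
```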

Why is A/B/n testing important?

A/B/n testing helps you understand which website design generates the most engagement and conversions from your users. You can test multiple pages against each other at once and use data to determine which variation to go with.

When a company has multiple competing ideas for the best website layout, A/B/n testing can be used to test each idea and render a decision based on concrete data showing how one version outperformed the others.

In addition to identifying which version of a page is most successful, A/B/n testing also shows which version performed the worst. By analyzing these low-performing pages, it is possible to come up with hypotheses for why certain features convert better than others, and those lessons can then be incorporated into new tests on other pages of the site.

An A/B/n testing case study

A real-world example of A/B/n testing in action comes from Electronic Arts, which released a new version of the popular SimCity franchise in March 2013. In their A/B/n test, EA created and tested several different versions of their pre-order page against each other to see which one performed the best.

Through the test, the EA team found that the version of the page without a special promotional offer across the top performed 43% better than the other variations in the test.

Not only did EA see a massive increase in pre-orders as a result of the test, but they were also able to apply the lessons learned to other pages across their site and see increases in conversions sitewide (read the full EA case study here).

Potential downsides of A/B/n testing

Testing too many variations (for example, when one can't be decided upon) divides the website's traffic among more variations. This increases the amount of time and traffic required to reach a statistically significant result and can create what some might call “statistical noise” in the process.
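As a rough illustration of how traffic requirements grow, the sketch below uses the standard normal-approximation sample-size formula for comparing two conversion rates. The baseline rate, expected lift, significance level, and power are assumptions for the example, and real testing tools may apply additional corrections (for instance, for multiple comparisons).

```python
from statistics import NormalDist

def visitors_per_variation(p_baseline, p_variant, alpha=0.05, power=0.8):
    """Approximate visitors needed per variation for a two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    variance = p_baseline * (1 - p_baseline) + p_variant * (1 - p_variant)
    return (z_alpha + z_beta) ** 2 * variance / (p_baseline - p_variant) ** 2

# Detecting a lift from a 5% to a 6% conversion rate
per_arm = visitors_per_variation(0.05, 0.06)
for n_variations in (2, 4, 8):
    print(f"{n_variations} variations: ~{int(per_arm * n_variations):,} total visitors")
```

Because every variation needs roughly the same number of visitors to detect the same lift, doubling the number of variations roughly doubles the total traffic the test requires.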

Another consideration when running multiple A/B/n tests is not to lose sight of the bigger picture. Just because different variables performed best in their own experiments doesn’t mean those variables will work well combined. Consider running multivariate tests to test all combinations and make sure that improvements to top-level metrics carry all the way through the conversion funnel.

Using Optimizely for A/B/n testing

Optimizely provides a platform for carrying out an A/B/n test efficiently and accurately. It guides you through the testing process, delivers detailed test results at the end, and includes a visual editor so you can make changes without having to code.

Start testing with Optimizely today and take advantage of the power of A/B/n testing!