Sony A/B Tests Banner Ads
When users land on a website, they search for relevant information and want to easily find the product that drew them there in the first place. If the content or the calls to action lack relevance and fail to spark interest, visitors tend not to engage, or they abandon the site completely. Evelien Geerens works as a web merchandiser at Sony Europe. She faced the same challenge as many other site owners, and believed A/B testing could offer a solution.
During a recent site audit, Evelien discovered that banner ads promoting customizable laptops on Sony’s European homepage and product pages were underperforming—click-throughs and purchases from the banners were lower than usual. Evelien set out to fix this conversion problem with A/B testing.
A solid hypothesis is the first step to solving a challenge like the one Evelien and Sony were facing. The banner ads in question presented two different calls to action: one advertising the customizable Sony Vaio notebooks and one promoting a monthly offer for a free memory upgrade. Evelien hypothesized that visitors were overwhelmed by seeing two competing calls to action, making them less likely to click through on the banner. However, her team’s usability research suggested otherwise. In Sony’s UX tests, Evelien learned that some site visitors were actually put off by the customizable laptop option; they believed the customization process would be too time-intensive and not worth the outcome. Faced with contradictory findings, Evelien turned to Optimizely to let live visitor data show which banner would work best.
Using Optimizely, Evelien built an A/B/C experiment to test two different variations of the banner against the original. With Optimizely’s targeting feature, Evelien targeted the experiment only to site visitors from the Netherlands and the United Kingdom—two of Sony’s largest markets for the customizable laptop campaign. One variation focused entirely on the customizable laptops, while the other focused only on the sitewide offer (a promotion on additional storage).
The overall goal of testing these different banner variations was to understand what drove Sony’s shoppers: were they looking for deals, or did they care about a personalized laptop? Evelien wanted to measure how each banner affected the number of visitors who clicked through it and then eventually entered the checkout funnel. She measured this with two goals: a click goal on the banner itself, and a pageview goal that counted each time a site visitor reached a shopping-cart landing page after clicking through a banner.
While UX research and analytics helped Evelien form a hypothesis, only A/B testing the banners could reveal a clear winner. Variation 1, the banner that focused on customization, saw a 6% increase in banner CTR and led to a 21.3% increase in visitors who reached the shopping cart compared to the original banner. Variation 2, the banner focused on the sitewide promotion, increased click-throughs by only 1.8% and actually performed worse than the original in terms of shopping-cart views, decreasing conversions by 2.9%.
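Testing platforms like Optimizely report lifts like these together with statistical significance. As a rough illustration of the arithmetic behind such numbers, the sketch below computes relative lift and a textbook two-proportion z-test from raw counts. The visitor and click counts are hypothetical (the case study reports only percentages), and this is a standard significance test, not Optimizely’s exact statistics engine.

```python
from math import sqrt, erf

def two_proportion_ztest(clicks_a, visitors_a, clicks_b, visitors_b):
    """Relative lift of B over A, plus a two-sided two-proportion z-test."""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    # Pooled click-through rate under the null hypothesis (no difference)
    pooled = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    lift = (p_b - p_a) / p_a  # relative lift of B over A
    return lift, z, p_value

# Hypothetical counts chosen to produce a 6% relative CTR lift:
lift, z, p = two_proportion_ztest(clicks_a=500, visitors_a=10_000,
                                  clicks_b=530, visitors_b=10_000)
print(f"relative lift: {lift:+.1%}, z = {z:.2f}, p = {p:.3f}")
```

With these hypothetical counts the lift is +6.0% but p ≈ 0.34, a useful reminder that a percentage lift alone does not establish significance; an experiment needs enough traffic per variation before declaring a winner.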
How do you learn even more about your visitors?
“Testing helps you make decisions based on objective results, not subjective guesses,” said Evelien. Having learned which variation of the banner Sony’s visitors preferred, Evelien wanted to dig even deeper and see how visitors from different sources and geolocations reacted to the variations. Because she had targeted the experiment by geolocation, the results included only visitors from the UK and the Netherlands. Segmenting by country revealed no significant difference: visitors from both countries preferred the banner focused on personalization.
A big surprise came when Evelien segmented the results of her experiment by the visitor’s device type. Contrary to the results she saw before, mobile visitors were more inclined to click the banner focused on the promotion: the promotion-only variation outperformed the original by 21% and also beat the banner focused on personalization. What did Evelien learn from this? Segmentation is a powerful technique for discovering that visitor groups can display different preferences depending on the device they use to access the site.
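The device-type insight comes from slicing the same experiment data along a new dimension. As a minimal sketch of what segmentation does, assuming a hypothetical event log (the real data lives in Optimizely’s results dashboard), per-segment click-through rates could be computed like this:

```python
from collections import defaultdict

# Hypothetical event log: (device, variation, clicked) tuples.
events = [
    ("desktop", "personalization", True), ("desktop", "promotion", False),
    ("mobile", "promotion", True), ("mobile", "personalization", False),
    ("mobile", "promotion", True), ("desktop", "personalization", True),
]

def ctr_by_segment(events):
    """Click-through rate per (device, variation) segment."""
    counts = defaultdict(lambda: [0, 0])  # segment -> [clicks, impressions]
    for device, variation, clicked in events:
        counts[(device, variation)][0] += clicked
        counts[(device, variation)][1] += 1
    return {seg: clicks / total for seg, (clicks, total) in counts.items()}

for segment, rate in sorted(ctr_by_segment(events).items()):
    print(segment, f"{rate:.0%}")
```

The same grouping logic extends to any dimension captured per visitor (traffic source, country, browser), which is how a single experiment can yield several segment-level findings like Evelien’s.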
One test informs the next
Having found a better-performing banner for getting visitors into the funnel where they would configure and personalize their laptop, Evelien next wanted to increase the number of people who completed the personalization process. Using her analytics tool, she could easily find the page in the configuration funnel where most people dropped out. That gave her the perfect place to start an experiment with the highest potential impact on conversions.
Find out how Evelien set up this follow-up experiment, and how Optimizely helped her reduce the number of visitors who abandoned the configuration funnel by more than 20%.