Although you don’t want to get lost in the early returns of your experiments, it can be worth checking your results page while data is still accruing to learn a little about what’s happening. And sometimes it’s just too tempting not to look! So let’s take a peek at an experiment Attic and Button is running and see what we can learn, even though the results haven’t quite reached statistical significance yet. You’ll interpret the results and answer a few questions about what you see to practice reviewing your results.
For this experiment, Attic and Button decided to try adjusting the call to action (CTA) on their product page. In the first variation, depicted with a fuchsia line, the team changed the CTA button itself, making it larger and changing its color so it would stand out. In the second variation, they kept the button the same size and simply moved it above the fold on the page so it was immediately visible to users.
Take a look at the results and see if you can interpret them.
We’ve already established that the experiment hasn’t reached statistical significance; you can see that at a glance from the bar in the top part of the page, which is currently gray (it turns red for losing variations and green for winners once you reach stat sig). But how close are they? How many visitors does the CTA Changed variation need to reach stat sig?
At this point, which variation is winning based on the information you have available?
What is the improvement percentage for the CTA Changed variation?
What’s the conversion rate for the CTA Higher Up variation?
How many visitors have seen the CTA Changed variation?
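If you’d like to sanity-check the arithmetic behind these questions, conversion rate and improvement are simple ratios: conversion rate is conversions divided by visitors, and improvement is the variation’s rate relative to the original’s. Here’s a minimal sketch; the visitor and conversion counts below are made up for illustration, not figures from the Attic and Button experiment:

```python
def conversion_rate(conversions, visitors):
    """Share of visitors who converted."""
    return conversions / visitors

def improvement(variation_rate, baseline_rate):
    """Relative lift of a variation over the original, as a percentage."""
    return (variation_rate - baseline_rate) / baseline_rate * 100

# Hypothetical counts, not real experiment data:
baseline = conversion_rate(120, 1000)      # original page: 12.0%
cta_changed = conversion_rate(150, 1000)   # CTA Changed variation: 15.0%

print(f"Improvement: {improvement(cta_changed, baseline):.1f}%")  # 25.0%
```

Your results page does this math for you, but knowing the formulas makes the numbers on the page easier to interpret.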