If you move in exciting circles like I do, you will no doubt be familiar with some of the caveats you hear about A/B testing.
For those of you with more sheltered lives, A/B testing is a technique to determine which version of a web page is more effective in influencing visitors’ behavior.
Some visitors will see one version of a page while others see a different version. You keep track of what happens on each version, then proceed with the more effective option.
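The mechanics described above can be sketched in a few lines of Python. This is a minimal illustration, not production code: the function names, the 50/50 split, and the `counts` structure are all assumptions for the sake of the example. The one real design concern it shows is deterministic assignment, so a returning visitor always sees the same version.

```python
import hashlib

def assign_variant(user_id: str) -> str:
    # Deterministic 50/50 split: hashing the user id means a repeat
    # visitor always lands on the same version of the page.
    digest = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return "A" if digest % 2 == 0 else "B"

# Tally what happened on each version (illustrative in-memory store).
counts = {"A": {"views": 0, "conversions": 0},
          "B": {"views": 0, "conversions": 0}}

def record_view(user_id: str) -> str:
    variant = assign_variant(user_id)
    counts[variant]["views"] += 1
    return variant

def record_conversion(user_id: str) -> None:
    # Called when the visitor does the thing you care about,
    # e.g. clicks the red button.
    counts[assign_variant(user_id)]["conversions"] += 1
```

In practice a real tool would persist these tallies and handle many variants, but the core loop is just this: assign, observe, compare.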
The image above, for example, shows the 2008 Obama campaign website, where the only difference between versions was the wording of the red button. See if you can guess which one was more effective. The results are at the bottom of this article.
Even small changes in A/B test options can lead to big differences in user behavior.
This is especially important on e-commerce sites, where a swing of a few percentage points can impact sales. Indeed, for startups, these can be existential differences.
But behind all the altered layouts and tweaked colors, it is important to remember one thing, according to Jennifer Golbeck (@jengolbeck), “We have no idea why one of these performed better than the other.”
Of the tests, she said, “We can come up with theories, but ultimately there’s no insight into the why.”
Collecting the data may feel like an exact science, she said. However, it comes with a temptation to build a narrative around the results.
“But there is no knowledge there. There is just a metric,” Golbeck warned.
As for the Obama campaign, the results are below. The “learn more” button saw 18.6% more sign-ups than the original “sign up” button.
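For readers wondering how a lift figure like that is calculated, and whether a given lift is more than chance noise, here is a short sketch. The visitor and conversion counts below are entirely hypothetical (the article does not report the campaign's raw traffic), and the z-test is a standard two-proportion check, not anything specific to this test.

```python
from math import sqrt

def lift_pct(conv_a, n_a, conv_b, n_b):
    # Relative lift of version B over version A, as a percentage.
    rate_a, rate_b = conv_a / n_a, conv_b / n_b
    return 100 * (rate_b - rate_a) / rate_a

def z_score(conv_a, n_a, conv_b, n_b):
    # Two-proportion z-test: how many standard errors apart are the
    # two conversion rates? Roughly |z| > 1.96 suggests the difference
    # is unlikely to be random noise at the 5% level.
    rate_a, rate_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (rate_b - rate_a) / se

# Hypothetical numbers: 1,000 visitors per version.
print(lift_pct(100, 1000, 120, 1000))  # 20% lift of B over A
print(z_score(100, 1000, 120, 1000))
```

Note that this tells you only that B beat A and by how much. As Golbeck's point makes clear, no amount of statistics on the metric explains *why* it won.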
It might be fun to speculate why this was the case. Just remember that you don’t know.
Images are partial screen grabs from Coursera. The website developers, Optimizely, explain the results in more detail on their blog.