January 21st, 2013 | 3 MIN READ

Test Your Gut: 3 AB Tests from the Gaming Industry

Written by Linda Bustos

Linda is an ecommerce industry analyst and consultant specializing in conversion optimization and digital transformation.

If you're passionate about A/B testing, you're likely familiar with WhichTestWon, a weekly showcase of A/B tests from a variety of industries with a twist: you get to test your own "gut feel" and predict which version won before revealing the results and details of each case. (If you have a case to share, WhichTestWon is currently accepting entries for its annual Testing Awards; the deadline for entry is January 25.)

Today's post looks at examples from WhichTestWon's gaming industry archive covering 3 conversion goals: social sign-on, product registration from an offline purchase, and user registration. Before reading each result, take a moment to predict which design outperformed the other.

Sign In With Facebook

Version A:

Version B:

One of these landing pages resulted in more than 2x the clicks on the Facebook Sign In button.

The winner was Version A, with its BOB (big orange button) and clear call to action. Surprisingly, 55% of voters got this one wrong.

Product Registration

Version A:

Version B:

Which presentation resulted in 40.1% more non-required registrations after installing a game purchased offline?

A whopping 78% chose Version B, but Version A was the winner. Version A's value propositions are more specific than B's: gamers expect tips and tools to be free, and "content" is less tangible than a "free town." Version A also implies more value: points redeemable at the Sims Store are more compelling than simple access to free items made by other players.

This test also demonstrates that, despite the popularity and fame of a game like Sims 3, gamers are influenced by copy. If it were just about demand for the game, we wouldn't see such a spread in conversion.

User Registration

Version A:

Version B:

Which version increased free registrations by 6% and paid sign ups by 15%?

53% got this one wrong: it's Version B. If your eyes are keen, you noticed that the log-in area at the top of the page was also removed in the winning version. IMVU first tested 640 recipes using multivariate testing (MVT); this A/B test was a follow-up to validate the MVT results. The multivariate data revealed that removing the log-in area was responsible for a 10.2% lift, versus just 3% for the hero shot.
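A lift number like 6% or 15% is only worth acting on if it's statistically significant. As a side note, here is a minimal sketch of how you might sanity-check an A/B result with a two-proportion z-test; the visitor and conversion counts below are hypothetical, not figures from the IMVU test.

```python
from math import sqrt, erf

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    conv_a / n_a: conversions and visitors for the control (A)
    conv_b / n_b: conversions and visitors for the variation (B)
    Returns (relative lift of B over A, two-tailed p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical traffic: 1,000 visitors per version, 100 vs 110 conversions
lift, p = ab_test(100, 1000, 110, 1000)
print(f"lift: {lift:.1%}, p-value: {p:.3f}")
```

With these made-up numbers the variation shows a 10% relative lift, but the p-value is well above 0.05, so the result would not be significant at conventional thresholds; you would need more traffic before declaring a winner.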

Testing inspiration

No matter what industry you're in, these tests are all good examples of what you should be testing: calls to action (placement, design, labeling, colors, etc.), presentation of value propositions (headlines, body copy, format, placement), and radical redesigns, where you may discover that something as simple as a log-in area has an impact.

The stats on how many professional online marketers guess the winner wrong remind us how important a testing program is to conversion optimization.

If you're looking for a great event to learn more about testing, check out WTW's Live Event in Austin, Texas this May. Or register to speak and share your own testing expertise.
