AB Testing Case Study: The Dangers of Overreacting
This is a guest post / case study contributed by Dr. Pete Meyers of User Effect.
When economic times are tough, even the most level-headed of us sometimes make the mistake of changing our websites just for the sake of change. If the money’s not coming in, something must be broken, right? Maybe, but without knowing why it’s broken, you’ve got a good chance of breaking it even more. Consider this post a parable about the dangers of overreacting.
Step 1. The Imaginary Problem
Recently, I was working with a client in the event industry who was concerned about the bounce rate on a key page and the strength of their call to action. To make a long story short, we had a page with multiple buttons, calling users to 3 different actions based on their preference. Those buttons looked something like this (altered slightly for client anonymity):
The client felt that we simply weren’t creating enough urgency with the relatively weak “Enroll Online” call to action. There was a certain logic to that argument, and it fit some conventional wisdom on conversion tactics. As usual, the biggest mistakes often have a certain logic to them.
Step 2. The Ill-advised Solution
So, we set about creating 3 variations on the first button. The logic was that “Enroll Online” was too vague and wasn’t conveying a sense of urgency, so we created a version with “Now” and the more emphatic “Now!” Exclamation points are always a wild-card in calls to action, in my experience, so it seemed worth testing. The new buttons (versions A, B and C), looked like this:
Step 3. The (almost) Disastrous Results
Fortunately, we ran a split-test (A/B/C) through Google Website Optimizer and saw a disturbing pattern very quickly. The most emphatic option (“Enroll Now!”) was showing a 37% drop in conversion! Although version B eventually leveled out a bit and only showed a minor loss, both “Now” options performed worse than the original.
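For readers who want to sanity-check a result like this before acting on it, here is a minimal sketch of a two-proportion z-test, the standard way to judge whether a gap between two variants is likely real rather than noise. The visitor and conversion counts below are entirely hypothetical, chosen only to illustrate a similar relative drop; they are not the client’s data.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return p_a, p_b, z, p_value

# Hypothetical traffic: original button vs. "Enroll Now!" (version C)
p_a, p_c, z, p = two_proportion_z_test(80, 1000, 50, 1000)
print(f"original: {p_a:.1%}, version C: {p_c:.1%}, z = {z:.2f}, p = {p:.4f}")
```

Tools like Google Website Optimizer run this kind of calculation for you behind the scenes; the point is that a large relative drop on a reasonable sample is very unlikely to be a fluke, which is why we could act on the pattern quickly.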
Interpretation is always the hardest part of testing, but I think the story goes something like this. In our rush to make a change, we forgot something very important: most of our visitors require multiple visits to convert. They typically have to compare events, get budget approval, etc., and tend to come back a handful of times before enrolling. By pushing them too hard to enroll Now!, we ultimately forgot who our customers were.
Lessons in Overreacting
Of course the lesson here is an old one: “If it ain’t broke, don’t fix it”. We didn’t have the evidence to suggest our buttons were at fault, and in the rush to change the site, we fixed something that should have been left alone. The second lesson is that testing might just save your life, or at least your ROI. Ultimately, this negative change only went out to a third of our visitors for a couple of weeks. Had we rolled the change out without testing and run it for a few months, that 37% conversion drop would have easily cost my client thousands of dollars.
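The back-of-envelope math behind that last claim is worth making explicit. With made-up traffic and order-value numbers (the client’s actual figures are not disclosed here), a sketch of the cost of shipping an untested losing variant might look like this:

```python
# Back-of-envelope revenue impact of a conversion drop (all numbers hypothetical).
def lost_revenue(monthly_visitors, baseline_rate, relative_drop,
                 value_per_conversion, months):
    """Revenue lost versus baseline if a change cuts conversion by relative_drop."""
    baseline = monthly_visitors * baseline_rate * value_per_conversion * months
    degraded = baseline * (1 - relative_drop)
    return baseline - degraded

loss = lost_revenue(monthly_visitors=2000, baseline_rate=0.05,
                    relative_drop=0.37, value_per_conversion=50, months=3)
print(f"Estimated loss over 3 months: ${loss:,.0f}")
```

Even at this modest scale, a 37% drop running sitewide for a few months adds up to thousands of dollars; the split test capped the exposure at a third of visitors for a couple of weeks.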
Dr. Peter J. Meyers is the President of User Effect, a former start-up executive, cognitive psychologist, and (nearly) lifelong programmer. For the past 11 years, Dr. Pete has worked directly with business owners and executives to improve their online return on investment. He has recently published a new e-book: Converting The Believers, a guide to using analytics, usability and testing to drive online sales.