Conversion rate optimization (CRO) is a beautiful thing. You’re already spending time and money to drive traffic, so improving your conversion rate means that you’re making your existing efforts more efficient. It creates a multiplier effect, making every visitor more valuable to you. And, with a variety of tools on the market, it can be done quickly, by anyone, limited only by your imagination and available traffic.
But while there are plenty of studies that can show you how well conversion optimization works, what they don’t show you is how differently it works for everyone. The recipe for CRO success is highly variable among industries, companies, and customer segments. No matter how smart you think you are, you’re in for an education. This is a recap of my own recent CRO education (complete with fancy graph).
I’m going to keep this high-level, and I won’t be able to share any absolute numbers (competitive intel, not wanting to get fired, etc.). But I hope the experiences I share will add at least one more landmark to the wilderness.
First Day of Class
Check out the chart above. It shows 16 months of data for a large AdWords account. My team took it over in November 2011 and immediately started pursuing growth. You can see our click volume growing steadily (blue bars). But as volume grew, our previously solid conversion rate began to erode (red line). That’s not surprising—loss of quality is a risk you take when trying to drive volume—but I didn’t want to just accept a lower conversion rate, so we got ready to start testing.
In the spring of 2012, we were working on implementing ion interactive’s Liveball landing page platform and I was eager to get our campaigns transitioned into the new tool and begin testing. To do that, we had to split-test the new Liveball landing pages against our existing non-Liveball pages. I figured it was low-risk; early small-scale tests had been successful, we had built the Liveball pages to be similar to our existing creative, and we had checked off the best practices list as best we could. So we put up the new pages, directed half of our traffic to them and…disaster.
Not that everything fell apart all at once. The traffic we sent to the new pages converted 50% worse, but we get many conversions from repeat visitors, so we figured they would catch up eventually (they didn’t). And, despite the awesome team at ion interactive, there were still a couple of finer points of Liveball that we didn’t quite grasp, and they ended up creating uncertainty in our data. We knew at the time that things weren’t going great, but in hindsight, they were terrible.
Now we had a decision to make. Things weren’t going well, but if we didn’t keep a good chunk of traffic flowing through our Liveball pages, we’d be stuck on our old, off-brand, static landing pages forever. Sure, our conversion rate might be better if we reverted in a panic, but we wouldn’t know why. So, we bit the bullet and started iterating* with Liveball. The next month (September 2012) was even worse, until the third week or so, when we started seeing things tick up. Ever since then, we’ve been on a tear. Some of the most significant results, which we achieved by testing things like calls-to-action, cross-sell opportunities, graphics and layouts:
- Conversion rate increased dramatically in Q4 2012, despite Q4 being a seasonally slow period for our business
- Jan-Feb 2013 conversion rate is +8% vs 2012, despite a 68% increase in traffic during the same period
- Jan-Feb 2013 conversion rate is +71% from May-July of 2012, the period right before we began testing aggressively
- Jan-Feb 2013 conversion rate is up over 90% from where we bottomed out in 2012
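If you’re wondering how lift figures like these are computed, it’s simple arithmetic on conversion rates, plus a sanity check that a split-test difference isn’t just noise. Here’s a minimal sketch in Python—the numbers are hypothetical, not our account’s actual data, and the two-proportion z-test shown is a generic technique, not anything specific to Liveball:

```python
from math import sqrt, erf

def lift(new_rate, old_rate):
    """Relative change between two conversion rates, e.g. 0.2 == +20%."""
    return (new_rate - old_rate) / old_rate

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variation B really convert differently
    than variation A, or is the gap plausibly random noise?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical split test: old pages (A) vs. new pages (B)
z, p = two_proportion_z(conv_a=500, n_a=10_000, conv_b=600, n_b=10_000)
print(lift(0.06, 0.05))   # 0.2, i.e. a +20% relative lift
print(p < 0.05)           # True: significant at the 5% level
```

The point of the significance check is the “available traffic” caveat from the intro: with small visitor counts, even a dramatic-looking lift can evaporate, so it pays to keep traffic flowing through a test until the math says you can trust it.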
So there you have it—my education in the conversion rate optimization school of hard knocks. I got my ass handed to me before I learned anything meaningful and my team had to hustle to turn things around, but it was all worth it. We handled adversity, we proved that conversion optimization can make a difference, and we’re crushing it now.
If you’re going to get into the conversion rate optimization game, here’s my advice:
- Accept that ‘best practices’ are no more than a rough map
- Be ready to take short-term losses, but use them to learn things about your users
- Have the right tools. Sure, Liveball let me shoot myself in the foot, but it also allowed me to iterate and fix things quickly. Without Liveball, we’d still be seeing slow erosion, instead of a lovely up-and-to-the-right trajectory.
Good luck out there. It’s a hard knock life.
*I could write pages and pages on the specific tests we did, so I’ll save them for future posts.