When In Doubt, Test It Out!

I know, corny title, but it’s true – if you are wondering about the effectiveness of a particular strategy, test, test, test. One of my mentors, who sells memory foam mattress products, is famous for this – he’s always testing, trying, evaluating, and then re-testing to see what gets the best response over time.

Over the past few years I’ve worked with hundreds of clients in a variety of industries. Surprisingly, all of them want to see results and make sales 🙂 I’ve learned through my own testing that certain things work consistently; other things may work great for my industry or what I’m doing, but they may not get the same response for a client doing a teleseminar on getting free radio advertising, or whatever other idea you might have.

I recently found a cool post while twittering. It is titled “Do Buttons Get Clicked More Than Text Links? A Case Study” by Justin Premick. It’s all about testing an email campaign. Justin and his partner Marc wanted to determine

…how to increase clickthroughs on the emails we send to our blog subscribers.

One of the ideas that came up was to replace the text links that we had been using to drive people to the blog with a “button.”

Previous testing on the website had shown that in many cases, buttons make better calls to action than text links do. We thought the same might hold true for email.

So, Marc created a button-shaped image with the words “Read More” stamped on it:

We then created A/B split tests for our Blog Broadcasts, inserted this image into one version as the call to action (to read the full post on our blog) and continued to use text links in the other version as we had before.

The emails were otherwise identical – we kept subject lines, sending dates/times and templates the same for each version.

They had a question, they thought about things they’d done in the past that had worked, and they began the test.

I like that they pointed out the A/B testing: it’s one of the only ways to really tell for sure which of two ideas works best. Otherwise too many factors may contribute to the outcome. As they said, aside from the text link versus the button, “the emails were otherwise identical”.
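The post doesn’t show any math, but this is the idea behind an A/B split: send two otherwise-identical versions, then check whether the difference in click rates is bigger than chance alone would produce. Here’s a minimal sketch of that check using a standard two-proportion z-test – the numbers are made up for illustration, not from Justin’s test:

```python
from math import sqrt

def ab_ctr_test(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test on click-through rates for an A/B email split."""
    ctr_a = clicks_a / sends_a
    ctr_b = clicks_b / sends_b
    # Pooled click rate under the assumption the two versions perform the same
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (ctr_a - ctr_b) / se
    return ctr_a, ctr_b, z

# Hypothetical split: 5,000 sends per version
ctr_button, ctr_text, z = ab_ctr_test(400, 5000, 300, 5000)
print(f"button CTR={ctr_button:.1%}, text CTR={ctr_text:.1%}, z={z:.2f}")
# A |z| above roughly 1.96 suggests a real difference at the 95% level
```

With these made-up numbers the gap (8% vs 6%) is large enough that it’s unlikely to be luck – which is exactly the kind of evidence a split test is after.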

They go on to talk about initial results:

As we expected, the button grabbed readers’ attention and incentivized them to click through, much better than the text link did…At this point, about 2 weeks into our test, it was tempting to say, “The button clearly draws more attention and clicks than text links. Let’s just start using buttons and move on to another test.”

Did they stop there? Nope!

We ultimately ran the button-versus-text split test about 40 times, over the course of several months.

For a while, the button continued to beat the text links – but we noticed that it wasn’t doing so by as large a margin as it first had.

While over our first five tests, the button beat the text by over 33%, after 20 tests it was only winning by an average of 17.29%, and the text version was beginning to hold its own in the win column.

With each new split test, the text asserted itself as the better call to action.

By the time we ended our experiment, text links were consistently outperforming our button, winning nearly two-thirds of the time, by double-digit margins as high as nearly 35%.
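The margins Justin quotes read as relative lifts – how much one version’s click rate beats the other’s, as a percentage. A small sketch of tallying wins and the average margin across a series of split tests (the per-test CTR pairs here are invented to mirror the story of the text link catching up, not Justin’s actual data):

```python
# Hypothetical (button CTR, text CTR) pairs from repeated split tests
results = [(0.040, 0.030), (0.038, 0.031), (0.033, 0.034),
           (0.030, 0.035), (0.029, 0.036)]

button_wins = sum(1 for b, t in results if b > t)
text_wins = len(results) - button_wins
# Relative lift of the button over the text link, averaged across tests
avg_margin = sum((b - t) / t * 100 for b, t in results) / len(results)
print(f"button wins: {button_wins}, text wins: {text_wins}, "
      f"avg relative margin: {avg_margin:+.1f}%")
```

Tracking results this way over many broadcasts is what let them see the button’s early lead shrink and eventually reverse.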

What can we learn from this? How does this apply to YOUR website? How do YOU plan to implement this testing case study?

To sum it up best, let’s read on to see what conclusion they came up with:

What works today may not work tomorrow.

Had we stopped our testing after one broadcast, or even one or two weeks, we would have concluded that buttons were better than text links.

It’s important to continually test your email campaigns to make sure that you know what works, rather than assuming you know what works.

Finally, one last point I feel obligated to make:

What works for someone else may not work for you.

The text links won out in our split test, but that doesn’t mean a button can’t be an effective call to action for you.

Again, don’t just take our word for it. Find out for yourself through your own testing.

Running an effective business, both on and offline, takes analysis like this if you plan to see the long-term results you want. I am currently running a test with a client email campaign and will let you all know the results in the upcoming weeks.