Conversion Optimization: Homepage CTA Button, Part 1

It’s crazy to think about all of the buzzwords around data-driven marketing. As a marketer learning data science, I’ve bought into the trend as much as anyone. But let’s skip past some of the hype and look at a specific test we’re running that’s genuinely interesting.

First off, why the hell is data-driven marketing exciting?

…Simple

Because you can run simple, measurable tests that make a huge impact.

In some cases, we use data to personalize email marketing. In other cases, we use data to decide how to spend our marketing budget. In the end, knowing what to track and how to measure it is an art unto itself.

Conversion rate optimization, click-through rates, unique clicks, bounce rate… I can go on and on… and on… Let’s be real here: with all this testing, short testing time frames, and imperfect control groups, marketers struggle to produce clear, conclusive results.

So, does that mean it’s not worth testing? No, but we need to be clear up front about what the deciding factors are. This is where we create a story around our “hypothesis” and work through the different tests until we can dissect the results. Our team wanted to test color theory in marketing: how color affects our decision to click (or not click) a CTA.

What To Test

Question: Which button color performs best for the ‘Schedule a Demo’ button on our homepage?

Hypothesis: If one button color produces a better conversion rate than the others (a conversion being a visitor who completely fills out the schedule-a-demo form OR our weekly demo form), then changing the button to that color will increase conversions.

What counts as a verified result? We’ll look at the results of the test, and if there’s a clear leader, we’ll update the homepage to that color (that will be Part 2 of this series). If the homepage then sees an increase in demo conversions after 3 weeks of measuring, we’ll call the change beneficial. Notice we’re NOT saying it’s a proven result; that’s simply too hard to prove given the number of variables in play. That being said, we do have a ‘control’.
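As a quick gut check on what a “clear leader” actually means, a two-proportion z-test is one common way to ask whether the gap between two variations is bigger than chance alone would explain. Here’s a minimal sketch in JavaScript; the visitor and conversion counts are made up for illustration, not our real data.

```javascript
// Minimal sketch: two-proportion z-test for comparing two button variations.
// All numbers below are hypothetical, not our actual test data.
function zScore(convA, visitorsA, convB, visitorsB) {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  // Pooled conversion rate under the "no real difference" assumption
  const pPool = (convA + convB) / (visitorsA + visitorsB);
  const stdErr = Math.sqrt(pPool * (1 - pPool) * (1 / visitorsA + 1 / visitorsB));
  return (pA - pB) / stdErr;
}

// Hypothetical counts: light blue converts 48/1000, orange converts 70/1000
const z = zScore(48, 1000, 70, 1000);
console.log(z.toFixed(2)); // ≈ -2.09; |z| > 1.96 suggests a real difference at ~95%
```

Roughly speaking, if |z| stays under about 1.96, the “leader” could easily be noise, and we’d hold off on crowning a winner.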

Controlling Scenarios

Our control is simple. We’ll use 4 colors: light blue (the original), dark blue, orange, and green. What works best for us will likely depend on the color scheme that already exists on our site. We have a lot of blues, with orange, red, and green accent colors.

Here’s what the 4 test versions look like.

Version 1 ‘Original’

[Image: original homepage button used in the test]

Version 2 ‘Orange’

[Image: orange homepage button test]

Version 3 ‘Green’

[Image: green homepage button test]

Version 4 ‘Dark Blue’

[Image: dark blue homepage button test]

What you’ll notice is that each of these images is almost exactly the same. The only change is a slight adjustment to the color of the button in the code: a Bootstrap class switch applied through Optimizely’s JS script (from btn-info to btn-warning, etc.). If you’re familiar with Bootstrap, these classes are easy to define in your CSS. If you’re interested in learning how I can help you further, shoot me an email.
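If you’re curious what that variation code can look like, here’s a rough sketch of the kind of snippet you’d drop into an Optimizely variation. The `.hero` selector is an assumption for illustration; yours will depend on your own markup.

```javascript
// Rough sketch of an Optimizely variation snippet: swap the Bootstrap
// class on the 'Schedule a Demo' button. The '.hero' selector is
// illustrative; use whatever matches your own page structure.
var demoBtn = document.querySelector('.hero .btn-info');
if (demoBtn) {
  demoBtn.classList.remove('btn-info');   // btn-info = light blue (our original)
  demoBtn.classList.add('btn-warning');   // btn-warning = orange variation
}
```

Because only the class changes, the button’s size, copy, and position stay identical across versions, which keeps color as the lone variable in the test.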

More testing…

While we have seen more engagement (click-throughs) with certain colors, those click-throughs have shown a somewhat inverse correlation with overall conversions. An interesting observation, but inconclusive so far. We’re still measuring the results, and we’ll publish the findings in Part 2 of this series. Make sure you get an email when it’s announced by signing up for our newsletter.

Thoughts So Far

I’m a little pessimistic about whether this test will yield meaningful results. It’s really tough to say that a simple color change can make that big of a difference, especially with all of the other factors that go into a conversion.

Yet one of the beautiful things about these types of tests is that they take hardly any time at all to set up and run. So, if we run a lot of micro-tests like this, who knows, we may learn something crazy.

One tidbit I’ve noticed so far: demo conversions dipped slightly, and the dip correlates almost exactly with the time I started the test. Correlation doesn’t equal causation, but damn, it’s worth checking out. Make sure you tune in for Part 2. The link will be posted here, so feel free to bookmark and save this page.
