This was just too good not to share: it nails both the right and the wrong way to do conversion rate optimisation. Rather than stopping at page level, it takes a broader view of the path a typical user takes to convert.
If you want your website to be successful, conversion rate optimisation is crucial. Even a half-percent increase can make a huge difference to the volume of leads your site generates and, more importantly, your ROAS. The effort-to-reward ratio certainly means you should set aside some time to test. I'll be referring to multivariate testing throughout the post. An A/B test is for when two pages are very different, rather than for small changes on a page.
Identifying areas to test
Start off by splitting a page into sections. Each section usually has a goal such as explaining the product, how a visitor gets the product, what benefits / features a product has.
We’ll take the homepage of Graze.com as an example. Take a look at their website, try to split it up in your mind and work out what the aim of each section is.
Fingers crossed you and I have split the homepage into the same 6 sections. Each is designed specifically to answer a question or to drive the visitor onto the conversion path.
- Cyan – Points to the main areas of the site. Make note of the ‘FIRST BOX HALF PRICE’.
- Blue – What the product does. Eat natural/healthy and have less biscuits.
- Green – Selling points of the product. Quality, quantity, information and delivery.
- Yellow – Call to actions. Pointing the user to essential information. What and how.
- Red – Direct benefits of having the product.
- Orange – Testimonials from big brands.
Graze.com have obviously run experiments to improve their homepage. I can’t see any sections that stand out as wishy-washy or in need of attention. But by identifying the purpose of each section, it’s easier to work out what they have already tested and what you could try testing. Here are a few ideas.
- Cyan – Different wording like ‘50% off first box’ could be tested.
- Green – A new image about choice in your box. Choose nut-free, favourite foods, etc.
- Yellow – As above, a different offer message could be tested, or ‘get started now’ could be changed to ‘pick your box’.
- Red – Rotating different benefits could help.
- Orange – The types of brands in this section could be rotated. For example, if health brands all give testimony, the product may appeal to ‘health buffs’. Or if the themes were big tech companies, it might appeal to ‘tech buffs’.
Don’t test more than your site can handle
We’ve identified the sections of a page, and what we can change. Now we need to decide how much to test at once.
The complexity of your testing will depend on how many visitors your site receives. For low-volume sites, it’s best to test fewer variations, as it will take a while for meaningful data to accumulate. For high-volume sites (greater than 1,000 visitors a day), you can happily test many aspects of a page, as within a few days you might find a winning combination.
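To put a rough number on “a while”, here’s a back-of-the-envelope sketch of my own (not a formula from this post) using the standard two-proportion sample-size approximation, at 95% confidence and 80% power:

```javascript
// Rough estimate of how many days a test needs before a result is
// meaningful, given daily traffic, a baseline conversion rate, the
// smallest lift you care about detecting, and how many variations run.
// Uses the common two-proportion z-test approximation with
// 95% confidence (z = 1.96) and 80% power (z = 0.84).
function daysToSignificance(dailyVisitors, baselineRate, minLift, variations) {
  const z = 1.96 + 0.84;
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + minLift);
  const pBar = (p1 + p2) / 2;
  // visitors needed in EACH variation
  const perVariation =
    Math.ceil((2 * pBar * (1 - pBar) * z * z) / ((p2 - p1) ** 2));
  const total = perVariation * variations;
  return Math.ceil(total / dailyVisitors);
}

// e.g. 1,000 visitors/day, 2% baseline, detecting a 20% lift, 2 variations
console.log(daysToSignificance(1000, 0.02, 0.2, 2) + ' days');
```

Plug in your own numbers: halve the traffic and the wait doubles; chase a smaller lift and it grows much faster than that.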
So, what are you waiting for?
If you have any comments or questions, feel free to use the comment box!
Often on low-traffic sites, you’ll be waiting quite a long time for a split test to show solid results. And by the time you’ve gathered a decent volume of data, will the trend have changed?
So it’s time to stop looking at just visitors and conversions and have a look at the other metrics available to you.
So, if one of your experiments is looking a little like this:
It’s time to start looking at this:
So, have both open, set Analytics to the date range that matches the start/end of the experiment and start comparing performance.
Average time on page
Which page are users staying on the longest? If a variation’s time on page is significantly higher, is its content better structured, better written or cleaner? A higher time on page is a positive sign of engagement.
Bounce rate
How many people came to your site, didn’t like the first thing they saw and ran away crying? This is a clear-cut way of seeing which variation people marmited – loved or hated. A lower bounce rate means people liked what they saw and stuck around because of it. A lower bounce rate is a positive sign of engagement.
Exit rate
Depending on the purpose of your site, this is either a good or a bad thing.
If the experiment page is meant to be the last page people see on your site, like a page that encourages the user to click on external links, then a high exit rate might be exactly what you want.
If the experiment page needs to keep the visitor on your site, then a lower exit rate is what you want to see.
Segment Cross Contamination
Sometimes you will need to dig deeper and further segment the visitors who took part in the experiment. Have a look and see how the following segments compare:
- New vs Returning visitors
- Paid vs non paid visitors
- Direct, Referral and Search Traffic
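Once you’ve exported per-segment numbers out of Analytics, a quick bit of arithmetic shows whether a variation’s overall win is being carried by a single segment. A small sketch of my own (the data and the `conversionRates` helper are hypothetical, not from the post):

```javascript
// Compute conversion rate per segment/variation pair from rows exported
// out of Analytics, so a variation that only wins for one segment
// (e.g. new visitors) stands out.
function conversionRates(rows) {
  const rates = {};
  for (const r of rows) {
    rates[r.segment + '/' + r.variation] = r.conversions / r.visitors;
  }
  return rates;
}

// Made-up example data
const rows = [
  { segment: 'new',       variation: 'A', visitors: 500, conversions: 10 },
  { segment: 'new',       variation: 'B', visitors: 500, conversions: 25 },
  { segment: 'returning', variation: 'A', visitors: 200, conversions: 12 },
  { segment: 'returning', variation: 'B', visitors: 200, conversions:  6 },
];

// Here B wins overall (31 vs 22 conversions) but loses among
// returning visitors — worth a closer look before declaring a winner.
console.log(conversionRates(rows));
```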
Avinash Kaushik has a great article on applying statistical significance in A/B testing. I’d advise you to read it!
Any questions or comments, put it below!
Thank you for reading.
So, this is the first blog post in a long while, and I’m writing it on my new HTC Desire!
So, let’s get to the point. You are setting up a multivariate (or A/B) experiment using web optimisation software, and you also have Google Analytics installed (though any analytics package with event tracking will do).
To better understand how the elements you’re testing affect users, you will need to place event tracking on each of the variants. You will then be able to set up segments in your analytics package to see how visitors to each variant behave.
Because you’re not dummies, here’s a brief how-to.
Step 1: set up your variants
Step 2: place event tracking inside script tags on each variant; this way it will execute on page load
Step 3: set up an advanced segment for each of your variants
Step 4: analyse!
Example tracking parameters:
- category = experiment
- action = experiment name
- opt_label = variation name
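To make step 2 concrete, here’s a minimal sketch of the tracking call using the asynchronous `_gaq` queue from classic ga.js (the other post here uses the synchronous `pageTracker` API; either works). The experiment and variation names are placeholders of my own:

```javascript
// Fire an event when this variation's page loads. Because _gaq is just
// a queue (an array) until ga.js processes it, this is safe to run
// before the tracker script has loaded.
var _gaq = _gaq || [];
_gaq.push(['_trackEvent',
  'experiment',           // category
  'homepage-hero-test',   // action = experiment name (hypothetical)
  'variation-b'           // opt_label = variation name (hypothetical)
]);
```

Paste the equivalent snippet into each variant, changing only the label, and your advanced segments in step 3 can then match on the event label.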
What to analyse?
- Average time on page
- What pages they visited next
- Goal conversion rates
Any questions? Not the nicest-formatted post, but oh well!
So, we had some hyperlinked images on one of our websites, each leading to the same page but with a different $_GET parameter identifying which button was clicked. As far as my knowledge goes, that would have made the pages look like duplicates in Google’s eyes, which would have a minor impact on organic rankings.
And because organic ranking is quite important, I thought I’d do some digging into tracking hyperlinks with Google Analytics. I found lots of information about tracking form buttons, Flash files and all sorts, but most of it was crowded with extra detail.
The line of code that tracks this event is quite simple; all you need to do is add it to your <a> or your <input type='button'>:
onClick="pageTracker._trackEvent(category, action, optional_label, optional_value)"
Category in this case was the general name for the event which happened to be ‘Buttons’.
Action in this case was what they’d done, so I named it ‘Click’.
Optional_Label in this case was a numerical indicator depending on which button was clicked.
Optional_Value I didn’t actually use, as I saw no point in assigning the buttons a value when each was equal to the others.
It should look something like this:
<a onclick="pageTracker._trackEvent('Buttons', 'Click', '1');" href="/page.htm">Link text here</a>
Sorted! Just log into your Google Analytics account, go to the website report -> Content -> Event Tracking and within a few days you’ll see some numbers 🙂
Need a hand? Just post a comment!