How to run A/B tests using Google Content Experiments

Overview

Google Content Experiments takes a slightly different approach from many standalone A/B and multivariate testing platforms: it uses an A/B/N model. With this approach, you're not testing just two versions of a page (as in A/B testing), and you're not testing combinations of elements (usually text, images, or button colours) on a single page (as in multivariate testing). Instead, you are testing up to 10 full versions of a single page.

With Content Experiments, you can:

  • Compare how different web pages perform using a random sample of your users
  • Define what percentage of your users are included in the experiment
  • Choose which objective you’d like to test
  • Get updates by email about how your experiment is performing

Benefits of Google Analytics integration

Google Content Experiments is a free service which is integrated with Google Analytics.

This allows you to run a set of experiments and compare the results using any existing goal or ecommerce metric that you have already set up in your Google Analytics account.

Furthermore, you can apply your experiment to all website traffic, or use the Google Analytics segment builder to define exactly which users you want to test. A simple example might be repositioning a newsletter signup but targeting the experiment only at first-time visitors.

Decide what to measure

Before running any experiment, it is important that you have the metrics in place to accurately measure what is currently working (or not) and the potential benefits of any variations.

For a landing page, you may wish to measure simple metrics such as bounce rate and clicks on the primary call to action. For content sites, you may wish to find the best placement for share buttons or a newsletter signup.
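
If a metric such as clicks on the call to action is not already tracked, it can be captured as a Google Analytics event and then used as an experiment objective. Below is a minimal sketch using the analytics.js event syntax; the element ID and the event category, action and label are illustrative, not part of Google's setup:

    // Assumes the standard analytics.js tracking snippet is already on the page.
    // The element ID ('primary-cta') and the event names below are illustrative.
    document.getElementById('primary-cta').addEventListener('click', function () {
      ga('send', 'event', 'Landing Page', 'click', 'primary-cta');
    });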

For an ecommerce site, it is a bit more complicated. You may have a number of hypotheses to test around product pages, related products and button placement, but the best measure is still the average order value per session, as that is ultimately what drives revenue.
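
If transactions are not already being recorded, the classic analytics.js ecommerce plugin is one way to capture order values so that revenue-based objectives become available. A minimal sketch, with illustrative order data:

    // Assumes the standard analytics.js tracking snippet is already on the page.
    // The order ID and revenue figure are illustrative.
    ga('require', 'ecommerce');           // load the ecommerce plugin
    ga('ecommerce:addTransaction', {
      id: '1234',                         // unique order ID
      revenue: '84.50'                    // total order value
    });
    ga('ecommerce:send');                 // submit the transaction to Google Analytics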

How to set up the experiment

To create a new experiment, open Google Analytics, navigate to the Behaviour section and click on the Experiments link. This page will show any existing experiments.

Click on the “Create experiment” button. The setup screen will ask you for the following information:

  • Name for this experiment
  • Objective for this experiment. This can be any existing goal (including ecommerce metrics) for which you want to improve conversion.
  • Percentage of traffic to include in the experiment. The higher the percentage of traffic you select (up to 100%), the sooner you will see statistically significant results.
  • Email notifications. Enabling these is highly recommended.
  • Distribute traffic evenly across all variations. If you enable this option, Google will assign an equal amount of traffic to each variation for the life of the experiment. When this option is disabled (recommended), Content Experiments will instead dynamically adjust the traffic each variation receives based on how it performs. This approach is known as a ‘multi-armed bandit’ experiment and is discussed further below.
  • Set a minimum time the experiment will run. We suggest running experiments for a minimum of two weeks to account for different behaviour patterns on weekends and weekdays.
  • Set a confidence threshold. The higher the threshold, the more statistically reliable the result; however, this also means the experiment will take longer to finish.

On the following page, you will be asked to define your variations.

Google expects you to use a different URL for each page variation, for example “/page-a” vs. “/page-b”. This is probably the simplest way to set up an experiment, and it is most useful when testing landing pages or category pages that can quickly be duplicated and edited within a CMS.

With other pages, such as ecommerce product pages, you will need to involve your development team to define an extra URL parameter that triggers the different layouts for the page template, for example “/page#version=a” vs. “/page#version=b”.
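
As a rough illustration of how such a parameter might drive the template, the page's JavaScript could read the fragment and reveal the matching layout; the element IDs here are hypothetical and would depend on your own templates:

    // Hypothetical sketch: read the variation from the URL fragment,
    // e.g. "/page#version=b", falling back to the original layout.
    var match = window.location.hash.match(/version=(\w+)/);
    var version = match ? match[1] : 'a';
    // Assumes each layout is rendered but hidden by default in CSS.
    document.getElementById('layout-' + version).style.display = 'block';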

Finally, once you have entered the URLs for your different variations, Google will generate a JavaScript snippet to run the experiment, which will need to be placed on the relevant pages (or inserted via Google Tag Manager).
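
Alternatively, for client-side setups Google documents a Content Experiments JavaScript API (cxApi) that assigns the current visitor to a variation. A minimal sketch; “YOUR_EXPERIMENT_ID” is a placeholder for the ID Google generates, and the CSS class hook is illustrative:

    // Assumes the Content Experiments API script has been loaded first, e.g.
    // <script src="//www.google-analytics.com/cx/api.js?experiment=YOUR_EXPERIMENT_ID"></script>
    var variation = cxApi.chooseVariation(); // 0 = original, 1+ = variations
    // Illustrative hook: let CSS switch the layout for the chosen variation.
    document.body.className += ' variation-' + variation;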

You can find detailed instructions in the Analytics help center.

Understanding ‘multi-armed bandit’ experiments

By default, Google uses what is termed a ‘multi-armed bandit’ approach to allocating traffic across variations. This uses statistical modelling to reduce the amount of time that an experiment needs to run before results are determined.

With the ‘multi-armed bandit’ approach, the Google algorithm reviews the experiment every 12 hours and automatically adjusts the amount of traffic each variation receives. Once there is a large enough set of data, a variation that appears to be doing well is given more traffic, and a variation that is clearly under-performing is given less.

This makes the experiment more efficient, because it incrementally pushes traffic towards potentially winning variations instead of waiting for the “final answer” at the end of the experiment. It also makes the experiment faster, because traffic that would have gone to under-performing variations is instead assigned to potential winners, allowing a conclusion to be reached sooner.
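
To illustrate the underlying idea (a deliberately simplified sketch, not Google's actual algorithm), an ‘epsilon-greedy’ bandit mostly sends visitors to the best-performing variation so far while still exploring the others a fraction of the time:

    // Simplified epsilon-greedy bandit (illustrative only).
    // Conversion counts per variation (index 0 = original) are made up.
    var stats = [
      { conversions: 50, visitors: 1000 },
      { conversions: 65, visitors: 1000 },
      { conversions: 40, visitors: 1000 }
    ];

    function chooseVariation(epsilon) {
      if (Math.random() < epsilon) {
        // Explore: occasionally pick any variation at random.
        return Math.floor(Math.random() * stats.length);
      }
      // Exploit: otherwise pick the best observed conversion rate.
      var best = 0;
      for (var i = 1; i < stats.length; i++) {
        if (stats[i].conversions / stats[i].visitors >
            stats[best].conversions / stats[best].visitors) {
          best = i;
        }
      }
      return best;
    }

    var next = chooseVariation(0.1); // explore 10% of the time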

You can find out more in the Analytics help center.

A real world example: Graham & Brown

On the ecommerce website for Graham & Brown, we recently ran a simple experiment to establish the best position for a “customers also bought” block shown across all product pages.

With a new recommendation algorithm in place, we knew we could do a better job of getting users to view the set of related products, whether complementary products, design variations or similar styles.

The aim was to see if we could increase the average order value per session, leading to a sales increase. As such, we wanted to know which potential page layout would generate the most clicks on the recommended products.

First, we defined a metric (and the tracking code) to capture the number of clicks on the set of recommended products, before testing three variations against the original page layout.

Variation 3 was the clear winner: this placement resulted in twice the number of views of related products (10% of users) compared with the original position (5% of users).

We also measured the same variations against basket value per session (pausing all other experiments during this time) to ensure the change generated a corresponding increase in the average basket value per session.

Takeaways

With Google Content Experiments, and a little effort, it becomes possible to measure and increase user engagement and conversion.

Because it is based on JavaScript, and is therefore a relatively unobtrusive technology, Google Content Experiments allows ideas and hypotheses to be tested quickly against a sample of users before a winning variation is incorporated into the site permanently.

A/B testing forms just part of the business strategy and digital optimisation services that we provide to help companies improve their conversion rates and increase revenue.
