This article arises from a question we were asked several times at the last trade fair we attended: why use A/B testing software you have to pay for rather than a free solution like Google Content Experiments (formerly Google Website Optimizer), which integrates natively with Google Analytics?

With the search engine already offering an excellent free web analytics tool, and one that is easy to get to grips with (at least on the surface), the A/B testing tool that accompanies it ought to be a must-have!

Of course, this is an assumption that is a little too easy to make if you have never tested and compared the two tools. The purpose of this article is therefore to do that for you, and to set out, based on the facts, the differences between the two tools.

Though the ultimate objective remains the same – helping website owners compare the performance of several versions of a page to improve their conversion rate (read the definition of an A/B test) – we are going to see that the methods and means of achieving that are very different. You will be able to judge for yourself how well each tool suits your A/B testing needs and circumstances.

Technical implementation

The first major differences appear as soon as you come to implement the two tools technically.

With Google Content Experiments you must install, for each new test, a test-specific tag on your original page – the page we call the control. In other words, you have to retag your pages at each new test. If you are able to work independently and you have access to your webpages’ code, this is not too much of a problem. But if this is not the case, you will need to ask your technical team or your service provider to make this alteration on a regular basis, and that’s without taking into account the need to remove the tags at the end of the tests. Maintaining the code for your own webpages will therefore require additional effort.

tag-google-content-experiment
Different Google Content Experiments tags to install for each new test

By contrast, with AB Tasty you have just a single tag to install across all your pages, and this is something you only have to do once. It is a single, generic tag that lets you launch all your future tests. Once the tag has been installed, you no longer need to worry about it, and your technical team will thank you for that: their involvement is reduced to next to nothing. This is of particular significance if you use an external service provider who does not necessarily respond within minutes or, worse still, charges you for every little intervention.

tag-ab-tasty
A single AB Tasty tag to use for all your tests
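To make the contrast concrete, here is a minimal sketch of the single-tag pattern such tools rely on: an asynchronous script loader placed once on every page. The vendor URL and account identifier are hypothetical illustrations, not AB Tasty's actual snippet.

```javascript
// Illustrative single-tag pattern only – not AB Tasty's real snippet.
// Placed once in the <head> of every page; the vendor URL and account ID are hypothetical.
(function () {
  var script = document.createElement('script');
  script.async = true; // load without blocking page rendering
  script.src = 'https://testing-vendor.example.com/tag/ACCOUNT_ID.js';
  document.head.appendChild(script);
})();
```

Because this loader never changes, setting up a new test does not require touching the page code again.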

Hold on a minute! The AB Tasty tag has to be placed on all the site’s pages, whereas with Google Content Experiments only the tested pages need to be tagged? Isn’t that just a way of getting us to pay more to use AB Tasty? Not at all. Our billing model is based on the number of visitors tested: whether a tested visitor views one page or ten makes no difference to your bill.

Defining the modifications

When you create a test with Google Content Experiments, one of the main steps involves specifying the URL of the page you want to test – the original page – and the URLs of the pages that will serve as variations. The tool will take care of redirecting your web visitors to one or other of these pages. As you will realize, this requires you to first create, develop and host these additional pages on your server yourself. Once again, your technical team will be heavily involved.

google-content-experiments-variations
Defining the URLs of the variations in Google Content Experiments

In the case of AB Tasty you do not need to develop new pages. The tag you have placed on your pages loads a JavaScript file that is executed when the page loads and applies – client side, that is, in the visitor’s browser – the modifications you require. You define these modifications yourself using our WYSIWYG (What You See Is What You Get) editor: a graphical interface that displays your page and lets you edit it very easily, however you like, with the help of drag and drop. You build your variation as if you were using PowerPoint or Keynote. You do not need to know HTML, CSS or JavaScript: everything is achieved through menus and predefined operations (e.g. modifying an image by uploading a new file). However, if you do know these languages, nothing stops you from using the advanced editing mode to code like a pro.

WYSIWYG editor to create your variations in AB Tasty
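To give an idea of what happens under the hood, here is a minimal sketch of the kind of client-side modification such an editor could generate; the selectors and replacement values are invented for illustration and are not code produced by AB Tasty.

```javascript
// Hypothetical client-side variation: swap a headline and a hero image once the DOM is ready.
// The selectors (#main-headline, .hero-image) and new values are illustrative only.
document.addEventListener('DOMContentLoaded', function () {
  var headline = document.querySelector('#main-headline');
  if (headline) {
    headline.textContent = 'Try our new offer today'; // variation copy
  }
  var hero = document.querySelector('.hero-image');
  if (hero) {
    hero.src = '/img/variation-hero.jpg'; // alternative visual
  }
});
```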

Here again we see the main difference between the two tools. Where Google Content Experiments requires considerable technical intervention and is therefore aimed more at technical specialists, AB Tasty is more targeted at marketing specialists who wish to work autonomously and make gains in terms of speed of implementation, without being dependent on their technical department’s schedules.

Note: AB Tasty also provides an operating mode similar to that of Google Content Experiments, in which you specify the URL of an additional page to serve as a variation. This is what we at AB Tasty call a redirection test. There is also nothing to prevent you from mixing the two operating modes within the same test (one variation created with the graphical editor and another that redirects to an existing URL).
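In practice, a redirection test simply sends the visitors assigned to a variation to another URL. A minimal client-side sketch, with hypothetical URLs and placeholder assignment logic, might look like this:

```javascript
// Hypothetical redirection-test sketch: visitors bucketed into the variation
// are sent to an alternative page that you have created and hosted yourself.
var assignedToVariation = Math.random() < 0.5; // placeholder assignment logic
if (assignedToVariation && window.location.pathname === '/landing-page') {
  window.location.replace('/landing-page-variation'); // hypothetical variation URL
}
```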

Controlling traffic allocation

Both tools allow you to configure the percentage of your traffic that will be subjected to the test. If you are not feeling confident, or you are concerned about the impact of your modifications, the test can be restricted to a portion of your traffic. With Google Content Experiments, the percentage is adjustable in predefined steps (1%, 5%, 10%, 25%, 50%, 75% and 100%), whereas with AB Tasty you are free to set whatever percentage you like.

Management of the traffic allocated to a test in Google Content Experiments
Possibility of managing the traffic allocated to each variation in AB Tasty

What is even more interesting, however, is that AB Tasty lets you allocate a different traffic percentage to each variation and change it during the test. You can therefore expose only a small share of your traffic to variation number one at the beginning of a test, then increase it later. In all cases, the results are calculated in a way that allows samples of different sizes to be compared. With Google Content Experiments you are obliged to allocate traffic equally to each variation.
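As an illustration of the principle (not of AB Tasty's internal code), unequal traffic allocation boils down to a weighted random draw whose result is stored, so a returning visitor keeps seeing the same variation. The weights, test identifier and storage mechanism below are assumptions made for the sketch.

```javascript
// Sketch of weighted traffic allocation; weights are assumed to sum to 100.
// The chosen variation is persisted so the visitor stays in the same bucket.
var allocation = [
  { name: 'original',    weight: 80 },
  { name: 'variation-1', weight: 20 } // can be raised later in the test
];

function assignVariation(testId) {
  var stored = localStorage.getItem('test-' + testId);
  if (stored) return stored; // visitor already bucketed

  var draw = Math.random() * 100;
  var cumulative = 0;
  var chosen = allocation[allocation.length - 1].name;
  for (var i = 0; i < allocation.length; i++) {
    cumulative += allocation[i].weight;
    if (draw < cumulative) { chosen = allocation[i].name; break; }
  }
  localStorage.setItem('test-' + testId, chosen);
  return chosen;
}

assignVariation('homepage-test'); // hypothetical test identifier
```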

Test scenarios

When you launch a test with Google Content Experiments, it targets all your web visitors, whoever they are. You cannot include or exclude particular visitor segments. However, in many situations you will want to build test scenarios that depend on different criteria, such as visitor origin, profile, behavior, or the device used (smartphone, tablet or desktop computer). This is where tools such as AB Tasty prove much more powerful, giving you the freedom to define very precise criteria that determine whether or not a web visitor is entered into a test. These criteria can even include CRM data (logged-in or non-logged-in users; customers or prospects; gender; age; etc.). The scenario possibilities are therefore multiplied.

ciblage-tests-ab
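To illustrate the idea, a targeting rule can be thought of as a predicate evaluated before a visitor is entered into a test. The criteria below (device type and a session cookie standing in for CRM data) are hypothetical examples, not AB Tasty's actual targeting engine.

```javascript
// Hypothetical targeting rule: only include mobile visitors who are not logged in.
// A real tool exposes these criteria through its interface rather than in code.
function isEligibleForTest() {
  var isMobile = /Mobi|Android/i.test(navigator.userAgent);       // rough device check
  var isLoggedIn = document.cookie.indexOf('session_id=') !== -1; // assumed session cookie
  return isMobile && !isLoggedIn;
}

if (isEligibleForTest()) {
  // enter the visitor into the test (e.g. assign a variation and apply its modifications)
}
```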

In the next part of the article we will endeavor to compare the two tools in terms of other criteria, in particular those relating to the definition of objectives and the interpretation of results.