This article follows on from the post dedicated to the differences between Google Content Experiments and more sophisticated A/B testing tools such as AB Tasty (what is A/B testing?). Having dealt with the entirely different approaches adopted by the two tools, as much with respect to the way tests are created as to the potential test scenarios, it is now time to take a look at aspects relating to the tracking of objectives and analysis of the results.
The convenience of using Google Content Experiments stems from the fact that it integrates completely with Google Analytics. This means that selecting the objective to monitor – the one by which you will compare performance differences between variations – only requires one or two clicks. All you need to do is choose, from a drop-down menu, an existing objective from amongst those defined at initial configuration of Google Analytics. There is, therefore, a strong chance that you will not need to carry out any configuration. It could hardly be simpler. If, however, you have not configured your objectives, Google Content Experiments will invite you to do so. The objective you define will subsequently become available globally in Google Analytics’ interface. The integration is seamless.
Unfortunately, Google Content Experiments will only let you choose a single objective. It is not possible for you to specify several indicators to monitor. However – and even though it is always recommended to define a primary KPI that will allow you to decide between your variations – it is often interesting to compare the variations using other indicators.
The first reason for this stems from the fact that, frequently, an A/B test does not have a direct impact on your primary objective (we also talk in terms of ‘macro-conversion’).
Cases where changing the color of a button has a positive impact on your global conversion rate (number of buyers/number of visitors) are quite rare, except when you are starting from a low baseline and your site has major ergonomic problems, such as a hard-to-spot action button.
But even if your modification does not have an impact on this indicator, it may have a positive effect on another indicator, such as your add-to-cart rate. This is what we call a micro-conversion: a first step, undeniably a less significant one, on the path towards macro-conversion. How can you measure this effect if you do not monitor this KPI as part of your test?
The second reason for using other indicators stems from the fact that sooner or later you will want to know the impact of your modifications on different indicators in order to better analyze your tests’ results and arrive at your decisions with full knowledge of the facts. If your only criterion for comparison is the add-to-cart rate, as in our preceding example, where does that leave the view rate for the web visitor’s cart summary page, a page that very often contains the famous “Confirm my order” button, which enables the visitor to enter the conversion funnel? Your modifications have produced an improvement in the add-to-cart rate – that’s a good start – but have they, at the same time, negatively affected this second, equally important indicator? With just one objective to track, you do not have all the information required to draw conclusions from your tests. Yet the ultimate aim of A/B testing is precisely that of helping you to compare and contrast and to make your decisions.
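To make this concrete, here is a minimal Python sketch of the kind of side-by-side comparison described above, with all figures invented for illustration: comparing variations on several indicators at once, so that a gain on one KPI cannot hide a loss on another.

```python
# Illustrative figures only: comparing variations on several indicators
# at once, so a gain on one KPI cannot hide a loss on another.
results = {
    "original":  {"visits": 8000, "add_to_cart": 960, "cart_page_views": 880},
    "variation": {"visits": 8000, "add_to_cart": 1040, "cart_page_views": 840},
}

rates = {
    name: {
        "add_to_cart_rate": r["add_to_cart"] / r["visits"],
        "cart_view_rate": r["cart_page_views"] / r["visits"],
    }
    for name, r in results.items()
}
# Here the variation lifts the add-to-cart rate (12% -> 13%) but
# lowers the cart summary view rate (11% -> 10.5%): with a single
# objective, that regression would go unnoticed.
```

With only the first indicator tracked, the variation looks like a clear winner; the second indicator tells a more nuanced story.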
Most A/B testing solutions of the AB Tasty kind, therefore, allow you to specify several indicators to monitor, which you then use to compare your variations.
Configuring these indicators is just as simple as with Google Analytics, and very often there are four types of objective available:
- Page-viewed objective: The web visitor is considered converted once they view a particular page (e.g. /thank-you-for-your-registration.php)
- Event objective: A conversion is recorded when the web visitor carries out an action such as clicking on a button, watching a video, downloading a document, etc.
- Engagement objective: You wish to compare your variations based on indicators such as bounce rate, time spent on the site, the number of pages viewed, etc.
- Transaction objective: The web visitor must complete a transaction for this type of conversion to be recorded. A page-viewed type objective can be used when you are only interested in the number of transactions (e.g. thank-you-for-your-purchase.php). However, A/B testing solutions offer more advanced implementations (the addition of a specific tag to the order confirmation page) that allow you to collect data associated with these transactions and compare your variations using other indicators (e.g. average cart value, visit value, etc.).
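The extra indicators that a transaction objective unlocks can all be derived from the raw data the conversion tag collects. A minimal Python sketch of that derivation (the function name and sample figures are hypothetical, not part of any tool's actual API):

```python
# Sketch: indicators derivable from a transaction objective's raw data.
# All figures are illustrative, not taken from a real test.

def transaction_indicators(visits, order_amounts):
    """Compute transaction-based KPIs for one variation.

    visits: number of visits exposed to the variation
    order_amounts: list of order values recorded by the conversion tag
    """
    n_orders = len(order_amounts)
    revenue = sum(order_amounts)
    return {
        "conversion_rate": n_orders / visits,            # buyers / visitors
        "average_cart": revenue / n_orders if n_orders else 0.0,
        "visit_value": revenue / visits,                 # revenue per visit
    }

original = transaction_indicators(1000, [55.0, 72.5, 40.0, 99.9])
variation = transaction_indicators(1000, [60.0, 81.0, 45.5, 120.0, 38.5])
```

The point is that once order values are collected per variation, indicators such as average cart value and visit value fall out of the same data with no further instrumentation.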
This last feature means these solutions can compete with Google’s product and its promise of integration with Google Analytics. One of the advantages of Google Content Experiments is essentially its use of the E-commerce implementation already in place on your site to feed this information into your reporting. With E-commerce-integrated A/B testing solutions, you can achieve the same results without too much effort – indeed with no effort at all if you use AB Tasty in conjunction with an E-commerce solution such as Prestashop or Magento. Plug-ins are available for these platforms; they take care of adding the AB Tasty conversion tag to the correct pages, with the correct information.
Similarly, with AB Tasty it is possible to send additional information to the tag in order to feed any type of data back to your reporting (e.g. sociodemographic data, data from your CRM/back-office such as a segmentation specific to your company, etc.). You can then use this information for various purposes, be that targeting your tests, filtering the results they produce, etc. This last possibility increases your opportunities for analysis whilst at the same time providing you with valuable marketing insights. If you identify that a message conveyed by one of your variations works better than the original with respect to a sub-segment of your audience, you can make use of this information to refine your strategies with respect to traffic acquisition, merchandising, promotional activities, or even communications. This is information of high added value that can be reused by other departments in your company.
The cherry on the cake? You can also link your tests directly to… Google Analytics. Once the integration has been established – it just involves activating a simple option in the AB Tasty interface – you will be able, directly from your favorite web analytics tool’s interface, to add an additional segmentation dimension to your indicators that corresponds to the variation the web visitors have been subjected to. This works with all the indicators fed into the Google Analytics interface: visits, pages viewed, objectives, conversion rate per objective, transactions, E-commerce conversion rate, etc. You can also add this dimension to customized reports if you need to develop more complex reporting.
The only limit to this integration is that no statistical significance indicators are displayed in Google Analytics (are the differences observed down to chance, or are they significant relative to the sample size?). You will need to reprocess the raw data and use it to calculate these indicators yourself. By contrast, Google Content Experiments and AB Tasty provide these indicators natively so that they can be rapidly accessed and results are easier to interpret.
This integration with Google Analytics thus provides a complementary operating mode which, though it can meet certain needs, does not replace true A/B test reporting as provided by AB Tasty, with its multiple objectives, its result filtering possibilities and its statistical reliability indicators.
And that draws our review of the two tools to a close. Google Content Experiments offers a free solution that is completely integrated with Google Analytics but aimed at people who can easily edit their pages’ source code and who want to carry out simple tests. The absence of targeting for these tests is a real limitation for advanced users, and the objective-tracking possibilities prove too limited when users want to analyze their test results in detail for decision-making purposes. These are all points addressed by a complete A/B testing solution like AB Tasty.
To discover the differences between other A/B testing tools, please request a personal demonstration of AB Tasty. One of our consultants will guide you through the tool’s different features.