
A Beginner’s Guide to A/B Testing your Emails

Email marketing is all about maximizing your open, click and response rates while generating as many leads and sales as possible for a given email campaign.

However, in an era of over-saturated inboxes, chances are your prospects won’t actually open your emails, simply because they receive so many.

On average, MailChimp estimates that open rates vary from 18% to 28% depending on the industry. While that’s not catastrophic, it still means that 72% to 82% of your emails will remain… unopened.

Let’s be honest: there is no single magic formula for crafting the perfect email. If there were, it would have spread across the internet and become overused within weeks.

The truth is, no one can really guess the perfect email campaign for your company – it will depend on a myriad of factors that we will cover later in this article.

As a consequence, the only way to design and write the most effective emails is to A/B test them.

Not just once, but many times.

By doing so, you’ll vastly increase your chances of uncovering the tactics that genuinely lift your open, click-through and response rates.

Through email A/B testing, you’ll also discover what actually resonates with your prospects and how best to address them.

Without further ado, let’s begin this guide by answering one simple question:

Why does email A/B testing matter?

Despite being one of the oldest online marketing channels, email marketing remains one of the top performing solutions to reach a broad audience and convert prospects into leads or clients.

More importantly, emailing is a marketing channel that is both:

  • Highly profitable
  • Often affordable

Return on investment of email compared to other channels
Sources: Neil Patel & EmailMarketingGold

As you can see, email marketing returns an average of $40 for every dollar spent, which is a massive improvement compared to display campaigns or banner ads for instance.

Knowing that email marketing is profitable, let’s see how email A/B testing will truly help your business:

It will improve your open and click-through rates

After a few A/B tests, your company should start to identify trends and common factors that lead to higher open and click-through rates.

This means that you will get more views but also more clicks to your website or online forms, which leads us to our second point.

It will increase conversions and generate revenues

Using marketing automation software, you will be able to analyze your funnel and traffic sources, which is crucial for identifying how many opened emails actually resulted in leads or sales.

Knowing that, you will get a precise estimation of your email marketing ROI, which is a good start to further increase conversions and revenues.

From there, it’s up to you to conduct additional tests on your email campaigns in order to generate more revenues.

You will know what works for your audience

As we said in our introduction, not all industries are identical when it comes to email statistics.

Moreover, your prospects most likely have specific needs and questions that must be addressed in a particular way, which few marketers manage on the first try.

After you’ve conducted a few conclusive tests, you’ll soon discover the key differentiating factors that drive the success of your future email marketing campaigns.

Using A/B tests, you’ll be able to craft tailor-made emails that will fit your prospects and generate more engagement.

You will save time and money

Although email marketing isn’t the most expensive online channel, it still costs a significant amount of money to send emails to a large audience and to create suitable visuals, landing pages and forms.

Using email A/B tests, you’ll save time and money by quickly identifying the recipe for success in your given industry and by implementing incremental changes that will lead to better results.

What elements should I A/B test first in my emails?

At this point, you’re probably wondering how to set up a proper email A/B test and start gaining insights on what works and what doesn’t.

In order to help you do so, we’ve prepared a list of the 8 most important elements that could lead to significant improvements for your email campaigns.

Ready?

Subject & Preheader

Subject lines and preheaders are the only touch points your readers see before an email is opened.

Therefore, they’re highly valuable items that require extensive attention despite their size.

Remember: your subject lines and preheaders will determine whether or not your emails get opened.

As a rule of thumb, keep subject lines to around 60 to 70 characters at most.

You could try to tweak several parameters for your subject lines (a simple list-splitting sketch follows below), including:

  • Word order (try reversing the order)
  • Tone (neutral, friendly, provocative)
  • Length (try shorter, try longer)
  • Personalization (try including their first name)
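
To compare two subject lines cleanly, each variant should go to a random half of the same list. Most email platforms handle this split for you; here is a minimal sketch of what it looks like if you manage sending yourself (the subjects and addresses below are invented):

```typescript
// A 50/50 subject-line split: shuffle the list, then cut it in half.
const subjects = {
  A: "Your weekly growth report is ready",
  B: "Don't miss this week's growth report",
};

function splitList<T>(recipients: T[]): { A: T[]; B: T[] } {
  // Fisher-Yates shuffle so the assignment is unbiased.
  const shuffled = [...recipients];
  for (let i = shuffled.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
  }
  const mid = Math.floor(shuffled.length / 2);
  return { A: shuffled.slice(0, mid), B: shuffled.slice(mid) };
}

const groups = splitList(["ana@example.com", "ben@example.com", "eve@example.com", "dan@example.com"]);
// Send subjects.A to groups.A and subjects.B to groups.B, then compare open rates.
```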

Preheaders are usually pulled from the first line of your email. But as your email marketing instincts sharpen, you could try writing intentional preheaders, which most email tools now support.

If you can create your own preheaders, try to write complementary information and add relevant words that could trigger your prospects’ curiosity.

Different days and hours

For various reasons, email campaigns don’t perform the same depending on when you send them.

For starters, you could try sending emails on different days of the week: GetResponse reports that Tuesdays get the best open rates compared to the rest of the week, although the gap is relatively small (19.9% on Tuesdays compared to 16.9% on Saturdays).

Because studies can be biased and cultural differences can change this data, it’s important that you try different days in order to find what works best for your company.

Likewise, studies from MailChimp and HubSpot tend to show an optimal sending time of around 10am to 11am.

Optimal sending time for your email campaigns
Source: MailChimp

Knowing this, you could try to adjust your campaign around different hours of the day just to see if one performs better than the others.

Length

The length of your email’s body can have a significant impact on your readers’ behavior, depending on what they have been used to.

With several studies all reporting serious decreases in our attention span, it may be worth deleting one or two paragraphs just to see if your email performs better.

One general piece of advice is to be straightforward and cut out the unnecessary, overused commercial taglines.

Of course, your emails’ ideal body length will mostly depend on your prospects’ expectations and your industry’s emailing practices.

In the fashion industry, the trend is moving towards flashy, punchy visuals with minimal copy that often features a very basic call-to-action.

On the contrary, B2B emails can purposely be long and feature bullet lists as well as multiple call-to-actions.

Visuals

Since our brains love visuals, adding engaging images to your emails can be a powerful way to generate more engagement from your readers.

Add engaging visuals to your emails campaigns
House of Fraser, source: PiktoChart

As with body length, visuals won’t be equally effective in every industry.

In fact, adding too many visuals can distract readers from the core message, which often leads to your call-to-actions being ignored.

If you want a clear idea of whether images suit your email marketing efforts, run a version A with no visuals (but the same subject line, body and CTAs) against a version B that contains them: you’ll see which one performs better.

Getting more personal

Adopting a friendlier, more casual tone and copy can often transform the way your readers perceive your email activities.

With most modern email tools, you can dynamically insert first and last names into your emails, creating a sense of personalization that most people appreciate.
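
Under the hood, this is simple template rendering. A rough illustration follows, where the {{first_name}} placeholder syntax is an assumption rather than any specific tool’s API:

```typescript
// Illustrative merge-field rendering: inject the recipient's first name
// into a subject line. The placeholder syntax is hypothetical.
interface Recipient {
  email: string;
  firstName: string;
}

function personalize(template: string, recipient: Recipient): string {
  return template.replaceAll("{{first_name}}", recipient.firstName);
}

const subject = personalize("{{first_name}}, your March offer is inside", {
  email: "jane@example.com",
  firstName: "Jane",
});
console.log(subject); // "Jane, your March offer is inside"
```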

The copy

While there is no secret recipe for perfect copy (it depends on your objectives), try running different versions through A/B tests while changing only the copy: this alone can move your conversion rate considerably.

If you’ve formulated different hypotheses about your readers’ expectations, create two versions of the copy based on the anticipated behaviors and send each to a random half of the same mailing list to see which one outperforms the other.

Call-to-actions & buttons

Whether they’re text links, images or buttons, your CTAs’ design and copy can seriously affect your readers’ likelihood of clicking them.

If you want to conduct in-depth CTAs A/B testing, try to compare different colors and formats to see if one stands out from the rest.

If that doesn’t deliver statistically significant results, you could try changing your value proposition, i.e. the offer behind your call-to-action.

The best practices for email A/B testing

Now that we’ve covered the main elements that can be tested through email A/B testing, let’s take a quick look at the 4 best practices to bear in mind before running your tests.

Having a goal in mind

Defining objectives prior to running any A/B tests is a massive time-saver for any marketer.

In fact, it’s important that we as marketers formulate hypotheses based on the data at our disposal.

  • You need to increase the open rate: focus mainly on your subject lines and preheaders, the two elements with the greatest influence on this metric.
  • You need to increase your click-through rate, downloads or subscriptions: to increase engagement, test all body-related content, such as the copy, the tone, the visuals and the call-to-actions, as each may trigger an increase in clicks, subscriptions or purchases.

One vs Multiple Variables Testing

When it comes to A/B testing, adding multiple variables in your tests means that you will need an ever-increasing sample size in order to get statistically relevant results.

Besides, comparing two versions that each differ in several ways makes it hard to draw useful conclusions, as you won’t know which element caused your key metric to rise or fall.

If you have a small sample size, our general advice is to test one variable at a time.
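
How small is too small? As a back-of-the-envelope check, the standard normal-approximation formula estimates the audience you need per variant to detect a given lift. A sketch at 95% confidence and 80% power; the 20% baseline and 23% target open rates are example values:

```typescript
// Rough sample size per variant for detecting a change between two rates,
// using the normal approximation (z = 1.96 for a two-sided 95% confidence
// level, z = 0.84 for 80% power).
function sampleSizePerVariant(p1: number, p2: number): number {
  const zAlpha = 1.96;
  const zBeta = 0.84;
  const pBar = (p1 + p2) / 2;
  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil((numerator / (p1 - p2)) ** 2);
}

// Detecting a lift from a 20% to a 23% open rate needs roughly
// 2,940 recipients in each variant.
console.log(sampleSizePerVariant(0.2, 0.23));
```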

However, there are cases where you will want to A/B test two completely different versions of your email: you can do so easily as the “winner” could be used for future benchmarks or as a template for your next A/B tests.

Testing at the same time vs testing at different times

Although you can absolutely A/B test your emails based on sending days and hours, try to avoid sending variants at different times: you won’t know if the changes were caused by the time or the email content.

Tracking results and building on your findings

Running email A/B tests makes no sense if you don’t actively track your campaigns’ results afterwards.

There are 4 main metrics you should track in order to measure success:

  • Open Rate
  • Click-through Rate
  • Response Rate
  • Subsequent Conversion Rate

For most campaigns, open rates and click-through rates will be your basic performance indicators, and you should track any notable change, be it positive or negative.

On certain campaigns (namely lead generation and ecommerce promotional offers), you’ll also want to actively track the conversion rate associated with your call-to-action.

Simply put, you should track the sales and completed forms on your website that your email campaigns generated, in order to measure your overall return on investment.

In these scenarios, you’ll be tracking real conversions instead of the number of opened emails which will provide you with much more tangible data for your marketing analysis.
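
Before declaring a winner on any of these metrics, it’s worth checking that the gap between variants is statistically meaningful rather than noise. A minimal two-proportion z-test sketch, with made-up counts:

```typescript
// Two-proportion z-test: is variant B's open rate significantly different
// from variant A's? The counts below are invented for illustration.
function zScore(openedA: number, sentA: number, openedB: number, sentB: number): number {
  const pA = openedA / sentA;
  const pB = openedB / sentB;
  const pooled = (openedA + openedB) / (sentA + sentB);
  const stdErr = Math.sqrt(pooled * (1 - pooled) * (1 / sentA + 1 / sentB));
  return (pB - pA) / stdErr;
}

const z = zScore(980, 5000, 1085, 5000); // 19.6% vs 21.7% open rate
// |z| > 1.96 means the difference is significant at the 95% level.
console.log(Math.abs(z) > 1.96 ? "significant" : "not significant");
```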

Did you like this article? Feel free to share and check out our other in-depth articles on how to optimize your website, ecommerce and digital marketing.


How to A/B Test Without Jeopardizing your SEO Efforts

A/B testing is an effective way to improve your site’s user experience and its ability to convert users to clients.

While changes made to your site may influence your users’ behavior, they are also seen by search engine crawlers, especially Google, which is perfectly capable of interpreting JavaScript, the scripting technology behind many A/B tests.

As A/B testing experts, we are often asked about the impact of A/B testing on our clients’ organic search rankings. If SEO is not taken into account, an A/B testing campaign can impact the visibility of the site, notably for tests based on URL redirects.

This post is a good opportunity to review A/B testing best practices for SEO and help you do what’s best when it comes to optimizing conversions, without jeopardizing your rankings and web traffic.

General SEO recommendations

To start, let’s review some general recommendations from Google.

Google completely accepts A/B testing and even encourages it if it’s geared towards improving user experience. Google also offers its own client-side A/B testing tool (Google Optimize) that uses JavaScript to manipulate the DOM (Document Object Model) to create page variations.

On its blog, Google shares rules to respect so that its algorithms do not penalize your site. The main rule concerns opening your test to the search engine’s robots, which must see the same version of your pages as your visitors.

So, one of the first best practices for SEO is to not exclude Google’s bot from your A/B tests. Even if your A/B testing solution offers some advanced user-targeting capabilities, like user-agent detection, do not use them to exclude Googlebot.

It is also recommended that you not show users pages that are too different from one another. For one, it will be harder to identify which element(s) had the greater impact on the conversion rate. Second, Google may consider the two versions to be different pages and interpret the test as a manipulation attempt. You may lose rankings or, in the worst case, see your site removed from the index entirely.

Depending on your objectives, the A/B testing setup may differ and each way of doing things can have an impact on SEO.

Best practices for A/B tests with URL redirects

A/B testing using URL redirects, also known as split testing, is one of these methods. Instead of using a WYSIWYG (What You See Is What You Get) editor to design your variation, you redirect users to a completely separate page, often hosted on your site, that has its own URL. Using this method is justified if you have a lot of changes to make on your page; for example, when you want to test a different design or another landing page concept.

This use case is the most error-prone and can have a dramatic impact on your search engine ranking, namely having your original page removed from the Google index and replaced by your variant page. To avoid this, remember the following points:

  • Never block Google’s bots via your site’s robots.txt file with the Disallow instruction, or by adding the noindex directive to your alternate pages. The former prevents bots from reading the content of the targeted pages, while the latter prevents them from adding those pages to Google’s index. It’s a common error, made because the site publisher is afraid the alternate version will appear in results. If you respect the following instructions, there is no reason for your alternate version to “rank” instead of your original version.
  • Place a canonical attribute (a link rel="canonical" tag) on the variant page pointing to the original page. This tells Google that the original page is the one it must take into account and offer to internet users. Search engine bots will understand that page B has no added value compared to page A, which is the only version to be indexed. If you’re testing across a set of pages (e.g. two product page formats across your catalog), you must set up this mapping for each page.
  • Redirect visitors via a 302 or JavaScript redirection, both of which Google interprets as temporary (see the sketch after this list). In other words, the search engine considers it a temporary modification of your site and does not modify its index accordingly.
  • When a redirect test is completed, you must put into production the changes that have been shown to be useful. The original page A is then modified to include the new elements that foster conversion. Page B, meanwhile, can either be redirected to page A with a 301 (permanent) or 302 (temporary, if the page will be used for other tests) redirection.
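
To make the redirect point concrete, here is a minimal server-side sketch using Express. The route, file paths and cookie name are assumptions, and a dedicated A/B testing tool would normally handle this for you; note that the variant page should also carry the canonical tag described above:

```typescript
import express from "express";

const app = express();

// Split test on a hypothetical /landing page: half of new visitors are
// temporarily (302) redirected to the variant, and a cookie pins each
// visitor to their assigned version for a week.
app.get("/landing", (req, res) => {
  const cookie = req.headers.cookie ?? "";
  const hasA = cookie.includes("ab_variant=A");
  const hasB = cookie.includes("ab_variant=B");
  const showB = hasB || (!hasA && Math.random() < 0.5);

  res.setHeader(
    "Set-Cookie",
    `ab_variant=${showB ? "B" : "A"}; Path=/; Max-Age=604800`
  );

  if (showB) {
    // 302 = temporary: Google keeps the original URL in its index.
    return res.redirect(302, "/landing-b");
  }
  res.sendFile("landing.html", { root: "public" });
});

app.listen(3000);
```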

Best practices for standard A/B tests

Applying a JavaScript overlay is by far the most common way to conduct A/B tests. In this case, your variants are nothing more than changes applied on the fly as the page loads in the user’s browser. The A/B testing solution manages the whole process, from translating the changes you made in a graphics editor into JavaScript, to collecting data, randomly assigning users to one of the variants and keeping that assignment consistent throughout the test. Your URLs do not change, and the changes only occur in the client browser (Chrome, Firefox, Internet Explorer, etc.).

This type of A/B test does not harm your SEO efforts. While Google is perfectly capable of understanding JavaScript, these changes will not be a problem as long as you don’t try to trick it by showing it initial content that is very different from what users see (a bare-bones example follows the checklist below). Therefore, make sure that:

  • The number of elements changed by the overlay is limited relative to the overall page, and the test does not overhaul the page’s structure or content.
  • The overlays do not delete or hide elements that are important for the page’s ranking and that strengthen its legitimacy in Google’s eyes (text areas, title, images, internal links, etc.).
  • The experiment runs only as long as necessary. Google knows that the time a test requires varies with how much traffic the tested page gets, but advises against running tests for an unnecessarily long time, as it may interpret this as an attempt to deceive, especially if you’re serving one variant to a large percentage of your users.
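
For reference, a bare-bones client-side overlay amounts to something like the sketch below. The element ID and copy are invented; real testing tools layer targeting, data collection and flicker management on top:

```typescript
type Variant = "A" | "B";

// Assign each visitor a variant once and persist it, so they see the
// same version on every visit.
function assignVariant(): Variant {
  const stored = localStorage.getItem("ab_variant");
  if (stored === "A" || stored === "B") return stored;
  const variant: Variant = Math.random() < 0.5 ? "A" : "B";
  localStorage.setItem("ab_variant", variant);
  return variant;
}

// Apply the overlay: a small, targeted change to the CTA wording that
// leaves the page's SEO-relevant structure and content untouched.
function applyVariant(variant: Variant): void {
  if (variant === "B") {
    const cta = document.querySelector<HTMLButtonElement>("#cta-button");
    if (cta) cta.textContent = "Start your free trial";
  }
}

applyVariant(assignVariant());
```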

Tips:
While it’s better to avoid heavy overlay changes on pages that generate organic traffic, you have complete freedom on pages that Google’s bots do not crawl or that have no SEO value (account or basket pages, checkout funnel pages, etc.). Don’t hesitate to test new optimizations on these pages, which are key to your conversion rate!

What about mobile SEO?

Using your A/B testing solution to improve the user journey on mobile devices is a use case that we sometimes encounter. This is a particularly sensitive point for SEO since Google is rolling out its Mobile First Indexing.

Until now, Google’s ranking algorithm relied primarily on the content of a site’s desktop version to position it in both desktop and mobile search results. With Mobile First Indexing, Google is reversing this logic: the search engine will now use the mobile version’s content as its ranking signal rather than the desktop version’s, regardless of the device used to search.

Therefore, it’s particularly important not to remove elements that are vital to SEO from your mobile navigation for UX reasons, such as page-top content that takes up too much space on a smartphone.

Can personalization impact your SEO?

Some A/B testing tools also offer user personalization capabilities. AB Tasty, for example, helps you boost user engagement via custom scenarios. Depending on your visitors’ profile or their journeys on your website, you can easily offer them messages or a personalized browsing experience that is more likely to help them convert.

Can these practices have an impact on your SEO? As with JavaScript-based A/B tests, the impact on SEO is limited, but some special cases should be taken into consideration.

For instance, highlighting customized content with an interstitial (pop-in) presents a challenge in terms of SEO, notably on mobile. Since January 2017, Google has considered such interstitials harmful to the user experience because they make the page’s content less accessible. Personalized interstitials must therefore be adjusted to Google’s expectations; otherwise, you risk seeing your site lose rankings and the resulting traffic.

Note that Google seems to tolerate legal interstitials that take up a majority of the screen (cookie information, age verification, etc.) for which there is no SEO impact.

To learn more, download your free copy of our A/B testing 101 ebook.