When you hear ‘A/B testing’, do you think revenue gain? David Mannheim of User Conversion argues that you probably do – and shouldn’t.
AB TASTY BLOG
Statistical significance is a widely used concept in statistical hypothesis testing. It indicates how unlikely it is that an observed difference or relationship between a variation and a control arose by chance alone.
Even though hypothesis tests are meant to be reliable, two types of error can still occur: Type I (a false positive) and Type II (a false negative). Learn more.
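To make the two teasers above concrete, here is a minimal sketch of a pooled two-proportion z-test in Python — a standard illustration of statistical significance, not AB Tasty's actual methodology, and the conversion numbers are invented:

```python
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates,
    using a pooled two-proportion z-test."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    # Phi(z) = 0.5 * erfc(-z / sqrt(2)), so the two-sided tail area is erfc(|z| / sqrt(2))
    return math.erfc(abs(z) / math.sqrt(2))

# Hypothetical campaign: 200/1000 conversions on control, 250/1000 on the variation
p = two_proportion_p_value(200, 1000, 250, 1000)
print(f"p-value: {p:.4f}")
print("significant" if p < 0.05 else "not significant")  # well below 0.05 here
```

A Type I error in this framing is declaring "significant" when there is no real difference; a Type II error is failing to declare it when there is one.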
A/B testing has been around for decades, even before the advent of the internet or social media. In fact, it … Read more
Headline testing consists of creating several title variations for the same article in order to find out which one performs best in terms of shares, average reading time, click-through rate, bounce rate, and more.
By using A/B tests in your email campaigns, you can craft tailor-made emails that fit your prospects and generate more engagement. Learn more.
Learn how to effectively run an A/B test without ruining your traffic acquisition efforts and impacting your organic search rankings.
Follow this step-by-step guide to learn how to run an A/B test on your landing page, with tips from CRO experts.
It’s not always easy to know which kind of test to use to optimize your website. This quick read will go over the pros and cons of multivariate testing, so you can decide if it’s the best pick for you.
Analytics and A/B testing: It’s a match made in data heaven. Just like Tom and Jerry, Simon and Garfunkel, or … Read more
Not many people have heard of A/A testing, but those who have know it can spark a lot of debate: is it a useful best practice or a waste of time? This article takes a look.
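To give the A/A debate some substance, the short simulation below (an illustration only, not AB Tasty's tooling) runs many A/A tests where both arms share the same true conversion rate. Roughly 5% of them come out "significant" at the 0.05 threshold — exactly the false-positive rate that threshold implies:

```python
import math
import random

def two_sided_p(conv_a, n_a, conv_b, n_b):
    """Pooled two-proportion z-test, two-sided p-value."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (conv_b / n_b - conv_a / n_a) / se
    return math.erfc(abs(z) / math.sqrt(2))

random.seed(7)
TRUE_RATE, VISITORS, RUNS = 0.10, 2000, 400   # identical arms: a pure A/A setup

false_positives = 0
for _ in range(RUNS):
    conv_a = sum(random.random() < TRUE_RATE for _ in range(VISITORS))
    conv_b = sum(random.random() < TRUE_RATE for _ in range(VISITORS))
    if two_sided_p(conv_a, VISITORS, conv_b, VISITORS) < 0.05:
        false_positives += 1

print(f"'significant' A/A tests: {false_positives / RUNS:.1%}")  # hovers around 5%
```

This is one argument for A/A testing as a sanity check: if your platform flags far more than ~5% of A/A tests as winners, something in the setup is off.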
Establishing insightful, effective A/B test hypotheses is half the battle of conversion rate optimization. We’ll show you how.
We’re enriching our conversion rate optimization platform with a server-side A/B testing solution! Now you can get even more creative with your A/B tests. Read our article to see how.
Flickering, also called FOOC (Flash of Original Content), is when an original page is briefly displayed before the alternative appears during testing. The good news is that there are several best practices to effectively mask the flickering effect.
Note: This article was written by Hubert Wassner, Chief Data Scientist at AB Tasty. Some of you may have noticed … Read more
This is my personal response to a study done by ConversionXL. Rémi Aubert, CEO at AB Tasty. Context On May 18th, … Read more
How exactly can you optimize your conversions with A/B Testing? Learn lessons from companies that have emerged as shining examples of A/B testing genius.
In 2014, retail e-commerce sales during the Christmas holiday season accounted for 23.4% of total annual retail e-commerce revenue and 50% … Read more
With over five years’ experience, AB Tasty’s teams have witnessed the implementation of thousands of tests, optimization ideas and ways … Read more
In digital marketing, you should always aim for the best user experience. Don’t put up with the status quo – every … Read more
Companies often opt for one or the other of these tests without realizing the benefits of using them together. Given … Read more
How long should an A/B test run before you can draw conclusions from it? At what point can you end a test that appears to be yielding results?
The independent business software review site TrustRadius today publishes its very first Buyer’s Guide to A/B Testing Software, in which … Read more
If you’re familiar with A/B testing, you know that you should base it on data. However, this quantitative statistical data is just one part of the equation. In order to get a true understanding of the behaviour of your users and choose the most relevant tests to set up, you must use qualitative data.