Unlocking Hidden Revenue – A/B Testing within Single Page Applications

If your organization is having trouble successfully running A/B tests in areas of your site where customers are going through the purchase flow, the issue may be due to Single Page Applications (SPAs) on your site. As customers move through the process, your A/B testing tool might not recognize their progress in an SPA environment.

There is enormous value in A/B testing critical areas of the web experience that are often operating in SPA environments, such as an eCommerce checkout. 

This guest blog post was written by Jason Boal, Principal Analyst & Optimization Strategist at 33 Sticks, a leading American analytics agency. Let’s address this common issue and uncover ways to overcome it so you can unlock hidden revenue on your site.

1. What is an SPA and how can I tell if the web experience uses one?

In a Single-Page Application (SPA) environment, content is loaded dynamically without requiring a full page refresh or reload. User interactions occur on a single page, with new content being loaded as the user navigates. Gmail is a prime example of an SPA. At a high level, an SPA functions similarly to a standard client-server interaction, but the key difference lies in what is returned to the browser: instead of a complete new HTML page, the server typically returns data or a fragment of markup that the client renders in place.
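To make the difference concrete, here is a minimal sketch (in TypeScript) of the SPA pattern: the client fetches a fragment of content, renders it into the existing page, and updates the URL with the History API, so no full page load ever happens. The endpoint and element IDs below are hypothetical and only there for illustration.

```typescript
// Minimal sketch of SPA-style navigation. The "/api/fragments" endpoint and
// the element IDs are hypothetical; the pattern (fetch + history.pushState,
// no full reload) is what distinguishes an SPA from a multi-page site.
async function navigateTo(path: string): Promise<void> {
  // Ask the server for a fragment of content instead of a whole page.
  const response = await fetch(`/api/fragments${path}`);
  const html = await response.text();

  // Swap the new content into the page that is already loaded...
  const container = document.getElementById("app");
  if (container) {
    container.innerHTML = html;
  }

  // ...and update the URL without triggering a browser reload.
  history.pushState({ path }, "", path);
}

// Clicking "continue to billing" renders new content, but the page never reloads.
document.querySelector("#go-to-billing")?.addEventListener("click", (event) => {
  event.preventDefault();
  void navigateTo("/checkout/billing");
});
```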

To determine if you are operating in a Single-Page Application (SPA) environment on your site, pay attention to whether the page reloads as you interact with it. If you see the page load indicator, such as the spinning icon in the browser tab in Chrome, the page is performing a full reload and you are likely using a traditional multi-page application (MPA).

Many websites are hybrids, meaning that only certain sections, like the checkout process, function as an SPA. To find out which parts of your site are SPAs, you can ask your development team for clarification.
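If you would rather check for yourself before asking, one quick diagnostic is to watch the History API from the browser console, since SPA sections typically navigate via pushState rather than full page loads. The snippet below is only a rough sketch of that idea.

```typescript
// Rough diagnostic sketch: log "virtual" navigations that happen without a
// full page load. Paste it into the browser console (as plain JavaScript)
// and click through the flow you are curious about, such as checkout.
const originalPushState = history.pushState.bind(history);

history.pushState = (state: unknown, unused: string, url?: string | URL | null) => {
  console.log("SPA navigation via pushState:", url);
  return originalPushState(state, unused, url);
};

window.addEventListener("popstate", () => {
  console.log("SPA navigation via back/forward:", location.pathname);
});

// If these messages appear while the browser's load indicator stays idle,
// that part of the site is behaving as an SPA.
```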

2. Does my testing tool work within SPA environments and what do I do if it doesn’t?

Visual editors are becoming extremely popular in the A/B testing space for many reasons. One is that marketers are developing and launching more tests themselves rather than relying on the development team. If your testing tool has a difficult time loading and testing content in the visual editor, the tool may not be equipped, or not be set up, to properly handle SPAs. This often happens in a secure checkout flow, where the customer is required to step through items like shipping address and billing. The page you are attempting to A/B test will not load properly in the visual editor, and you will receive an error message.

FIGURE 1 – VISUAL EDITOR SPA ERROR MESSAGE

Ask your vendor whether your testing tool can detect changes in the DOM and whether it has a mechanism to control when test variations are applied.

Here are two challenges that some A/B testing tools face (one generic way to address both is sketched after the list):

  1. Visual Editors: Some A/B testing tools rely on the initial page load to determine what content to modify. These tools may struggle when content needs to change without a page reload. For example, if your test content is on page 3 of your site’s checkout flow, which is an SPA, the tool might not detect the need to inject content changes because there are no page loads as users navigate through the checkout flow.
  2. Timing: As content on the page changes, it can be tricky for an A/B testing tool to insert test variations at the right moment. Variations can’t be applied before the content starts loading, but waiting until the content has fully loaded can result in users seeing the content change, a phenomenon known as “flicker.”
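One generic pattern that addresses both challenges is to watch the DOM for the element you want to test and apply the variation the instant it appears, rather than waiting for a page load event. This is offered only as an illustrative sketch, not a description of any particular vendor's implementation, and the selector and text change below are hypothetical.

```typescript
// Illustrative sketch: apply a variation as soon as the SPA renders the
// target element, without relying on a page load. The selector and the
// change are hypothetical examples.
function applyWhenPresent(selector: string, change: (el: HTMLElement) => void): void {
  const existing = document.querySelector<HTMLElement>(selector);
  if (existing) {
    change(existing);
    return;
  }

  // Watch for DOM changes the SPA makes as the user moves between steps.
  const observer = new MutationObserver(() => {
    const el = document.querySelector<HTMLElement>(selector);
    if (el) {
      change(el);            // apply the variation immediately to limit flicker
      observer.disconnect(); // stop observing once the change is in place
    }
  });

  observer.observe(document.body, { childList: true, subtree: true });
}

// Example: reword a shipping call-to-action on step 3 of a checkout SPA.
applyWhenPresent("#shipping-cta", (el) => {
  el.textContent = "Get free 2-day shipping";
});
```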

AB Tasty has extensive experience creating A/B tests in Single-Page Application (SPA) environments. We recommend implementing a delay on our tag’s execution so that it only triggers when the page is fully ready. This is achieved using a proprietary locking mechanism. This is just one example of how AB Tasty stands out in the A/B testing industry.
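The locking mechanism itself is proprietary, but the general idea of deferring execution until the application signals readiness can be sketched generically. In the hypothetical example below, the SPA dispatches a custom "app:route-ready" event once a step has finished rendering, and the test logic waits for that signal before touching the DOM; this is only an illustration of the concept, not AB Tasty's implementation.

```typescript
// Generic illustration (not AB Tasty's actual mechanism): hold back the test
// logic until the application announces that the current route is rendered.
// The "app:route-ready" event name and the banner element are hypothetical.
function whenRouteReady(): Promise<void> {
  return new Promise((resolve) => {
    document.addEventListener("app:route-ready", () => resolve(), { once: true });
  });
}

async function runExperiment(): Promise<void> {
  await whenRouteReady(); // the "lock" is released only once the page is ready

  // Safe to modify the DOM now.
  const banner = document.querySelector<HTMLElement>("#checkout-banner");
  if (banner) {
    banner.textContent = "Free returns on every order";
  }
}

void runExperiment();

// Somewhere in the SPA's router, after rendering a step:
// document.dispatchEvent(new CustomEvent("app:route-ready"));
```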

3. How do I take it to the next level?

Once you’ve unlocked A/B testing in SPAs, it is time to brainstorm testing ideas and develop a strategic roadmap to uncover ways to increase revenue for your organization. Here are a few ideas to help you jump-start that process!

  • Test various methods of updating cart quantity.
  • Test product detail page functions such as color variant selection methods.
  • Test buy box functions such as stock amount and store information.
  • Test different shipping messages based on cart value.
  • Test reordering flow steps.
  • Test navigation patterns or menu structures to optimize user flow within the SPA.
  • A/B test various UI/UX elements like buttons, forms, or interactive features specific to your SPA.
  • Test personalization strategies based on user behavior and interactions within the SPA.

Key Takeaways

  • There is enormous value in A/B testing critical areas of the web experience that are often operating in SPA environments, such as an eCommerce checkout flow. This is usually the last stage of any digital customer journey and vital to get right.
  • Determine whether your site leverages SPAs anywhere.
  • Dig into your testing tool to ensure it can properly load and test content changes within SPA environments.
  • Understand what other A/B testing tools are out there and how they handle SPAs.
  • Develop an optimization roadmap based on your new knowledge!



Failing Forward for Experimentation Success | Shiva Manjunath

Shiva Manjunath shares how debunking best practices, embracing failure, and fostering a culture of learning can elevate experimentation to new heights.

In this episode of The 1000 Experiments Club, guest host and AB Tasty’s Head of Growth Marketing UK, John Hughes, sat down with Shiva Manjunath, Senior Web Product Manager of CRO at Motive and Host of the podcast From A to B. Shiva’s journey through roles at Gartner, Norwegian Cruise Line, Speero, Edible, and now Motive has made him a passionate advocate for the transformative power of experimentation.

During their conversation, Shiva discussed the pitfalls of following “best practices” blindly, the importance of creating an environment where failure is seen as a step toward success, and how companies can truly build a culture of experimentation.

Here are some of the key takeaways.

The myth of ‘Best Practices’

Too often, so-called experimentation best practices become a checkbox exercise rather than a thoughtful strategy.

“If you’re focused on best practices, you’re likely missing the point of true optimization,” Shiva notes. 

He recounted a situation at Gartner where simplifying a form—typically hailed as a best practice—actually led to a sharp drop in conversions. His point? Understanding user motivation and context is far more important than relying on one-size-fits-all rules. It’s this deeper, more nuanced approach to experimentation that drives real results.

“If what you believe is this best practice checklist nonsense, all CRO is just a checklist of tasks to do on your site. And that’s so incorrect,” Shiva emphasized, urging practitioners to move beyond surface-level tactics and truly understand their audience.

Embracing failure in experimentation

A major theme of the discussion was the pivotal role failure plays in the journey to success. Shiva was candid about his early experiments, admitting that many didn’t go as planned. But these “failures” were crucial stepping stones in his development.

“My first ten tests were all terrible. They all sucked,” Shiva admitted, underscoring that even the most seasoned experts start with mistakes. He stressed that organizations must create an environment where employees can experiment freely, learn from their mistakes, and continue to improve.

“If you’re penalized for running a losing test, you’re not in a culture of experimentation,” Shiva insists.

Organizations that punish failure are stifling innovation. Instead, Shiva advocates for an environment where employees can test, learn, and iterate without fear. “The idea that you have the flexibility to discuss failures and focus on, ‘Well, I ran this test. It lost. Now, what do we do next?’—that’s a culture of experimentation.”

Scaling experimentation maturity

Shiva also explored the varying levels of experimentation maturity within organizations. Many companies claim to have a “culture of experimentation,” but few truly practice it at scale. Shiva emphasized the importance of making experimentation accessible to everyone in the organization, not just a select few.

Reflecting on the loss of Google Optimize, Shiva acknowledged its role as a gateway into the world of experimentation. “I got into experimentation through Google Optimize,” Shiva recalled, recognizing the tool’s importance in lowering the barrier to entry for newcomers. He urged companies to lower barriers to entry and enable more people to engage with experimentation, thereby fostering a more mature and widespread culture of testing.

The role of curiosity and data in experimentation

Another critical point Shiva raised was the importance of curiosity in experimentation. He believes that genuine curiosity drives the desire to ask “why” and dig deeper into user behavior, which is essential for effective experimentation.

“If you’re not genuinely curious about the why behind many things, I don’t know if experimentation is the field for you,” Shiva stated, underscoring curiosity as a crucial soft skill in the field.

Shiva also highlighted the foundational role of being data-driven in any experimentation strategy. However, he cautioned that having data isn’t enough—it must be effectively used to drive decisions.

“If you’re in a business setting and the business looks at your program and this is zero test wins, right? And then after two years, they would rightfully say ‘is this the way it’s supposed to go?’” Shiva remarked, pointing out that data-driven decisions are key to sustaining a culture of experimentation.

What else can you learn from our conversation with Shiva Manjunath?

  • Why it’s crucial to critically evaluate industry buzzwords and ensure they align with real practices.
  • How true personalization in experimentation goes beyond just adding a user’s name.
  • The need for thorough analysis to genuinely support data-driven decisions.
  • Shiva’s take on the future of experimentation after Google Optimize and how companies can adapt.

About Shiva Manjunath

Shiva Manjunath is the Senior Web Product Manager of CRO at Motive and Host of the podcast From A to B. His insatiable curiosity about user behavior and deep passion for digital marketing have made him a standout in the world of experimentation. With experience at top companies like Gartner, Norwegian Cruise Line, and Edible, Shiva is dedicated to demystifying CRO and pushing the boundaries of what’s possible in the field.

About 1,000 Experiments Club

The 1,000 Experiments Club is an AB Tasty-produced podcast hosted by Marylin Montoya, AB Tasty CMO. Join Marylin and the Marketing team as they sit down with the most knowledgeable experts in the world of experimentation to uncover their insights on what it takes to build and run successful experimentation programs.