
An essential 6-step guide to mobile A/B testing

Mobile browsing isn’t just for on-the-go users anymore. People often pick up their phones or tablets to shop for mattresses or subscribe to accounting software, even if their desktop device is nearby.

To provide a smooth user experience (UX), mobile A/B testing is a must. By testing two different versions of your product or site, you learn which features work—and which need work.

Last updated: 28 Jul 2023

Navigating the mobile A/B testing process can be challenging, as it’s easy to get overwhelmed with the sheer number of elements you could change or the various ways to set up your test. 

This guide walks you through a six-step process to conduct effective mobile A/B tests. Learn how to optimize your site or product to create a seamless UX, increase user engagement, and boost conversion rates.

TL;DR

In an increasingly mobile-centric world, designing a smooth UX is essential. Mobile A/B testing gives you the data you need to build empathy with your users and improve your design. 

Follow these six steps to run mobile testing with confidence: 

  1. Review your data: look at mobile-specific performance metrics and qualitative data to find ways to improve your UX

  2. Create your hypothesis: craft a single sentence with the problem, potential solution, and predicted outcome to frame your experiment

  3. Design your variant and test: gather or design any required creative elements for the alternate version, and set up a test in your A/B testing platform of choice

  4. Segment your audience: apply mobile filters to look specifically at data for users on phones or tablets, and then layer in additional filters relevant to your hypothesis

  5. Launch—and complete—the test: launch your test, running it until you’ve gathered the necessary sample size and achieved an appropriate test duration

  6. Analyze your results: evaluate how each design performed and return to your hypothesis. Then, consider the next steps for improving your website or product.

Conduct mobile A/B testing with ease

Use Hotjar’s tools to see how users experience different versions of your product on mobile.

6 steps to effective mobile A/B testing 

Mobile usage has exploded in recent years, dominating other traffic sources. So much so that, in June 2023, the global mobile market share of web traffic was more than double that of desktop.

Mobile A/B testing is no longer a nice-to-have—it’s a necessity.  

Desktop and mobile aren’t just devices with different screen sizes. Users approach each with a distinct mindset, and their experience needs to reflect that. They might pop onto an ecommerce site while waiting for their kid’s piano lesson to end, or check their email while waiting in line for their coffee. To avoid frustrating users, you need an easy and intuitive mobile experience.

Increasingly, optimization is moving ‘mobile only’. It’s been ‘mobile-first’ for a while. That means doing usability testing on mobile, ideating and designing for mobile, and reviewing test previews on mobile.

Johann Van Tonder

That means you need to design and test specifically for mobile. Not sure where to start? 

Read on for our six-step process. 

1. Review your data

Mobile A/B testing is a way to seek out the rough edges in your design, spot problems, and figure out how to fix them.

Do a deep dive into your mobile performance metrics, heatmaps, session recordings, and user feedback.

Access all of this data from one centralized location—your Hotjar Dashboard—to see what is happening (through quantitative or numerical data) and why (through qualitative or text-based data). 

Make sure to apply the ‘mobile’ filter to home in on device-specific issues. If you spot one, Hotjar lets you see relevant heatmaps (visual representations of how the average user views and interacts with your site) and recordings (which show individual visitors’ taps, swipes, and scrolls).

💡 Pro tip: filter your recordings by traffic channel to dig deeper into user behavior.

Hotjar’s traffic channel filter lets you see how mobile users from different channels behave on your site. For example, a visitor who arrived from paid social ads may experience your site differently than one who clicked a call-to-action (CTA) button in a promotional email. If you find an issue that hinders conversions for a specific group, it may be a worthy contender for an upcoming experiment.

The traffic channel filter in Hotjar Recordings shows you how users from different channels experience your site

2. Create your hypothesis

Once you’ve identified problem spots on mobile, choose one to prioritize and create a hypothesis for your experiment. A strong hypothesis outlines the problem, solution, and predicted result—typically in that order.

Let’s say you review data for your product’s landing page on mobile. You see the following: 

  • Average time on page is up, but conversions are down

  • Clusters of angry face icons on your rage click maps appear on non-clickable descriptions of product features

  • Your Engagement Zones heatmap—which combines aggregate click, move, and scroll data to show you the most engaging parts of your site—reveals that the FAQ section at the bottom of your page is getting tons of attention

An Engagement Zones heatmap reveals the parts of the page users engage with the most, giving you more data to inform your hypothesis

Your takeaway could be that users need more information before they buy. Then, you’d create a problem-solution-result hypothesis like: 

If users need more information before they buy…then we should move the FAQ section higher up the page…so that at least 10% more visitors convert.

Hotjar saves us and our clients a lot of time. It’s such a convenient and functional tool—it gives us the reasonable, user-backed foundations for our experiment hypotheses, and lets us demonstrate how effective we’ve been through our redesigns at producing KPI improvements.

Dmytro Kukuruza
CEO, Turum-burum

3. Design your variant and test

The most time-intensive part of this process is gathering your assets. Depending on your hypothesis, you may need:

  • A wireframe from your UX design team

  • Code from your dev team

  • Copy from copywriters

  • Custom illustrations from your graphic design team

If you’re only testing button sizes or rearranging existing page elements, you’re ready to set up your experiment.

The good news is that the same A/B testing tools that offer desktop testing also let you design mobile tests. (If you don’t have one yet, consider options like Omniconvert, Kameleoon, or Unbounce—all of which integrate easily with Hotjar. 🙌)

When building your test, consider how to distribute traffic to the two variants. If you run two brand-new homepage options side by side, you may opt to send an equal percentage of traffic to each. If you test an existing page against a new variant, consider diverting a smaller percentage of traffic, such as 25% or 30%, to the new option. 
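Your testing platform handles this allocation for you, but the underlying mechanic is worth understanding. Here’s a minimal Python sketch (the function name and values are illustrative, not any platform’s actual API) of deterministic bucketing: each user ID is hashed so the same visitor always sees the same variant.

```python
import hashlib

def assign_variant(user_id: str, new_variant_share: float = 0.25) -> str:
    """Deterministically assign a visitor to a variant.

    new_variant_share is the fraction of traffic diverted to the new
    design, e.g. 0.25 sends 25% of visitors to variant B.
    """
    # Hash the user ID into a stable number between 0 and 1
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    return "variant_b" if bucket <= new_variant_share else "control"

# The same visitor always lands in the same bucket across sessions
print(assign_variant("visitor-1042"))  # always returns the same variant
```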

4. Segment your audience

The next step is deciding who sees each variant. 

You already know you need to segment by mobile. For best results, you also want to target other specific segments of users relevant to the problem you called out in your hypothesis. 

For example, say the issue is that new users abandon your mobile checkout flow much faster than existing customers. For your mobile A/B test, you’d need to target this particular segment. (An A/B testing platform makes this task simple.)

Also consider segmenting by:

  • Demographics 

  • Average order value (AOV), i.e. how much a customer typically spends on a transaction

  • Subscriber plan or tier type

  • Length of subscription 

  • User operating system, like Android or iOS 

  • Length of time since last visit

Different user groups have their own unique needs on your mobile site. Segmenting lets you collect the precise data you need to improve the experience for each group.
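For the technically curious, a segment boils down to a filter that decides which visitors enter the experiment. Here’s a small, hypothetical Python sketch (the session fields are invented for illustration; your testing tool defines its own):

```python
from dataclasses import dataclass

@dataclass
class Session:
    device: str        # 'mobile', 'tablet', or 'desktop'
    os: str            # 'android', 'ios', ...
    is_new_user: bool  # first visit vs. returning customer

def in_test_segment(session: Session) -> bool:
    """Only new mobile/tablet users enter this hypothetical checkout test."""
    return session.device in ("mobile", "tablet") and session.is_new_user

sessions = [
    Session("mobile", "ios", True),
    Session("desktop", "windows", True),
    Session("tablet", "android", False),
]
print(sum(in_test_segment(s) for s in sessions))  # 1 eligible visitor
```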

Define a segment and even context. Don’t expect all users to have the same needs and behavior.

Johann Van Tonder

5. Launch—and complete—the test

Once you hit launch on your mobile A/B test, you’re on your way to getting answers. But you still have two more aspects to consider: sample size and test duration. In other words, when do you end your experiment?

Unless you have a background in statistics, calculating a statistically significant sample size (the number of visitors you need to run a test) is tricky. At the same time, you don’t want to simply pull a number out of thin air. That’s why most A/B testing tools include a test length calculator like this one.
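If you want to see the math those calculators typically run, here’s a rough Python sketch based on the standard two-proportion sample size formula (a simplification; real calculators may account for more factors):

```python
from statistics import NormalDist
import math

def sample_size_per_variant(p1: float, p2: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate visitors needed per variant to detect a p1 -> p2 change.

    p1: baseline conversion rate (e.g. 0.04 for 4%)
    p2: the rate your hypothesis predicts for the variant
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_power = NormalDist().inv_cdf(power)          # desired statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

# The earlier hypothesis predicted a 10% relative lift: 4% -> 4.4%
print(sample_size_per_variant(0.04, 0.044))  # 39,473 visitors per variant
```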

Test duration—the length of time you let the test run—varies, too. If you can get enough visitors to test your variations on day one, should you? Nope. For a well-rounded, representative sample of insights, most experts recommend running your test for two to four weeks.
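Turning a required sample size into a run time is then simple arithmetic, plus a floor so the test covers full weekly cycles. A quick sketch with invented traffic numbers:

```python
import math

def estimated_duration_days(needed_per_variant: int, daily_visitors: int,
                            variants: int = 2, min_days: int = 14) -> int:
    """Days to reach the sample size, never shorter than about two weeks.

    The two-week floor smooths out weekday vs. weekend behavior swings.
    """
    days_for_sample = math.ceil(needed_per_variant * variants / daily_visitors)
    return max(days_for_sample, min_days)

# ~39,473 visitors per variant on a page with 7,000 mobile visits a day
print(estimated_duration_days(39_473, 7_000))  # 14: the two-week floor applies
```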

💡 Pro tip: you don’t need to wait until the test is over to peek at your results. Monitor your test for bugs or issues, and check out how a new design performs while the test is still in action.

By integrating your A/B testing platform with Hotjar—either through our dozens of native integrations or countless others through Zapier—you get a window into how users experience your test variants. 

Say you’re running a checkout flow A/B test in Optimizely. Head to the Recordings tool in Hotjar, and filter sessions based on your Optimizely experiment. Then, add additional filters, like the specific variant or user actions you want to see. 

Spot something interesting? With our Slack integration, send specific session recordings to your team members—and start an early conversation about your findings.

Hotjar’s Slack integration makes it simple to communicate with team members about A/B test results

6. Analyze your results

You’ve come full circle—from data analysis to, well, data analysis. 

Once your test has run its course, look at the performance metrics. Ask yourself questions like:

  • How did each of your variants perform? 

  • How do these results compare to each other? 

  • How do these results compare to your predicted outcome in your hypothesis?

  • Which version ‘won’? 
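On that last question: a standard two-proportion z-test is one common way to check whether the difference between variants is statistically significant rather than noise. A minimal Python sketch with hypothetical numbers (most testing platforms run a check like this for you):

```python
from statistics import NormalDist
import math

def two_proportion_p_value(conv_a: int, n_a: int,
                           conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # shared rate if there's no real difference
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical results: control converted 400/10,000, variant 460/10,000
print(round(two_proportion_p_value(400, 10_000, 460, 10_000), 3))
# ~0.036, below the conventional 0.05 significance threshold
```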

But don’t stop there. Take a closer look at user behavior through relevant heatmaps and recordings, and explore user opinions through feedback and surveys. These valuable qualitative insights show you why users prefer one page over another and help inform future iterations.

A maxim in CRO goes that quantitative data tells you what is happening but not why. It’s easy to forget that all those metrics and graphs are just a reflection of human behavior.

Johann Van Tonder

👀 How The Good used heatmaps and recordings as part of their A/B testing process to optimize mobile UX

While reviewing sessions in Hotjar Recordings, ecommerce conversion rate optimization (CRO) company The Good noticed its client Swiss Gear’s mobile users struggling with the site’s icons and language. So the team developed an alternate mobile design and began cycling through A/B testing.

Maggie Paveza, CRO Strategist at The Good, noted, “After each change, we ran heatmaps and session recordings on that area again to further understand how the results of the A/B test impacted user behavior.”

The team’s design iterations resulted in a simple menu-driven user interface (UI) design for the mobile homepage that promoted top filters. With a streamlined UX on category pages, they reduced the mobile user bounce rate on Swiss Gear’s site by 8% and increased time on site by 84%.

Hotjar Heatmaps showed The Good how Swiss Gear users experienced mobile content filters (left) and led to an elegant new design (the two designs on the right)

Create an effortless, user-centric mobile experience

Following these six A/B testing steps gives you a glimpse into how users experience your mobile site—so you can take the next steps to improve it. 

With results in hand, you’re ready to iterate and redesign to deliver a seamless mobile experience that delights your users and keeps them coming back for more.

