How to conduct a usability testing audit in 5 steps
Even tests must be put to the test. With a usability testing audit, you do just that: systematically inspect your processes to ensure they’re efficient and effective.
But how can you tell whether your go-to usability testing procedures make the grade?
In this guide, we help you understand what a usability testing audit is and why it’s important. You’ll also learn a five-step method to discover how to improve your usability testing and create a frictionless experience users will enjoy.
A step-by-step guide to conducting a usability testing audit
Your usability testing process might work—i.e. yield insights—but it still needs a check-up every now and then to improve its effectiveness.
A usability testing audit is an exploration and examination of the process you use to test your user experience. This ensures you collect the data you need to continuously improve your product or website for your users.
Through an audit, you might discover that you’re using an ineffective tool or your questions lead participants astray. And with each usability testing misstep you make, you miss out on opportunities to learn what users really want from your product.
1. Review your goal-setting process
To get the most out of usability testing, you have to begin with the end in mind: determine what you want to learn from the test so you can improve your UX.
During an audit, ask yourself these questions:
Did you communicate with stakeholders and team members? Gathering multiple perspectives—from board members to UX designers to marketers—before you plan and run your usability test helps you learn about common concerns and prioritize questions to ask.
Do you have a clear purpose for testing? Your purpose helps you choose a testing method later on—so it’s essential. Think about the input you received from stakeholders and team members when creating your goal. You may want to compare the performance of two different interfaces or find ways to reduce friction in your new checkout flow.
Did you establish success metrics from the get-go? In usability testing, you track task completion rate, the percentage of users who completed the activity successfully. But what percentage do you consider acceptable or desired? 78%? 85%? Determine your goal metric before you conduct a test.
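The arithmetic behind that metric is simple: successful completions divided by total participants. As a quick illustration (the numbers below are hypothetical), you could compare a test's result against the goal you set beforehand:

```python
def task_completion_rate(successes: int, participants: int) -> float:
    """Percentage of participants who completed the task successfully."""
    return 100 * successes / participants

# Hypothetical run: 12 of 15 participants completed the checkout task
rate = task_completion_rate(12, 15)
goal = 78  # the target percentage you chose before testing

print(f"Completion rate: {rate:.0f}% (goal: {goal}%)")
print("Goal met" if rate >= goal else "Goal missed")
```

The point isn't the code itself but the discipline: pick the goal number first, so the result tells you something instead of just confirming whatever happened.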
2. Consider your test design and tools
Once you know why you need to run the test, you can decide on a usability testing method and select the best tools.
The beauty of a usability testing audit is that it lets you re-evaluate your choices, so you feel confident in your testing method and software platforms.
Does your method suit your purpose? Moderated tests are led remotely or in person by a trained moderator, allowing for live follow-up questions; unmoderated tests—often the less expensive option—ask users to attempt tasks on their own, without guidance. (Not sure which to choose? 🤔 Unmoderated testing works well for spotting broad behavior patterns at scale, while moderated testing shines when you want a deep dive into an individual user's reasoning.)
Are you using the most effective tools for the job? You might find your usability testing tools lack the necessary features for an efficient and reliable testing experience. For example, if you frequently collect large sample sizes, a tool with filters for locating specific data quickly is a must.
💡 Pro tip: choose a tool that lets testers participate from the comfort of their own homes.
“With remote working becoming commonplace, we noticed a shift in mindset where both clients and participants prefer usability testing to be conducted remotely,” says Laura Paplauskaite, Founder and Head of User Experience at Bit Zesty.
For moderated remote tests: Hotjar Engage lets you conduct remote usability testing with interviews. You can find and pay your participants, and schedule and record your sessions. Plus, you get an automatic interview transcription to review later.
For unmoderated remote tests: Hotjar Recordings lets you see how users interact with your site or product. You can use User Attributes to home in on specific demographics or filter by frustration signals like rage clicks.
3. Assess how you write task scenarios
Writing a task scenario sounds simple—you just tell users what you want them to do with your site or product. But the success of your usability testing hinges on your prompts: your data will only be as good as the task scenarios you write.
Ask yourself these questions:
Is your task clear and complete? Follow the Goldilocks rule here: your task shouldn’t be too complicated…or too vague. It should tell users their goal and provide just enough context (why they want to complete the task) and details (like the dates or names you want them to input).
Do you avoid leading the participant? Don’t borrow directly from your user interface (UI) copy. Say your call-to-action (CTA) button reads ‘Start a conversation’. Your task shouldn’t include the same language (e.g. ‘Then, inquire to start a conversation with our sales rep’). Otherwise, you give testers a hint—and artificially inflate your task completion numbers.
Do you give directions or ask questions in an unbiased way? Use neutral language in your prompts and follow-ups. Avoid questions like, "What did you find difficult about this task?" This presumes the user found something difficult, which may influence their response (and your data).
4. Evaluate how you recruit participants
Your company likely caters to a very intentional, specific target audience—and your usability testing should reflect that. If you own a medical supply ecommerce store, it doesn’t make sense to recruit elementary school teachers or bankers for your test. Instead, you’d look for participants who reflect your ideal customer profile (ICP) of healthcare professionals and medical sales reps.
Do you ask the right screening questions? You may ask people demographic questions (age, profession, highest level of education, and household income). Or, you might ask questions about their habits or behavior, like, ‘When was the last time you purchased a product on your mobile device?’
Does your sample size make sense for your project? Recruit enough participants to find patterns in your data. When running moderated testing, you likely need at least five. (Tip: if your budget’s a limiting factor, look for tools with robust free plans. Hotjar’s free forever Basic plan gives you access to 35 free session recordings daily, along with up to 60 Engage interviews from your own network yearly.)
Are you selecting a diverse group of participants? Your test participants should reflect the diverse perspectives of your product’s actual users. For example, if you target marketers on an iPhone device, you could still diversify by choosing participants from different countries and age groups.
5. Examine your test procedure
Yep, we’ve finally arrived at the test itself. By the time you reach this point, most of the hard work is already done. But you still want to make sure you execute your test like a pro.
Did you conduct a pilot test? Before conducting live usability testing, run a full rehearsal with a co-worker. Choose someone who hasn’t been involved in planning the usability test—someone with an outside perspective. That way, you can identify confusing wording in your tasks and refine questions that don’t lead to helpful insights.
Do you encourage the user to think aloud? When a participant thinks aloud, you get precise insights into each click or scroll as they navigate your product. Otherwise, they might forget by the time they get to the post-test interview. Ask them to describe their actions and any associated thoughts or feelings in the moment.
Do you allow for follow-up questions? You likely write open-ended questions to ask after the test, such as, "What was your overall impression of ______?" Let yourself stray from the script to follow up on brief—or intriguing!—responses.
💡 Pro tip: don’t forget to discuss your usability testing results with your co-workers. Invite a cross-functional team to a Hotjar Watch Party 🍿 to watch recordings of users interacting with your website or product. Your teammates might spot key issues you missed, or suggest ideas for user-centric improvements.
Audit your usability testing process to put users first
A usability testing audit lets you tighten up each stage of your testing process—for now. Remember that an audit is an iterative process. For continuous improvement, you need to conduct one every time your website, testing tools, or users change.
Follow our five-step usability testing audit, and you’ll be well on your way to improving your testing process, gathering insightful data, and creating a smooth UX users love.