Last updated Jan 14 2020
How to run moderated usability testing
When you're doing user testing, it's important to standardize the process so you end up with consistent and reliable results.
In this chapter, we take a deep dive into moderated testing—arguably, one of the most complex testing methods—with the help of four experienced moderators who share their five-step process and help you consider all the variables involved. You’re going to be in good hands!
Conducting a moderated usability test
At the beginning of your usability testing process, you'll need to decide what usability testing method is right for you based on:
- Your research goals (what you want to achieve)
- Your resources (how much time and money you can invest)
- The audience you want to test
If you need a quick refresher, we’ve written a separate page about the usability testing methods and tools at your disposal.
One of the most thorough and in-depth methods for gaining user insight is moderated usability testing, during which a trained moderator observes the participants’ behaviors and interacts with them directly. This method is also one of the most complex ones because there are a lot of variables involved—from the location you pick to the moderator’s ability to get valuable answers from the participants.
To write the following section, we relied on the help of four veteran usability testers who talked us through their process to make sure you get the full picture: Els Aerts (founder of usability and conversion company AGConsult, with almost 20 years’ experience in the field), our editor Fio (who has run over 200 usability testing sessions since 2016), and our product designers Craig and Daniel (who are in charge of running usability testing at Hotjar).
A 5-step process for usability testing
We’re going to use an e-commerce website as our example throughout the page, but the points below apply if you want to test a prototype, a non-transactional website, or a product.
Step 1: planning the session
Planning the details of the usability testing session is, in some ways, the most crucial part of the entire process. The decisions you make at the start of the testing process will dictate the way you proceed and the results you end up with.
Determine the nature of your study
- The problems/areas you want to focus on: what is the purpose of the test? What areas of your e-commerce website would benefit the most from usability testing?
- The type of users you want to test: typically, these are representative of your user personas, but you may want to drill down more specifically on a certain segment (e.g., users who have completed a purchase in the past 30 days).
- The questions you want to ask: what are the specific questions you want to ask users about your website? What are you trying to find out? (Note: we go into this subject in depth in the usability testing questions chapter.)
Logistical details of your usability testing sessions
- Location: will you do the testing in your office? At a research lab? Over the internet?
- Timetable: when will you run the testing sessions? (This is particularly crucial if you are inviting participants to a research lab, which in turn means: you need to know when to book the lab.)
- Moderators: who will run the testing sessions? As we will see below, moderating user testing without influencing the results requires skill and practice, so you should consider either hiring trained moderators or arranging training for yourself.
- Recording setup: recording testing sessions gives you the chance to review them later and catch all kinds of data that the moderator might miss or not have time to record. If you decide to take advantage of video or audio recording, you'll need to be familiar with the equipment and its installation. In an ideal situation, you want to record the participants’ screens, their speech, and also their body language—all of which you can easily do in a testing lab.
Collect all this information in one centralized place (bonus points for creating a one-page template you can reuse multiple times), and use it as your main guide towards the next steps: recruiting participants and designing the actual session.
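To make the one-page idea concrete, here is a minimal sketch of what such a reusable planning template could look like. The field names and examples are just suggestions drawn from the points above, not a fixed format:

```
USABILITY TEST PLAN: [project / page name]

Purpose & focus areas:   e.g., checkout flow of the e-commerce site
Target participants:     e.g., users who purchased in the past 30 days
Key questions:           what are we trying to find out?

Location:                office / research lab / remote
Timetable:               session dates; lab booking deadline
Moderator(s):            who runs the sessions; training needed?
Recording setup:         screen + audio + body language; equipment check

Scenarios/tasks:         (filled in during step 3)
Compensation:            e.g., gift card or cash amount per participant
```

Filling in a copy of this sheet per study keeps the planning decisions in one place and gives you a checklist to hand to moderators and note-takers.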
Step 2: recruiting participants
Whom you recruit, and how, depends on your testing goals (for example, how much information you want and therefore how long your sessions need to be) and your budgetary constraints.
The most popular ways to find participants for your study:
- Hire an agency: if you're looking for a very specific subsection of the population (like web-savvy oncologists, or single mothers under 35), the most efficient way to find them is to hire a specialized recruitment agency. These companies have vast resources for finding desirable candidates and can do so very efficiently.
- Use your website: if you already have an established user base, recruit people there. Use a pop-up poll (Hotjar can help with this) to find users who are willing to participate.
Note: this is why you first need to take step 1 and plan how you’re going to run the test—an on-page poll can help you get in touch with volunteers from all over the world… but if you’re testing in a lab, only a tiny fraction of them will be able to join you.
- Use social media: if you have a social media following, use your social channels to reach out to potential participants.
- Recruit your clients: reach out to your clients/customers directly and ask if they would be willing to help (provided they’ve given you consent to be contacted for these initiatives; you don’t want to spam them unnecessarily!).
Pro insight from Fio: “you may think you are ‘bothering’ your customers and be hesitant to reach out, but I’ve often found that the opposite is true. People are generally flattered when you ask for their opinion, and genuinely curious to see how their thoughts can help you.”
However you recruit your participants, you'll need to compensate them for their efforts. Els recommends gift cards if you're based in the United States, and cash for most other parts of the world. The amount you pay is up to you—usually, anywhere between $30 and $50 for a non-specialist audience is acceptable.
Step 3: designing the task(s)
This step (designing the task) and the previous one (recruiting participants) really happen around the same time. Once you've worked out the why and how of your research, and while you wait for participant confirmation, it's time to design the test itself.
What this means: you’re going to carefully plan the specific scenarios you’ll take your participants through, and the tasks your participants will be required to complete, to guarantee clear and actionable results. Els does this by writing specific scenarios that provide a context for the testing tasks. For example, let’s say you’re testing an e-commerce shop that sells clothes:
Make a scenario like: “You've been invited to a theme party, and the theme is red. Everybody has to come dressed all in red. You look in your wardrobe, and you don’t have anything that would work. Well, it’s time to buy something red. How do you go about it?” So now your participants will go to the website, and you know that there is a product filter for ‘red’. Can they find the filter easily? Do they know they can use filters? Et cetera, et cetera. Then, you can just watch whether or not they use the red filter.
This scenario allows you to test the participants’ ability to use the filters on your website and is open-ended enough to apply to anyone's preferences. You want to avoid using scenarios that are too specific (e.g., asking a participant to pick out a milk foamer when they may not even like milk in their coffee).
Another pro tip: when designing scenarios, keep the most important functions of the website in mind. For an e-commerce website, the top task is usually buying something, so you would probably want to include a scenario that nudges the user through the purchasing process. Els recommends giving the tester real money to spend during the user test. “When they're spending their own money, they get a lot more critical,” she says. “We sometimes have to interrupt people, because they will happily spend 45 minutes choosing the right pair of shoes—much like they would do in real life.”
Step 4: running the session
When it’s time to conduct the usability testing session, you or your moderators should follow a set protocol with each participant. This protocol leaves some room for customization but still guarantees an overall standardized experience for each test subject.
Let’s assume you are running the session yourself: here is what you do.
Introductions and warm-up
If you’re doing an in-person test, make sure your participant is physically comfortable with the testing setup (chair, desk height, mouse placement, etc.) and that they understand what's going to happen during the session; if you’re doing it remotely, make sure they can hear you properly.
If you are recording the session (for example, because you want to review it later), ask for their permission at this point; if you are running in-person testing, you may even have a printed consent form that you ask them to sign.
To get your subject to loosen up, ask them some friendly conversational questions, such as how far they’ve traveled to get to the lab, if they’ve done user testing before, etc. For Craig, “the important thing for a new interviewee coming in is to feel relaxed and comfortable in the testing environment. A big goal of the first few minutes is getting to know each other, build rapport. This helps you make a smooth transition into the testing phase, when the tester hopefully doesn't even really realize that you're shifting gears, as you start to collect more specific information from them.”
Collect pre-testing data
During your conversation, collect demographic and psychographic information using predetermined questions. In the case of your e-commerce test, you might want to ask:
- When they last shopped online
- How often they’ve bought something online in the past 6 months
- How they generally go about finding the products they want
- What influences their buying decisions
Transition into the first task
Use the rapport you've built to transition the participant into the first testing task. You would usually have 3 or 4 scenarios you want to go through, but the order in which you complete them may depend on your participant's mood and skill level. This is where it pays off to be a trained moderator. Fio notes: “you must be able to sense if your participants are getting frustrated, which may indicate you need to switch them to an easier task to build their confidence; or you may also find the super-skilled participant who completes the task in a really short time, which is where you need to be good at probing and investigating why they did what they did.”
In an ideal scenario, you have a second person taking notes for you so you can be 100% focused on the relationship with your participant; when Daniel and Craig run user tests, they record the sessions and get them transcribed so they can later go through them and highlight relevant parts or sentences.
Follow-up questions and wrap-up
Reserve some time at the end of the session to ask any follow-up questions and collect the participant's final feedback. Be sure to thank them for their help.
Dos and don'ts of session moderation
There's an art to running a moderated test session that involves establishing a rapport with the subject and naturally guiding them through the tasks. Our four veteran testers gave us their top tips for being an effective moderator:
- Do use clear, neutral instructions. You have to make absolutely sure that your question is not open to interpretation.
- Don't write the tasks down—or, at the very least, don’t follow the task list verbatim, as it can give the proceedings too much formality. The participant will feel more at ease if you personalize the task and your wording based on the context.
- Do watch for verbal cues and body language. Sometimes users won't explicitly say they are confused, but a skilled moderator can tell by their actions. An example would be someone who was previously silent saying “Hmm . . .” or sighing in frustration.
- Don't speak too much. You want to interfere with the user's thought processes as little as possible: set the task, ask them to think aloud about what they’re doing and why, and quietly observe how they go about it.
- Do keep an even tone. Don't agree or disagree with the user too much; doing so might influence their final opinions (as in: they might try to ‘please’ you and tell you what they think you want to hear). Try to be as neutral as possible with your speech and body language.
- Don't take control of the task. As soon as the test starts, the user should be in total control. Never take their mouse or navigate a step for them.
- Do know the best ways to interject. If you do need to interrupt or answer a question, deflect as much as you can by using the echo, boomerang, or Columbo techniques that are explained in depth in the testing questions chapter.
- Don't look at their screen too much. Sometimes, observing the user too closely can influence their behavior. “I tend to just pretend I'm writing something,” Els says.
Step 5: analyzing the insights
Finally, after you've collected all your data, it's time to analyze the results and draw conclusions. Try to do this as soon as possible after testing so that the observations are fresh in your mind.
As you go over the data, pull out the most serious or frequent problems that users encountered for further examination. Daniel uses a color-coded system when he reviews his transcripts, so he can group similar situations together.
Don't address every single thing that went wrong; instead, prioritize the issues that were most problematic and need to be workshopped and resolved.
Read more: see the chapter on analyzing and evaluating usability testing data for a straightforward, 5-step approach.