UX analysis: best methods and key tools
User experience (UX) analysis helps you understand how users experience your website and product, so you can prioritize improvements and make the user experience as frictionless and intuitive as possible.
Last updated: 27 Sep 2021
What is UX analysis?
UX analysis is the process of collecting and evaluating data about how your users are experiencing and interacting with your product, then using that data to enhance the user experience.
Effective UX analysis will give you a series of actionable steps or changes to make to better address users’ desires and pain points within your product. Your product team can implement or test these changes to see improvements in conversion rates, brand loyalty, customer retention, and referrals.
UX analysis is generally concerned with two types of data:
Quantitative UX data: numerical and measurable metrics
Qualitative UX data: subjective user insights
What is quantitative UX data?
Quantitative data is numerical and measurable. Metrics and customer satisfaction rankings (like the ones listed below) give you valuable insight into the most common issues with your product, and their severity.
Quantitative data for UX includes:
Success rate: the percentage of users who complete a specified task, such as product onboarding, upgrading to a paid plan, or exploring a new feature
Error rate: the percentage of users who encounter an error or blocker—like a broken link, missing element, or confusing navigation—while trying to complete a task
Time to complete task: the average time users took to complete their task
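The three metrics above can be computed directly from task-attempt logs. Here's a minimal Python sketch, assuming hypothetical logged attempts with illustrative field names (not tied to any specific analytics tool):

```python
# Hypothetical task-attempt log: each entry records whether the user
# completed the task, whether they hit an error, and time spent (seconds).
attempts = [
    {"completed": True,  "error": False, "seconds": 42},
    {"completed": False, "error": True,  "seconds": 95},
    {"completed": True,  "error": False, "seconds": 58},
    {"completed": True,  "error": True,  "seconds": 120},
]

total = len(attempts)
# Success rate: % of users who completed the task
success_rate = sum(a["completed"] for a in attempts) / total * 100
# Error rate: % of users who hit an error or blocker along the way
error_rate = sum(a["error"] for a in attempts) / total * 100
# Time to complete: average time among users who finished the task
completed_times = [a["seconds"] for a in attempts if a["completed"]]
avg_time = sum(completed_times) / len(completed_times)

print(f"Success rate: {success_rate:.0f}%")        # 75%
print(f"Error rate: {error_rate:.0f}%")            # 50%
print(f"Avg time to complete: {avg_time:.0f}s")    # 73s
```

Note that a user can both complete a task and encounter an error on the way, which is why the two rates are counted independently.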
How to collect quantitative UX data
On-site surveys help you collect user ratings and feedback with text answers, radio buttons, and checkboxes. Ask people for ratings on a numerical scale as they use your product to collect quantitative data about their experience. Ask closed-ended questions like:
How satisfied are you (on a scale of 1-5) with our [product or service]?
How likely are you to recommend our product or service to others (on a scale of 0-10)?
How much effort (on a scale of 1-5) did it take to [complete a task] within our product?
Unfavorable scores on any of these metrics can indicate UX issues. At this stage, you won’t know what they are, but you will know they exist. The role of qualitative data is to help you identify UX issues more precisely—more on this later.
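The three closed-ended questions above map onto the standard CSAT, NPS, and CES scoring formulas. A small sketch, using made-up response data:

```python
# Hypothetical survey responses on the scales described above.
csat_responses = [5, 4, 3, 5, 4]       # satisfaction, 1-5
nps_responses = [10, 9, 7, 6, 3, 10]   # likelihood to recommend, 0-10
ces_responses = [2, 1, 3, 2]           # effort, 1-5 (lower is better)

# CSAT: % of respondents who answered 4 or 5 (satisfied)
csat_score = sum(s >= 4 for s in csat_responses) / len(csat_responses) * 100

# NPS: % promoters (9-10) minus % detractors (0-6)
promoters = sum(s >= 9 for s in nps_responses)
detractors = sum(s <= 6 for s in nps_responses)
nps_score = (promoters - detractors) / len(nps_responses) * 100

# CES: average reported effort
avg_effort = sum(ces_responses) / len(ces_responses)

print(f"CSAT: {csat_score:.0f}%")      # 80%
print(f"NPS: {nps_score:.0f}")         # 17
print(f"CES: {avg_effort:.1f}")        # 2.0
```

Tracking these scores over time matters more than any single reading: a dip after a release is a signal to dig into the qualitative data.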
User feedback widgets
Data from feedback widgets helps you identify elements and features that are blocking or causing frustration for your users, but at this stage, you won't know exactly how to improve them. You'll find out in the qualitative data section of this guide.
Heatmaps generate visual data to show what your users are doing on your web pages—where they click, how far they scroll, and what they look at. There are three types of heatmap, each giving you information on a different aspect of the user experience:
Move maps show where desktop users move their mouse on a web page, suggesting what they may be looking at.
Click maps show where users click (or tap, on mobile devices) on a page.
Scroll maps show how far down the page users scroll before leaving.
Analyzing heatmaps data helps you discover blockers in your user flow, but at this stage you won't understand why they're happening or how to fix them. That's where qualitative data can help.
What is qualitative UX data?
Qualitative data comes from subjective insight. Direct voice of the customer (VoC) feedback and verbal or written user insights tell you why customers behave a certain way, and help you identify and remove pain points.
Qualitative data for UX can reveal:
The drivers generating user interest in your product or service
The barriers that stop users from completing an action
The hooks that persuade users to convert
How to collect qualitative UX data
On- and off-site surveys
On-site surveys are short and simple, so they don't disrupt the user journey. They typically pop up or slide in from the edge of the page with 1-3 quick open-ended questions. You can trigger on-site UX surveys to appear only on some pages or after a certain action, which makes them perfect for gathering feedback on specific elements of your product.
Off-site surveys can be placed on a standalone page and are targeted to user segments to learn about their experience in detail. Use off-site surveys to ask a long series of open-ended questions covering the whole user experience to understand the user’s perspective in their own words.
Lab usability testing
Lab usability testing is all about watching real users as they interact with your product. Users complete tasks on computers or mobile devices while a trained moderator observes and notes areas of confusion and opportunities for optimization.
Lab usability tests are run under standardized conditions, making them useful comparison tests between product variations. However, these tests can be expensive and may not reflect your user base or real-life situations.
With the understanding of user behavior gained from lab usability tests, you can create and test product and design variations to optimize different parts of the user experience.
📚 Read more: learn about seven other methods of usability testing, when you should use them, and why.
Session recordings are renderings of real users' actions on your website from page to page, like mouse movement, clicks, taps, and scrolling. Session replays show you when and where users stumble, u-turn (return to a previous page), rage click (repeatedly click on the page), or exit.
Analyzing session recordings will help you identify UX issues, including blockers concerning functionality and accessibility, and show you how to fix them.
💡 Pro tip: if you're using Hotjar, connect feedback responses to Session Recordings to understand how what your users said relates to what they experienced.
What quantitative and qualitative data tell you
When you combine quantitative and qualitative data, you get a complete picture of the user experience:
Quantitative data helps you to identify what issues your users experience, and their degree of severity.
Qualitative data gives you an understanding of why and how these issues affect your users, and how to fix them.
For example, quantitative data might reveal that users interact with certain page elements, but they don't click your call to action (CTA) button. Once you’ve identified this issue, qualitative data can show you why users aren't taking the next step, and what you can do to persuade them.
Combine the quantitative powers of Google Analytics with the qualitative powers of Hotjar
Combine a traditional web analytics tool like Google Analytics (GA) with a product experience insights tool like Hotjar to connect the dots between what is happening on your site and why, so you can find out how to improve the user experience for your customers.
📚 Read more: learn how to use GA and Hotjar together to grow your business.
6 steps to UX analysis
Now that you’ve collected UX data through one or more of the methods mentioned above, it's time to make some sense of it: UX analysis is all about organizing your data and looking for patterns and recurring issues.
Here are six steps to UX analysis:
1. Identify user issues
When you first review your UX data you’ll be looking at hundreds, possibly thousands of data points. Quantitative data will help you identify the most common user issues.
Here are three examples of issues you might want to focus on:
Being unable to complete user onboarding
Not knowing how to upgrade to a paid plan
Difficulty accessing a product feature
2. Organize your UX data
Organize your UX data around issues your users encountered while performing certain tasks. These issues could include getting frustrated and abandoning a half-completed profile, not being able to process a payment, or feeling confused when navigating a product page. For each issue, ask:
What was the issue?
What actions did they take?
What feedback did they give?
Add categories and tags so you can sort and filter your data. For example:
Category (location): payment, onboarding, upgrading
Tag 1 (element): payment, icons, menu
Tag 2 (experience): confusion, disappointment, hesitation
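Once feedback is tagged this way, sorting and filtering becomes a one-liner. A sketch in Python, with hypothetical feedback records using the category and tag scheme above:

```python
# Hypothetical feedback records tagged with category (location),
# element, and experience, following the scheme described above.
feedback = [
    {"text": "Card declined twice",        "category": "payment",
     "element": "payment", "experience": "frustration"},
    {"text": "Couldn't find plan options", "category": "upgrading",
     "element": "menu",    "experience": "confusion"},
    {"text": "Icons unclear at checkout",  "category": "payment",
     "element": "icons",   "experience": "confusion"},
]

def filter_feedback(records, **tags):
    """Return records matching every given tag, e.g. category='payment'."""
    return [r for r in records if all(r.get(k) == v for k, v in tags.items())]

payment_issues = filter_feedback(feedback, category="payment")
confused_users = filter_feedback(feedback, experience="confusion")
print(len(payment_issues), len(confused_users))  # 2 2
```

In practice you'd run the same kind of filter inside a spreadsheet or your feedback tool; the point is that consistent tags make any slice of the data retrievable.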
3. Look for recurring issues
Your UX data and user feedback will point you towards common user issues. Analyze session recordings to understand why these issues exist, then:
Group issues involving the same tasks, like completing a profile, processing a payment, or navigating a menu.
Tally the number of users experiencing identical or closely related issues.
Look for patterns and repetition to identify recurring issues. If you notice that some users couldn’t find your support center and other users found it hard to find your email address, you might conclude that your contact details are hard to find.
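Tallying grouped issues is simple once each one has a label. A sketch using Python's `collections.Counter`, with hypothetical issue labels assigned during review:

```python
from collections import Counter

# Hypothetical issue labels assigned while reviewing sessions and feedback.
issues = [
    "contact details hard to find", "payment failed",
    "contact details hard to find", "onboarding unclear",
    "payment failed", "contact details hard to find",
]

# Rank issues by how many users reported or experienced them
for issue, count in Counter(issues).most_common():
    print(f"{count}x {issue}")
# 3x contact details hard to find
# 2x payment failed
# 1x onboarding unclear
```

The top of this list is your shortlist of recurring issues to investigate further with session recordings.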
4. Prioritize fixes
Now that you have a list of issues with your product and know what's causing friction in the user experience, it's time to categorize and prioritize fixes.
Use a system like this one:
Critical: users find it impossible to complete tasks
Serious: users are frustrated with their experience and are quitting their tasks
Minor: users are annoyed, but not enough to quit
Decide which UX metrics you want to prioritize. For example, if your metric is user retention, being unable to complete payments is more urgent than disliking the product design.
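One simple way to turn the severity tiers and issue counts into a ranked backlog is to score each issue as severity weight times users affected. The weights and numbers below are illustrative assumptions, not a standard:

```python
# Illustrative severity weights for the critical/serious/minor tiers above.
SEVERITY_WEIGHT = {"critical": 5, "serious": 3, "minor": 1}

issues = [
    {"name": "payment fails",          "severity": "critical", "users": 100},
    {"name": "profile abandoned",      "severity": "serious",  "users": 90},
    {"name": "confusing product page", "severity": "minor",    "users": 50},
]

# Score = severity weight x number of users affected
for issue in issues:
    issue["score"] = SEVERITY_WEIGHT[issue["severity"]] * issue["users"]

ranked = sorted(issues, key=lambda i: i["score"], reverse=True)
for issue in ranked:
    print(f'{issue["score"]:>4}  {issue["name"]}')
#  500  payment fails
#  270  profile abandoned
#   50  confusing product page
```

Adjust the weights to match the metric you're prioritizing: if retention is the goal, weight payment and onboarding blockers more heavily than cosmetic complaints.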
5. Share your findings and recommendations
You’ve evaluated your UX data and prioritized the most urgent issues. Now it's time to compile a report and share your findings with your product team so they can start testing improvements.
A good UX analysis report should:
Highlight the most urgent issues
Be specific about the nature of each issue
Include evidence like videos, screenshots, and transcripts
Recommend solutions that are effective and efficient
Include positive findings to let your team know what’s working well
6. Build and test new features
The Lean UX model is a three-phase approach for processing UX feedback:
THINK: product teams brainstorm possible areas of improvement based on the findings of your UX analysis
MAKE: product designers and developers build a new feature to solve a user problem
CHECK: product teams test the new feature with surveys and session recordings to figure out if users respond well to it. In the CHECK phase, you'll test changes and fixes to learn whether they work, see how users respond, and refine your approach.
Then, rinse and repeat.