User feedback: how to collect and measure it year-round + user feedback tools and examples
When we first launched Hotjar, success was anything but guaranteed. We were completely unknown, didn’t have a cent of outside funding, and were facing some well-known competitors.
Five years later, we’ve gone from €0 to €17 million in Annual Recurring Revenue and Hotjar has been installed on over 670,000 sites. And a huge factor in reaching this level of growth has been our commitment to staying close to our customers.
In this article, we share the exact methods we use to stay in touch with our customers year-round, and how we collect and measure user feedback to constantly improve our website and product experience.
✏️ Note: we are a Software-as-a-Service (SaaS) company, but the techniques we cover in this post are just as applicable to e-commerce and lead generation businesses.
Table of contents
What is user feedback?
Why user feedback is so important at Hotjar
6 types of user feedback (and our favorite tools to collect it)
Get started: 5 best questions to ask for effective user feedback
What is user feedback?
User feedback is information collected directly from users/customers about their reactions to a product, service, or website experience. This feedback is collected with a variety of tools, such as Customer Satisfaction (CSAT) or Net Promoter Score (NPS) surveys. User feedback and insight are used by UX designers, researchers, and marketers to improve the user experience.
Why user feedback is so important at Hotjar
For us, getting close to our users and even 'obsessing' about them isn’t just an empty phrase: it’s what drives everything we do.
It’s what leads our product team to go through every single piece of incoming customer feedback every month to spot trends and make improvements
It’s why our CEO David Darmanin sat down with a glass of wine and a box of Belgian chocolate to read and analyze over 3,000 open-ended responses to a customer survey in one evening, to understand what makes our customers happy
And it’s what led our Director of Customer Experience to take a customer’s request for a hand-drawn dinosaur seriously and send this reply:
It’s a huge investment of time, for sure—but it’s the most important one we could make for the long-term growth of Hotjar.
6 types of user feedback (and our favorite tools to collect it)
The way we collect user feedback breaks down into three stages with six touchpoints total:
The proactive touchpoints are where we go out of our way to find out how our users are doing early on in their experience with Hotjar and discover what we could be doing better
The reactive touchpoints are triggered by a specific interaction with us, such as a support ticket or when someone downgrades their account
On-demand is how we allow our users to reach out at any point in their experience with us. For all three stages, our main focus is on the qualitative side of the experience (i.e. what our customers’ experiences, WOW moments, and frustrations were, in their own words) as opposed to the quantitative one (measuring usage of the product, how long users stay with us, etc.).
Here is a breakdown:
1. Point-of-conversion survey
When: within seconds of purchase
Where: on page
How: Hotjar script tag or page-specific survey
We show the post-purchase survey to new customers seconds after they sign up for a paid plan. We ask people how their payment experience was on a scale from 1 (hated it) to 5 (loved it) while the credit card is still in their hands and the experience is as fresh as possible.
This type of survey helps us uncover direct insights for improving the experience—and since our customer just converted, these responses are as qualified as you can get.
Most powerfully, we can use our own Session Recording tool to watch session recordings of people who had a bad experience.
WATCHING SESSION RECORDINGS OF USERS WHO HAD A NEGATIVE EXPERIENCE IS ONE OF THE ULTIMATE FORMS OF CUSTOMER EMPATHY
The point-of-conversion survey allows us to uncover the exact negative experiences that almost stopped our customers from purchasing. When customers sign up for a new product there are usually some pain points in the onboarding process. By fixing these issues, we’re able to make the experience smoother and more positive for everyone else.
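If you want to act on these scores in bulk, here is a minimal sketch (in Python) that flags low post-purchase scores so you can pull up the matching session recordings. It assumes a hypothetical CSV export with user_id, score, and comment columns; your survey tool’s actual export format may differ.

```python
import csv

# Flag low post-purchase scores (1 or 2 on the 1-5 scale) so someone
# can look up the matching session recordings. The column names
# (user_id, score, comment) are hypothetical.
def sessions_to_review(path, threshold=2):
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if int(row["score"]) <= threshold:
                flagged.append((row["user_id"], row["comment"]))
    return flagged

for user_id, comment in sessions_to_review("post_purchase_survey.csv"):
    print(f"Review recording for {user_id}: {comment!r}")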
✏️ Read more in-depth details about the exact list of questions we ask + learn how to set up your own point-of-conversion survey.
2. Customer Effort Score (CES)
When: 1-2 weeks after signup, once the customer has started using a key feature
Where: on-page survey
How: Hotjar script tag or page-specific poll
Customer Effort Score (CES) measures how much effort it took to complete a task on a scale of 1-5 or 1-7. It's often used to measure support interactions, but at Hotjar we use it to measure our tool's ease of use.
The first time a user tries a new feature like Heatmaps, Recordings, or Surveys, we ask them to what extent they agree that Hotjar is easy to use, on a 1-7 scale:
We set the survey to show up only once, regardless of whether the user answers or not.
Here are the results from our CES:
1 = HOTJAR IS DIFFICULT TO USE, 7 = HOTJAR IS EASY TO USE
At first glance, the results look good, with 47% of our customers giving the highest score possible.
But here’s how we look at it:
Almost 50% of our customers gave us a 4, 5, or 6. That’s half of our customers not having an optimal experience—and a huge opportunity for us to improve it for them.

(Side note: the 1s, 2s, and 3s only make up around 3% of responses, which means we’re doing a pretty good job of not having a terrible experience. The overall impact of helping them would be minimal compared to bumping more people up to a 6 or a 7.)

Again, session recordings play a super valuable role, for example, to view sessions where a user said using Heatmaps was difficult.

We’ve also gotten feedback through our CES survey that has allowed us to build an even better Hotjar:
ACTUAL FEEDBACK WE'VE GOTTEN FROM OUR USERS VIA OUR CES SURVEYS
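To reproduce this kind of read-out yourself, here is a minimal sketch of the distribution math, assuming you have the raw 1-7 scores as a plain list. The sample numbers below are made up to mirror the rough shape described above.

```python
from collections import Counter

# Share of respondents at each point on the 1-7 CES scale.
def ces_distribution(scores):
    counts = Counter(scores)
    total = len(scores)
    return {score: round(100 * counts[score] / total, 1) for score in range(1, 8)}

# Made-up sample shaped like the results above: mostly 7s,
# a large middle band of 4-6s, and very few 1-3s.
sample = [7] * 47 + [6] * 25 + [5] * 15 + [4] * 10 + [3, 2, 1]
for score, share in sorted(ces_distribution(sample).items(), reverse=True):
    print(f"{score}: {share}%")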
Practical example: At one point, we noticed a trend across lower CES scores that caught our attention: quite a number of our customers were complaining about their experience with Hotjar’s Recordings.
So our product team started having 1-on-1 interviews with these customers (and that's another type of user feedback for you!). That helped the team map out the experience a typical customer might have while using Recordings:
A CUSTOMER'S POTENTIAL EXPERIENCE WITH HOTJAR RECORDINGS
Through this mapping exercise, we realized there were more potentially negative experiences than positive ones with the tool. So we added better filtering and segmentation of our users’ recordings as a direct result—all thanks to the responses from our CES survey.
3. Net Promoter Score® (NPS)
When: 30 days after signing up for a free trial and 15 days after becoming a paid user, repeated every year
Where: Hotjar dashboard
How: Hotjar poll/script tag/customer attribute
After our customers have converted, been onboarded, and started using the product more and more, we dive deeper into their experience using a Net Promoter Score (NPS®) survey.
The goal with NPS is to learn whether our customers would actually recommend us—and if not, what’s holding them back. Since referrals and word-of-mouth account for 40% of new signups for us, getting the answers to the right questions is essential to our success.
We calculate the score and process the responses once a month, then present the results to the whole team.
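For reference, the score itself is standard NPS arithmetic: respondents answering 9-10 are promoters, 0-6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch, with a made-up batch of responses:

```python
# Standard NPS arithmetic: % promoters (9-10) minus % detractors (0-6).
# Passives (7-8) count toward the total but not the score.
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

print(nps([10, 9, 9, 8, 7, 10, 6, 9, 10, 3]))  # -> 40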
But for us, the most valuable part of NPS isn’t the score: it's the responses we get to the follow-up questions, where we ask people what we should do to improve their experience. That's where we get detailed, actionable answers that end up impacting our product development roadmap:
An NPS survey helps us learn what expectations people have of Hotjar that we aren’t meeting and how we can improve. That kind of feedback is absolute gold for building a customer-centric product.
Practical example: In 2018, our Product team came up with a plan to update one of our tools. Our CEO believed the plan was becoming too complicated—but rather than voicing his personal opinion, he pointed to our NPS results, which consistently show that simplicity and usability are among our main selling points. It was as simple as that—the team, empowered by customer feedback, went back to the drawing board.
Out of the three proactive touchpoints—point-of-conversion, Customer Effort Score, and NPS—if there’s only one you can do, it’s NPS. It reaches the most qualified audience (the people who have stuck around), and it helps you understand what to improve to increase your referrals. It’s really, truly the most interesting of the three.
✏️ Read more: we wrote an entire guide to Net Promoter Score if you want to find out more about where and how to implement it.
4. Customer Satisfaction Survey (CSAT)
When: when closing a customer support ticket or at the end of a customer success call
Where: email
How: Zendesk and GoToWebinar add-ons
At Hotjar, we’ve always believed that the service we provide on top of the product we sell is just as important as the product itself—maybe even more so, because with a digital product, the service is the only human element you have.
That’s why we strive to create as many WOW moments as possible during our customer interactions:
And since these positive interactions often lead to word-of-mouth referrals, the effort pays off. So it’s critical that we understand how well our service interactions with our customers are going: and for that, we use a CSAT survey.
24 hours after a support ticket has been solved or a customer success call has ended, we send an email using Zendesk or GoToWebinar asking our customers to rate their experience with our support team:
OUR CSAT SURVEY FLOW
We track the score every week and note down actual responses that stood out:
A SLIDE FROM THE SUPPORT TEAM’S WEEKLY REPORT
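The weekly number itself is simple arithmetic: the share of positive ratings among all rated interactions. A minimal sketch, assuming Zendesk-style good/bad ratings (the input format here is our assumption, not the exact export):

```python
# Weekly CSAT: the share of positive ratings among all rated tickets.
def weekly_csat(ratings):
    positive = sum(1 for r in ratings if r == "good")
    return round(100 * positive / len(ratings), 1)

print(weekly_csat(["good"] * 49 + ["bad"]))  # -> 98.0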
As with the previous touchpoints, it’s not about the score so much as the follow-up responses. And for CSAT, we make sure to connect the feedback with what we’re hearing from the other touchpoints.
Practical example: Our average response time might hit 25 hours or more in a given week. Not terrible, but certainly not great. But the CSAT score for the week could still be 98%. And since our NPS feedback isn’t pointing to any big problems with response time, we know we don’t need to double down on improving it right now.
Understanding these customer satisfaction metrics also allows us to discover where we need to improve things. What are the bottlenecks? What are the barriers on our side that are keeping us from providing an even better customer support experience? Keeping a finger on the pulse of our customer interactions using CSAT lets us track whether the changes we’re making are having a positive impact or not.
We also use the CSAT format in our help articles. At the end of each, there's a yes or no question asking if the article was helpful:
If people click "No," an on-page survey pops up to ask "How can we make this Help section useful?" The answers are great for optimizing our documentation and encouraging users to self-serve as much as possible.
5. Retention survey
When: immediately when a customer downgrades
Where: Hotjar dashboard
How: embedded survey
Having a retention survey (aka a survey that asks why a customer downgraded their paid plan) is something we’ve found to be critical, especially at times when the number of customers downgrading is rising or falling and we don’t have a clue why.
THE SURVEY WE SHOW TO ALL CUSTOMERS WHO DOWNGRADE
We’ve learned so much from this survey that we made it a requirement to fill it out before a customer downgrades. It’s powerful to uncover the reasons in our customers' own words and understand what we need to do to make it likelier that they’ll stick around longer next time.
ACTUAL CUSTOMER FEEDBACK WE'VE RECEIVED FROM OUR RETENTION SURVEY
We also make sure to ask downgrading customers whether they are likely to upgrade again in the future:
This lets our product team give more weight to feedback from people who are likely to come back. It also plays a big part in defining our product roadmap and deciding on what we’ll build next.
What was interesting with our retention survey was that 61% of the answers came from dormant users: people who love Hotjar but just aren’t using it right now and will reactivate at another point. That was a big surprise, and it shows how much potential we have in helping our users get more value out of Hotjar on an ongoing basis rather than just project-to-project.
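If you want to run the same kind of breakdown on your own retention survey, here is a minimal sketch that tallies downgrade reasons and reports each category’s share. The category labels and numbers are hypothetical, chosen to mirror the split described above:

```python
from collections import Counter

# Tally downgrade reasons and report each category's share of responses.
def downgrade_breakdown(reasons):
    counts = Counter(reasons)
    total = len(reasons)
    return {reason: round(100 * n / total) for reason, n in counts.most_common()}

# Hypothetical sample mirroring the 61% dormant-user split.
sample = ["dormant"] * 61 + ["too expensive"] * 20 + ["missing feature"] * 19
print(downgrade_breakdown(sample))
# -> {'dormant': 61, 'too expensive': 20, 'missing feature': 19}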
6. Ongoing user feedback
When: always on
Where: in-app everywhere
How: Hotjar’s Incoming Feedback
This last one is the catch-all website feedback tool that lets our customers give us feedback whenever and wherever they want. Its real value is how in-the-moment it is—especially when it’s coming from qualified customers. Allowing detractors to walk you through their pain points or what went wrong can provide valuable (if painful) insight into your product/service’s functionality and how to improve it.
We know from our own negative experiences using websites that when something annoys us, we don’t want to fill out an online feedback form or open a support ticket: we want to let someone know right away. And if we can’t, we leave. Probably upset, and not very likely to recommend the company.
We don’t want our customers to do that, which is why we use Incoming Feedback—a small widget that lets users immediately give feedback in real-time at any point in their experience:
INCOMING FEEDBACK IN ACTION
The responses we’ve gotten from Incoming Feedback have been critical in helping us uncover what we need to fix right away to keep our customers happy.

Practical example: During one of our product team’s monthly investigations, they spotted negative feedback from users whose heatmaps were not showing up correctly:
THE BIG BLACK SQUARE IS WHERE A HEATMAP SHOULD BE
This alerted the team to a possible issue. They quickly dug into the other feedback touchpoints, and found support tickets of people giving negative CSAT scores and complaints in the NPS survey. Watching recordings further confirmed the issue—so they jumped on it to solve it as quickly as possible.
Our support team also combs through the responses and replies directly to people having issues:
INCOMING FEEDBACK RESPONSES
We also use Incoming Feedback to track the overall user experience, with all its ups and downs:
INCOMING FEEDBACK RESULTS
Incoming Feedback is really important for us at Hotjar, especially since we’re making changes often. It’s good to move fast and break things, but you have to give people a way to tell you when you’ve broken something, and then you have to act on it.
Get started: 5 best questions to ask for effective user feedback
As you've seen so far, we use six methods to measure and improve our users’ experience year-round, but you don't have to use them all right away—in fact, you can get started on a much smaller scale to find out:
What’s bringing people to your website
What’s stopping them from converting
Why some users are converting
You can find out this critical information by setting up an on-page survey with Hotjar and asking your visitors a handful of carefully chosen questions.
1. How can we make this page better?
Where to ask: any business-critical website page, especially pages with high exit rates
What you’ll get: a list of problems you may not have known you had, with potential solutions directly suggested by your visitors.
2. Where did you first hear about us?
Where to ask: homepage, landing page
What you’ll get: better insight into where customers hear about you that traditional analytics can’t track (especially word-of-mouth referrals or offline campaigns).
3. Why are you looking for [product or service] today?
Where to ask: product pages, pricing page
What you’ll get: a clearer understanding of why customers are interested in your product/service. You also get to collect feedback in your users’ own words that you can re-use in your product marketing copy.
4. What, if anything, is stopping you from [taking action] today?
Where to ask: an exit survey shown when users are leaving the site, or an abandoned cart follow-up
What you’ll get: understanding of technical issues with your purchase process and which objections you need to better address on the page.
5. What persuaded you to [take action] today?
Where to ask: post-purchase survey or on a success page
What you’ll get: your unique selling points (USPs) in your users’ words when the experience is fresh in their minds
📚 Read more: if 5 questions are not enough, check out this 2019 collection of 28 customer feedback questions.

🏆 Pro tip: the 5 examples above are open-ended questions, which can be answered in-depth and allow for original, unique, and potentially lengthy responses. We recommend going through every single response manually (it's time-intensive, but helps you empathize with your customers), and here is a handy guide to help you analyze answers to open-ended questions with a simple spreadsheet.
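If a spreadsheet starts to feel cramped, the same manual-tagging approach can be sketched in a few lines of code: tag each response with keyword-based themes, then tally the results. The themes and keywords below are hypothetical; you’d grow them as you read through responses:

```python
from collections import Counter

# Hypothetical themes and trigger keywords; grow these as you read.
THEMES = {
    "pricing": ["price", "expensive", "cost"],
    "ease of use": ["easy", "simple", "intuitive"],
    "performance": ["slow", "lag", "loading"],
}

def tag_response(text):
    text = text.lower()
    matches = [theme for theme, words in THEMES.items()
               if any(word in text for word in words)]
    return matches or ["untagged"]

tally = Counter()
for response in ["So easy to set up!", "A bit expensive for a small team"]:
    tally.update(tag_response(response))
print(tally)  # -> Counter({'ease of use': 1, 'pricing': 1})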
One final thought:
Although collecting and analyzing feedback can seem time-consuming, a single piece of feedback can change your product design or website for the better. But tracking the responses from each touchpoint won't mean much if you don't act on the feedback and insights. Building a truly customer-centric business is not about how much feedback you collect—it's what you do with it that counts.
Net Promoter, Net Promoter System, Net Promoter Score, NPS, and the NPS-related emoticons are registered trademarks of Bain & Company, Inc., Fred Reichheld, and Satmetrix Systems, Inc.