How to collect and measure user feedback year-round: examples and tools we use

March 5, 2019 by Hotjar team

When we first launched Hotjar, success was anything but guaranteed. We were completely unknown, didn’t have a cent of outside funding, and were facing some well-known competitors.

Five years later, we’ve gone from €0 to €17 million in Annual Recurring Revenue and Hotjar has been installed on over 670,000 sites. And a huge factor in reaching this level of growth has been our commitment to staying close to our customers.

In this article, we share the exact methods we use to stay in touch with our customers year-round, and how we collect and measure user feedback to constantly improve our website and product experience.



✏️ Note: we are a Software-as-a-Service (SaaS) company, but the techniques we cover in this post apply just as well to e-commerce and lead generation businesses.


What is user feedback? 

User feedback is information collected from users/customers about their reactions to a product, service, or website experience. Feedback and insight from customers and website visitors is used by UX designers, researchers, and marketers to improve the user experience.

Why user feedback is so important at Hotjar

For us, getting close to our users and even 'obsessing' about them isn’t just an empty phrase: it’s what drives everything we do.



It’s what led our Director of Customer Experience to take a customer’s request for a hand-drawn dinosaur seriously and send this reply:



It’s a huge investment of time, for sure—but it’s the most important one we could possibly make for the long-term growth of Hotjar.

6 types of user feedback (and our favorite tools to collect it) 

The way we collect user feedback breaks down into three stages with six touch points total:


  1. The proactive touch points are where we go out of our way to find out how our users are doing early on in their experience with Hotjar, and to discover what we could be doing better.

  2. The reactive touch points are triggered by a specific interaction with us, such as a support ticket or someone downgrading their account.

  3. The on-demand touch point is how we give our users the opportunity to reach out at any point in their experience with us.


For all three stages, our main focus is on the qualitative side of the experience (i.e., our customers’ experiences, WOW moments, and frustrations in their own words) as opposed to the quantitative one (product usage metrics, how long users stay with us, etc.).


Here is a breakdown:

1. Point-of-conversion survey

  • When: within seconds of purchase
  • Where: on page
  • How: Hotjar script tag or page-specific poll

We show the post-purchase survey seconds after our users sign up for a paid plan. We ask people how their payment experience was on a scale from 1 (hated it) to 5 (loved it), while the credit card is still in their hands and the experience is as fresh as possible.

This type of survey helps us uncover direct insights for improving the experience—and since our customer just converted, these responses are as qualified as you can get.


Most powerfully, we can use our own Recordings tool to watch session recordings of people who had a bad experience.

Watching session recordings of users who had a negative experience is one of the ultimate forms of customer empathy.

The point-of-conversion survey allows us to uncover the exact negative experiences that almost stopped our customers from purchasing. By fixing these issues, we’re able to make the experience smoother and more positive for everyone else. 

David Darmanin - CEO at Hotjar

✏️ Read more in-depth details about the exact questions we ask + learn how to set up your own point-of-conversion survey.

2. Customer Effort Score (CES)

  • When: 1-2 weeks after signup, once the customer has started using key features
  • Where: on-page survey
  • How: Hotjar script tag or page-specific poll

Customer Effort Score (CES) measures how much effort it took to complete a task on a scale of 1-5 or 1-7. It's often used to measure support interactions, but at Hotjar we use it to measure our tool's ease of use.


The very first time a user creates a new Heatmap, Recording, Poll, or Survey, we ask them to what extent they agree that Hotjar is easy to use, on a 1-7 scale:



We set the poll to show up only once, regardless of whether the user answers it or not.


Here are the results from our CES:

1 = Hotjar is difficult to use, 7 = Hotjar is easy to use

 
At first glance, the results look good, with 47% of our customers giving the highest score possible.


But here’s how we look at it:



Almost 50% of our customers gave us a 4, 5, or 6. That’s half of our customers not having an optimal experience—and a huge opportunity for us to improve it for them. 

 


(Side note: the 1s, 2s, and 3s only make up around 3% of responses, which means we’re doing a pretty good job of not having a terrible experience. And the overall impact of helping them would be minimal compared to bumping more people up to a 6 or a 7.)
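The way we bucket CES responses can be sketched in a few lines of Python. The grouping mirrors the breakdown above (1-3, 4-6, and 7); the sample scores are illustrative, not our real data:

```python
from collections import Counter

def ces_summary(responses):
    """Group 1-7 Customer Effort Score responses into three buckets:
    struggling (1-3), room to improve (4-6), and optimal experience (7)."""
    counts = Counter(responses)
    total = len(responses)

    def pct(scores):
        # Share of responses falling into the given bucket, as a percentage.
        return round(100 * sum(counts[s] for s in scores) / total, 1)

    return {"1-3": pct([1, 2, 3]), "4-6": pct([4, 5, 6]), "7": pct([7])}

# Illustrative sample, not real survey data:
print(ces_summary([7, 7, 6, 5, 7, 4, 2, 7, 6, 7]))
# → {'1-3': 10.0, '4-6': 40.0, '7': 50.0}
```

Splitting the distribution this way makes the "half of our customers gave us a 4, 5, or 6" insight jump out immediately, instead of hiding it behind a single average score.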


Again, session recordings play a valuable role here: for example, we can view the sessions of users who said using Heatmaps was difficult.

We’ve also gotten feedback through our CES survey that has allowed us to build an even better Hotjar:

Actual feedback we've gotten from our users via our CES surveys
 


Practical example:
At one point, we noticed a trend across our lower CES scores: quite a few customers were complaining about their experience with Hotjar’s Recordings.


So our product team started having 1-on-1 interviews with these customers (and that's another type of user feedback for you!). That helped the team map out the experience a typical customer might have while using Recordings:

A customer's potential experience with Hotjar Recordings


Through this mapping exercise, we realized there were more potentially negative experiences than positive ones with the tool. So we added better filtering and segmentation of our users’ recordings as a direct result—all thanks to the responses from our CES survey.

3. Net Promoter Score® (NPS)

  • When: 30 days after signing up for a free trial and 15 days after becoming a paid user, repeated every year
  • Where: Hotjar dashboard
  • How: Hotjar poll/script tag/customer attribute

Once our customers have converted, been onboarded, and started to use the product more and more, we dive deeper into their experience using a Net Promoter Score (NPS®) survey.



The goal with NPS is to learn whether our customers would actually recommend us—and if not, what’s holding them back. Since referrals and word-of-mouth account for 40% of new signups for us, getting the answers to these questions is essential to our success. 


We calculate the score and process the responses once a month, then present the results to the whole team.

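For reference, the standard NPS calculation is the percentage of promoters (9-10 on the 0-10 "how likely are you to recommend us?" scale) minus the percentage of detractors (0-6). A minimal sketch in Python, with made-up scores:

```python
def nps(scores):
    """Standard Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count toward the total but toward neither group."""
    total = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / total)

# Made-up responses for illustration:
print(nps([10, 9, 9, 8, 7, 10, 6, 3, 9, 10]))  # → 40
```

The resulting number can range from -100 (all detractors) to +100 (all promoters), which is why it’s tracked as a score rather than a percentage.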



But for us, the most valuable part of NPS isn't the score: it's the responses we get to the follow-up questions, where we ask people what we should do to improve their experience. That's where we get detailed, actionable answers that end up impacting our product development roadmap:

NPS survey examples


 

An NPS survey helps us learn what expectations people have of Hotjar that we aren’t meeting and how we can improve. That kind of feedback is absolute gold for building a customer-centric product.

 

Practical example: 
In 2018, our Product team came up with a plan to update one of our tools. Our CEO believed the plan was becoming too complicated, but rather than voicing his personal opinion, he pointed to our NPS results, which consistently show that simplicity and usability are among our main selling points.
It was as simple as that: empowered by customer feedback, the team went back to the drawing board.

Out of the three proactive touch points—Point-of-conversion, Customer Effort Score, and NPS—if there’s only one you can do, it’s NPS. It’s the most qualified feedback, coming from the people who have stuck around, and it really helps you understand what to improve to increase your referrals. This is really, truly the most interesting one.

David Darmanin - CEO at Hotjar


✏️ Read more: we wrote an entire guide to Net Promoter Score if you want to find out more about where and how to implement it. 

4. Customer Satisfaction Survey (CSAT)

 

  • When: when closing a customer support ticket or at the end of a customer success call
  • Where: email
  • How: Zendesk and GoToWebinar add-ons

At Hotjar, we’ve always believed that the service we provide on top of the product we sell is just as important as the product itself—maybe even more so, because with a digital product, the service is the only human element you have.


That’s why we strive to create as many WOW moments as possible during our customer interactions:



 

And since these positive interactions often lead to word-of-mouth referrals, the effort really pays off. So it’s critical that we understand how well our service interactions with our customers are going, and for that, we use a CSAT survey.


24 hours after a support ticket has been solved or after a customer success call, we send an email via Zendesk or GoToWebinar asking our customers to rate their experience with our support team:

Our CSAT survey flow


We track the score on a weekly basis and note down actual responses that stood out:

A slide from the support team’s weekly report
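The weekly score itself is straightforward to compute: Zendesk-style CSAT records each response as positive or negative, and the score is simply the share of positive ratings. A minimal sketch (the sample data is made up):

```python
def csat(ratings):
    """CSAT as the percentage of positive ('good') ratings out of all ratings."""
    good = sum(1 for r in ratings if r == "good")
    return round(100 * good / len(ratings), 1)

# Made-up week of ratings: 49 positive, 1 negative
print(csat(["good"] * 49 + ["bad"]))  # → 98.0
```

A week at 98% with one negative ticket shows why the follow-up comments matter more than the score: the number alone can’t tell you what went wrong for that one customer.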

As with the previous touch points, it’s not about the score so much as the follow-up responses. And for CSAT, we make sure to connect what we hear to the feedback we’re getting from the other touch points.



Practical example: 
Our average response time might hit 25 hours or more in a given week. Not terrible, but certainly not great. But the CSAT score for the week could still be 98%. And since our NPS feedback isn’t pointing to any big problems with response time, we know we don’t need to double down on improving it right now.

Understanding these customer satisfaction metrics also allows us to discover where we need to improve. What are the bottlenecks? What are the barriers on our side that are keeping us from providing an even better customer support experience?

Keeping a finger on the pulse of our customer interactions using CSAT lets us track whether the changes we’re making are having a positive impact or not. 

David Darmanin - CEO at Hotjar

We also use the CSAT format on our help articles. At the end of each, there's a yes or no question asking if the article was helpful:

If people click "No," an on-page survey pops up to ask "How can we make this Help section useful?" The answers are great for optimizing our documentation and encouraging users to self-serve as much as possible.

5. Retention survey

  • When: immediately when a customer downgrades
  • Where: Hotjar dashboard
  • How: embedded survey

Having a retention survey (i.e., a survey that asks why a customer downgraded their paid plan) is something we’ve found to be critical, especially at times when the number of customers downgrading is changing and we don’t have a clear idea why.


The survey we show to all customers who downgrade

In fact, we’ve learned so much from this survey that we made it a required step before a customer can downgrade. It’s powerful to uncover the reasons in our customers’ own words and understand what we need to do to make it more likely that they’ll stick around next time.

Actual customer feedback we've received from our retention survey


We also make sure to ask downgrading customers whether they are likely to upgrade again in the future:


 

This lets our product team give more weight to feedback from people who are likely to come back. It also plays a big part in defining our product roadmap and deciding on what we’ll build next.

What was really interesting with our retention survey was that 61% of the answers were from dormant users, or users who love Hotjar but just aren’t using it right now and will reactivate at another point.

That was a big surprise for us, and it shows how much potential we have in helping our users get more value out of Hotjar on an ongoing basis rather than just project-to-project.

David Darmanin - CEO at Hotjar

6. Ongoing user feedback

  • When: always on
  • Where: in-app everywhere
  • How: Hotjar’s Incoming Feedback

This last one is the catch-all that lets our customers give us feedback whenever and wherever they want. Its real value is how in-the-moment it is—especially when it’s coming from qualified customers.

We know from our own negative experiences using websites that when something annoys us, we don’t want to fill out a form or open a support ticket. We want to let someone know right away. And if we can’t, we leave. Probably upset, and not very likely to recommend the company.


We definitely don’t want our customers to do that, which is why we use Incoming Feedback—a small widget that lets users immediately give feedback at any point in their experience:

Incoming Feedback in action

The responses we’ve gotten from Incoming Feedback have been critical in helping us uncover what we need to fix right away to keep our customers happy.



Practical example: 
During one of our product team’s monthly investigations, they spotted negative feedback from users whose heatmaps were not showing up correctly:

The big black square is where a heatmap should be

This alerted the team to a possible issue. They quickly dug into the other feedback touch points, and found support tickets of people giving negative CSAT scores and complaints in the NPS survey. Watching recordings further confirmed the issue—so they jumped on it to solve it as quickly as possible. 

Our support team also combs through the responses and replies directly to people having issues:

 

Incoming Feedback responses

We also use Incoming Feedback to track the overall user experience, with all its ups and downs:

Incoming Feedback results

Incoming Feedback is really important for us at Hotjar, especially since we’re making changes often. It’s good to move fast and break things often, but you have to allow people to tell you that you’ve broken something.

That's all you need, and you have to act on that. 

David Darmanin - CEO at Hotjar
 

Get started: 5 best questions to ask for effective user feedback

As you've seen so far, we use six methods to measure and improve our users’ experience year-round, but you don't have to use them all right away—in fact, you can get started on a much smaller scale to find out: 


  • what’s bringing people to your website
  • what’s stopping them from converting
  • why some users are converting.

You can find out this critical information by setting up an on-page survey with Hotjar and asking your visitors a handful of carefully chosen questions.

1. How can we make this page better?


Where to ask: any business-critical website page, especially pages with high exit rates

What you’ll get: a list of problems you may not have known you had, with potential solutions suggested directly by your visitors.

2. Where did you first hear about us?


Where to ask: homepage, landing page

What you’ll get: better insight into where customers hear about you that traditional analytics can’t track (especially word-of-mouth referrals or offline campaigns).

3. Why are you looking for [product or service] today?


Where to ask: product pages, pricing page

What you’ll get: a clearer understanding of why customers are interested in your product/service. You also get to collect feedback in your users’ own words that you can re-use in your product marketing copy.

4. What, if anything, is stopping you from [taking action] today?


Where to ask: exit survey when users are leaving your site, or abandoned-cart follow-up email

What you’ll get: an understanding of technical issues in your purchase process and of the objections you need to address better on the page.

5. What persuaded you to [take action] today?


Where to ask: post-purchase survey or on a success page

What you’ll get: your unique selling points (USPs) in your users’ words when the experience is fresh in their minds.

🏆 Pro tip: the 5 examples above are open-ended questions, which can be answered in depth and allow for original, unique, and potentially lengthy responses. We recommend going through every single response manually (it’s time-intensive, but it really helps you empathize with your customers), and here is a handy guide to help you analyze answers to open-ended questions with a simple spreadsheet.


Start collecting user feedback with Hotjar    

Get the feedback you need to grow your business and make your customers happy


 

* * *

One final thought: 

Tracking the responses from each touch point won't mean much if you don't act on the feedback and insights. Building a truly customer-centric business is not about how much feedback you collect—it's what you do with it that counts. 

