How we measure and improve our users’ experience year-round

September 26, 2018 by David Peralta

When we first launched Hotjar, success was anything but guaranteed.

We were completely unknown, didn’t have a cent of outside funding, and were facing some well-known competitors.

Four years later, we’ve gone from €0 to €15 million in Annual Recurring Revenue and Hotjar has been installed on over 670,000 sites. 


A huge factor in reaching this level of growth has been a strong belief that whoever gets closer to the customer wins.


In this article, we want to share with you the exact methods we use to stay in touch with our customers year-round at Hotjar, and how we use that feedback to constantly improve their user experience.


✏️Note: even though we are a Software-as-a-Service (SaaS) company, the techniques we cover in this post are highly applicable to e-commerce and lead generation businesses.


It starts with obsessing about our users

For us, obsessing about our users isn’t just an empty phrase.

It’s what drives everything we do at Hotjar.

It’s what leads our product team to go through every single piece of incoming customer feedback on a monthly basis to spot trends and make improvements.


It’s why we chose to double down on our commitment to privacy and allow our users to anonymize Personally Identifiable Information (PII) in their Recordings.


It’s why our CEO David Darmanin sat down with a glass of wine and a box of Belgian chocolate to read and analyze over 3,000 open-ended responses to our NPS survey in one evening.


And it’s what led our Director of Customer Experience to take a customer’s request for a hand-drawn dinosaur seriously and send this reply:

[Image: the reply, complete with a hand-drawn dinosaur]

It’s a huge investment of time, for sure. But it’s the most important one we could possibly make.


So what follows are the exact steps we take at Hotjar to get as close as possible to our customers to WOW them and encourage them to become our biggest evangelists.

The 6 ways we collect user feedback at Hotjar

It breaks down into three stages with six touchpoints total:

[Image: the three stages and six feedback touchpoints]

The Proactive touchpoints are where we go out of our way to find out how our users are doing early on in their experience with Hotjar and discover what we could be doing better.


The Reactive touchpoints are triggered by a specific interaction with us, such as a support ticket or when someone downgrades their account.


Finally, On Demand is how we give our users the opportunity to reach out at any point in their experience with us.


In this article, we’ll walk through all six touchpoints, starting with the Proactive ones.


For all three stages, our main focus here is on the qualitative side of the experience (i.e., what our customers’ experiences, WOW moments, and frustrations were, in their own words) as opposed to the quantitative side (product usage, how long users stay with us, etc.).


Let’s start from the top:

1. Point-of-Conversion Survey

  • When: Within seconds of purchase
  • Where: On page
  • How: Hotjar script tag or page-specific poll

We show the post-purchase survey seconds after our users sign up for a paid plan:

[Image: our point-of-conversion survey question flow]

This is the moment when the credit card is still in their hand and the experience is as fresh as possible.


It helps us uncover direct insights for improving the experience. And since our customer just converted, these responses are as qualified as you can get.


And, most powerfully, we then use Hotjar to view recordings of people who had a bad experience.

Watching session recordings of users who reported a negative experience is one of the ultimate forms of customer empathy.

The point-of-conversion survey allows us to uncover the exact negative experiences that almost stopped our customers from purchasing. By fixing these issues, we’re able to make the experience smoother and more positive for everyone else.

 

David Darmanin - CEO at Hotjar

✏️NOTE: For more info on the exact questions we ask and how you can set up your own point-of-conversion survey, check out this post.

2. Customer Effort Score (CES)

  • When: 1-2 weeks after signup, once the customer has started using key features
  • Where: On page
  • How: Hotjar script tag or page-specific poll

Customer Effort Score (CES) measures how much effort it took a customer to complete a task, on a scale of 1-5 or 1-7.


CES is often used to measure support interactions, but we use it slightly differently at Hotjar.


The very first time that users create a new Heatmap, Recording, Poll, or Survey, we ask them:

[Image: our CES survey question]


We set the poll to show up just once, whether or not they answer.


Here are the results from our CES:

[Image: our CES survey results]

1 means customers feel Hotjar is difficult to use; 7 means they feel it’s easy to use.

At first glance, the results look good, right?


Nearly half of our customers (47%) gave us the highest score possible.


But here’s how we look at it:

[Image: the same CES results, grouped into score ranges]


Almost 50% of our customers gave us a 4, 5, or 6. That’s half of our customers not having an optimal experience.


Now you might ask, “wait a minute, why aren’t you looking at the 1s, 2s, and 3s?”


Because they only make up around 3% of responses. That means we’re doing a pretty good job of avoiding truly bad experiences, and the overall impact of helping that small group would be minimal compared to bumping more people up to a 6 or a 7.
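If you want to run the same analysis on your own CES responses, the bucketing boils down to comparing the share of low, middle, and top scores. Here’s a minimal sketch in Python (the response values are made up for illustration, not our real data):

```python
from collections import Counter

# CES responses on a 1-7 scale (illustrative values, not our real data)
responses = [7, 6, 5, 7, 4, 7, 6, 3, 7, 5, 6, 7, 2, 6, 7]

counts = Counter(responses)
total = len(responses)

# Compare the share of low (1-3), middle (4-6), and top (7) scores
buckets = {"1-3": [1, 2, 3], "4-6": [4, 5, 6], "7": [7]}
for label, scores in buckets.items():
    share = sum(counts[s] for s in scores) / total * 100
    print(f"{label}: {share:.0f}% of responses")
```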


Again, session recordings play a super valuable role here. For example, we can watch sessions where a user said using Heatmaps was difficult:

[Image: a recording from a user who said Heatmaps were difficult to use]

We’ve also gotten feedback through our CES survey that has allowed us to build an even better Hotjar:

[Image: actual feedback we've gotten from our users via our CES surveys]

At one point, we noticed a trend across lower CES scores that caught our attention: quite a number of our customers were complaining about their experience with Hotjar’s Recordings.


So our product team started holding 1-on-1 interviews with these customers. That helped our team map out the experience a typical customer might have while using Recordings:

[Image: a map of the typical customer experience with Recordings]

That’s when we realized there were more potentially negative experiences than positive ones with Recordings.

 

So we added better filtering and segmentation of our users’ recordings as a direct result – all thanks to the responses from our CES survey.

3. Net Promoter Score® (NPS)

  • When: 30 days after signing up for a free trial and 15 days after becoming a paid user, repeated every year
  • Where: Hotjar dashboard
  • How: Hotjar poll/script tag/customer attribute

After our customers have converted, been onboarded, and started using the product more and more, we dive deeper into the experience using a Net Promoter Score (NPS®) survey.

[Image: our NPS survey question flow]


It’s a great way to keep a pulse on what our customers love about Hotjar and what needs to be improved.


The goal with NPS is to learn: will our customers actually recommend us? And if not, what’s holding them back?


Since referrals and word-of-mouth account for 40% of new signups for us, getting the answers to these questions is essential to our success.


We calculate the score and process the responses once a month, then present the results to the whole team.

[Image: our monthly NPS results]
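For reference, the score itself is the standard NPS calculation: the percentage of promoters (ratings of 9-10) minus the percentage of detractors (0-6). A minimal sketch in Python, with made-up ratings:

```python
def nps(ratings):
    """Standard NPS: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return (promoters - detractors) / len(ratings) * 100

# Made-up ratings for illustration: 5 promoters, 2 detractors -> NPS of 30
print(nps([10, 9, 8, 10, 7, 6, 9, 10, 3, 8]))  # 30.0
```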


But for us, the most valuable part of NPS isn’t the score.


It’s the responses we get to the follow-up questions:

[Image: open-ended follow-up responses to our NPS survey]


That’s where we learn which expectations people have of Hotjar that we aren’t meeting, and how we can meet them. We hear from people who want to love Hotjar, but something’s getting in their way.


In fact, here we have two customers literally telling us what we should do to make them love us:

[Image: two NPS responses telling us exactly what to improve]


That kind of feedback is absolute gold for building a customer-centric product.


And the NPS results definitely influence our day-to-day decisions...


For example, our CEO David Darmanin recently had a call with our product team about the plan to update Recordings. He wasn’t happy with it because it was getting too complicated, so he pointed to our NPS results and said:

“Simplicity and usability are among our main selling points. We cannot make this complicated. If it’s not dead simple, then we’re moving in the wrong direction.”

As always, Recordings provide a priceless way to view the actual experience people are having, which in our minds is the ultimate form of empathy with our users.


Out of the first three touchpoints, if there’s only one you can do, it’s NPS. It’s the most qualified feedback, coming from the people who have stuck around, and it really helps you understand what to improve to increase your referrals. This is really, truly the most interesting one.

David Darmanin - CEO at Hotjar


✏️NOTE: We wrote an entire post on how to create an NPS survey for your site.

How do we analyze all the responses we get?

Since all of these surveys contain several open-ended questions, they generate a lot of answers that we need to go through.  


So how do we analyze all those replies?


By hand!


(Well, in Excel, to be more accurate.)


We see manually going through every response as just part of the process of getting as close to our customers as possible. It’s not something we would ever recommend automating or outsourcing.

Luckily, we recently published a post about exactly how we analyze and review open-ended questions at Hotjar. (And don’t forget the chocolate!)
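To be clear, the reading itself stays manual; the only part a script (or a pivot table) helps with is the counting, once every response has been read and tagged by hand. A minimal sketch of that counting step in Python, with hypothetical tags rather than our actual categories:

```python
from collections import Counter

# Each response gets one or more tags assigned by hand while reading
# (tag names here are hypothetical examples)
tagged_responses = [
    {"text": "…", "tags": ["pricing"]},
    {"text": "…", "tags": ["recordings", "filtering"]},
    {"text": "…", "tags": ["recordings"]},
]

# Count how often each theme comes up, most common first
theme_counts = Counter(tag for r in tagged_responses for tag in r["tags"])
for theme, count in theme_counts.most_common():
    print(theme, count)
```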

4. Customer Satisfaction Survey (CSAT)

  • When: When closing a customer support ticket or at the end of a customer success call
  • Where: Email
  • How: Zendesk and GoToWebinar add-ons

At Hotjar, we’ve always believed that the service we provide on top of the product we sell is just as important as the product itself.


Maybe even more important.

Because with a digital product the service is the only human element you have.


That’s why we strive to create as many WOW moments during our customer interactions as possible:

[Images: examples of WOW moments from our customer interactions]

And since these positive interactions often lead to word-of-mouth referrals, the effort really pays off.


Which is why it’s critical that we understand how well our service interactions with our customers are going. And for that, we use a CSAT survey.


24 hours after a support ticket has been solved (or after a customer success call), we send an email via Zendesk or GoToWebinar asking our customers to rate their experience with our support team:

[Image: our CSAT survey flow]

We track the score on a weekly basis and monitor actual customer responses that stand out:

 

[Image: a slide from the Customer Experience team’s weekly presentation to the rest of the Hotjar team]

As with the previous touchpoints, it’s not about the score so much as the follow-up responses. And for CSAT, we make sure to connect it to the feedback we’re getting from the other touchpoints.


For example:


Our average response time might hit 25 hours or more in a given week. Not terrible, but certainly not great. But the CSAT score for the week could still be 98%. And since our Net Promoter Score® isn’t pointing to any big problems with response time either, we know we don’t need to double down on improving it at the moment.
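For context, a CSAT percentage like that is simply the share of rated interactions marked “good” (Zendesk-style CSAT uses a binary good/bad rating). A quick sketch of the arithmetic, with made-up numbers:

```python
def csat(good, bad):
    """CSAT as the percentage of rated interactions marked 'good'."""
    return good / (good + bad) * 100

# e.g. 49 'good' ratings and 1 'bad' rating in a week -> 98.0
print(csat(49, 1))
```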

Understanding these customer satisfaction metrics also allows us to discover where we need to improve things. What are the bottlenecks? What are the barriers on our side that are keeping us from providing an even better customer support experience?

Keeping a finger on the pulse of our customer interactions using CSAT lets us track whether the changes we’re making are having a positive impact or not.

 

David Darmanin - CEO at Hotjar

Pro Tip: We also use the CSAT format on our help articles and blog posts. At the end of the article, there's a yes or no question asking if the article was helpful:


[Image: the yes/no helpfulness question at the end of a help article]

If you click "No," it loads up a poll that says, "Sorry you didn't find what you're looking for. How can we make this Help section useful?" 

It’s great for optimizing our help documentation and encouraging our users to self-serve as much as possible.

5. Retention Survey

  • When: Immediately when a customer downgrades
  • Where: Hotjar dashboard
  • How: Embedded survey

Having a retention survey (aka a survey that asks why a customer downgraded their paid plan) is something we’ve found to be critical to improving our users’ experience.


[Image: the survey we show to all customers who downgrade]

In fact, we’ve learned so much from this survey that we made filling it out a requirement before a customer can downgrade.


The results are so valuable because there are times when the number of customers downgrading goes up or down, and without this survey, we would have no clue why.


And it’s powerful to uncover the reasons in our customers' own words:

[Image: actual customer feedback we've received from our retention survey]

In the above responses, our customers are telling us exactly how they’re using Hotjar in the real world and what we need to do to make it more likely that they’ll stick around longer.


We also make sure to ask downgrading customers whether they are likely to upgrade again in the future:

[Image: the follow-up question asking whether they’re likely to upgrade again]

This lets our product team give more weight to feedback from people who are likely to come back. It also plays a big part in defining our product roadmap and deciding on what we’ll build next.
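One simple way to apply that weighting is to tally downgrade reasons separately for the customers who say they’re likely to come back. A minimal sketch in Python, with hypothetical field names and reason labels (not our actual survey schema):

```python
from collections import Counter

# Retention survey responses (illustrative; fields and reasons are hypothetical)
responses = [
    {"reason": "project ended", "likely_to_return": True},
    {"reason": "too expensive", "likely_to_return": False},
    {"reason": "project ended", "likely_to_return": True},
    {"reason": "missing feature", "likely_to_return": True},
]

# Reasons from customers likely to come back carry the most weight
# when deciding what to build next
weighted = Counter(r["reason"] for r in responses if r["likely_to_return"])
print(weighted.most_common())  # [('project ended', 2), ('missing feature', 1)]
```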

In our case, what was really interesting with the retention survey was that 61% of the answers were from dormant users, or users who love Hotjar but just aren’t using it right now and will reactivate at another point.

That was a big surprise for us, and it shows what huge potential we have to help our users get more value out of Hotjar on an ongoing basis, rather than just project-to-project.

 

David Darmanin - CEO at Hotjar

6. Ongoing User Feedback

  • When: Always on
  • Where: In-app everywhere
  • How: Hotjar’s Incoming Feedback

This last one is the catch-all. It lets our customers give us feedback whenever and wherever they want, on demand.


The real value of this type of feedback is how in-the-moment it is—especially when it’s coming from qualified customers.

After all, we know from our own negative experiences using websites that when something pisses us off, we don’t want to fill out a form or open a support ticket. We want to let someone know right away.


And if we can’t, we leave. Probably upset and not very likely to recommend the company.


We definitely don’t want our customers doing that, which is why we use Incoming Feedback – a small widget that lets users immediately give feedback at any point in their experience:

 

[Image: Incoming Feedback in action]

The responses we’ve gotten from Incoming Feedback have been critical in helping us uncover what we need to fix right away to keep our customers happy.


For example, during one of our product team’s monthly investigations, they spotted feedback from users having problems with Heatmaps not showing up:

 

[Images: Incoming Feedback reports of Heatmaps not displaying]

The big black square is where a heatmap should be.

So they dug into the other feedback touchpoints to see what was going on. They found support tickets with negative CSAT scores, as well as complaints in our NPS survey.


Luckily, they were able to respond quickly because they had clear examples from Incoming Feedback (and Recordings) showing the exact problem users were having.


Our support team also combs through the responses and replies directly to people having issues:

 

[Image: our support team replying to Incoming Feedback responses]

We also use Incoming Feedback to track the overall user experience, with all its ups and downs:

 

[Image: Incoming Feedback results]

Incoming Feedback is really important for us at Hotjar, especially since we’re making changes often. It’s good to move fast and break things often, but you have to allow people to tell you that you’ve broken something.

That's all you need, and you have to act on that.

 

David Darmanin - CEO at Hotjar

In the end, you have to care deeply about your users

So, to recap…


The six methods we use at Hotjar to measure and improve our users’ experience, year-round, are:


  1. Point-of-Conversion Survey
  2. Customer Effort Score (CES)
  3. Net Promoter Score (NPS)
  4. Customer Satisfaction Survey (CSAT)
  5. Retention Survey
  6. Ongoing User Feedback

But tracking the responses from every touchpoint wouldn’t mean anything if we didn’t care enough about our users to act on the feedback and insights we collect.

 


(Net Promoter, Net Promoter System, Net Promoter Score, NPS and the NPS-related emoticons are registered trademarks of Bain & Company, Inc., Fred Reichheld and Satmetrix Systems, Inc.)


David Peralta

As Hotjar's Outreach Marketer, David is obsessed with helping others succeed by putting people first. He also loves a good walk in the Redwood grove near his home in Mendocino County, CA.
