
1 year into NPS: the good, the bad, the ugly of getting our users’ feedback

What changes can we make that will have the most positive impact on our users and customers? Why are some of them leaving and never coming back?

These are two very important questions Hotjar needs to answer to continue to grow, and in order to answer them, we need user feedback.

One way to collect it is to ask the Net Promoter Score® (NPS) question:

“How likely are you to recommend Hotjar to a friend or colleague?” on a scale of zero to 10 (with 10 being extremely likely).


NPS is calculated by subtracting the percentage of respondents who are Detractors from the percentage who are Promoters.
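For illustration, here’s a minimal Python sketch of that calculation (the 0-6 / 7-8 / 9-10 groupings are the standard NPS definitions; the code itself is ours, not Hotjar’s):

```python
def nps(scores):
    """Net Promoter Score from a list of 0-10 ratings: % promoters minus % detractors."""
    promoters = sum(1 for s in scores if s >= 9)   # 9s and 10s
    detractors = sum(1 for s in scores if s <= 6)  # 0 through 6 (7s and 8s are passives)
    return round(100 * (promoters - detractors) / len(scores))

# 5 promoters and 2 detractors out of 10 responses -> 50% - 20% = an NPS of 30
print(nps([10, 10, 9, 9, 9, 8, 7, 7, 6, 3]))  # 30
```

Note that the score can range from -100 (all Detractors) to +100 (all Promoters).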

We like NPS because it gets a ‘gut’ response from people in a way that more complicated surveys don’t.

What follows is the good, the bad, and even the ugly of getting our users’ feedback using NPS.

Part 1: it took us two long years to set our NPS system up

We released the NPS software feature in March of 2015, allowing Hotjar users to ask NPS questions in both on- and off-site surveys:

The email update we sent to our users in March 2015.

As of May 1st, 2018, our users have created a total of 51,797 NPS polls.

And we launched our first poll to continuously survey our users in April of 2017 (before that, we only asked the NPS question periodically in one-off surveys).

What took us so long?

No one was assigned to do it.

That’s the honest answer: there wasn’t a clear ‘owner’ of NPS.

So it fell by the wayside and got pushed down the to-do list by more pressing (and clearly assigned) concerns.

This is a lesson we’ve learned at Hotjar over and over again: if you want something done, someone has to own the task.

Or, as our Director of Demand Generation Jon Malmberg puts it:

“This is what we have learned at Hotjar the hard way: if you don’t have your priorities straight, then things on the back-burner will never happen. Ever. Because there’s always going to be something coming up that you might think has a higher priority.”

It was Jon who finally took the lead and set up our first NPS survey.

Fast-forward to today, and we’ve hired our Director of Customer Experience, Emily Sergent. She owns NPS - in fact, she developed a plan to improve it as part of her pre-hiring task.

Our current global customer Net Promoter Score is:

Which is good.

Especially considering how far we’ve come.

Part 2: NPS is great but it’s just a number...

To be clear, NPS is great for many reasons, but it does not do what it says on the proverbial box.

The way the NPS question is worded makes it seem like a predictor of future behavior. After all, it’s asking “How likely are you to recommend Hotjar to a friend or colleague?”

But studies show that people are really bad at predicting their own future behavior, so NPS doesn’t work that way. Which isn’t to say it isn’t useful.

You just have to keep in mind:

  • It’s just a benchmark. A number. An indication of whether you’re on the right track or not.

  • The value of the score is nonlinear because people’s perceptions of their own happiness are nonlinear. Economists figured this out long ago - it’s called Diminishing Marginal Utility: the more you consume (or use) a single product, the less extra satisfaction each additional use brings. Add to that the wildcard of people who simply don’t believe in giving 10s, or people who are ‘too nice’ to give low scores, and you have a decidedly slippery scale.

But didn’t we just say how useful NPS is? Yes. And it is. As long as you understand its limitations.

NPS may not be a predictor of referrals, but it can help you find good indicators of churn (for example, by analyzing negative responses - more on that in the next section). It helps you see trouble coming, sort of like sonar on a boat: you can spot the iceberg in time to course-correct, both at the individual user level and at the ‘macro’ level of issues that affect many users.

NPS helps you see trouble coming, just like sonar on a boat helps you spot an iceberg on your route. Photo by Danting Zhu on Unsplash.

It helps - a lot - to have a more comprehensive picture than a 0-10 user rating. For that, we decided to ask follow-up questions.

Part 3: so we decided to ask follow-up questions...

Once Jon took the lead to set up our first NPS surveys, he researched which follow-up questions to ask.

He set up our survey to show different follow-up questions depending on how users respond.

Detractors: if users score less than 7, they are considered “detractors,” and we ask: “What can we do to improve Hotjar - and your score?”

Passives (scores of 7 and 8) get the same question.

Promoters, those who score 9 or 10, see this response: “We’re thrilled you feel that way. What’s the main reason for your score?”

And then, to everyone, we ask:

If we could do anything, what could we do to WOW you?

That's because we believe that our product should create WOW moments for our users: going above and beyond what’s expected of us.
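In code terms, the branching is simple. Here’s a hypothetical sketch (the question wording is quoted from above; the function is illustrative, not Hotjar’s actual survey configuration):

```python
def follow_up_question(score: int) -> str:
    """Pick the follow-up question shown after the 0-10 NPS rating."""
    if score >= 9:  # promoters
        return ("We're thrilled you feel that way. "
                "What's the main reason for your score?")
    # detractors (0-6) and passives (7-8) get the same improvement question
    return "What can we do to improve Hotjar - and your score?"

# Everyone, regardless of score, also gets the 'WOW' question:
WOW_QUESTION = "If we could do anything, what could we do to WOW you?"
```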

Finally, we ask if we can follow up or ask more questions via email - but, for a long time, we never really did...


Part 4: but we haven’t been closing the loop

When Hotjar was in Beta and we were deciding what to build and work on next, we sorted all of our user feedback by category in a giant Trello board. This was well before we started using NPS, but we gathered user feedback and it was all there.

One of the Trello boards we used to collect and categorize feedback from users.

We also asked users if we could follow up with them and ask more questions, and every time they said “yes,” we’d note it in a comment on their card. Then, when we shipped a new feature or fixed a bug, we would reach out to those people in particular.

As we grew, we stopped reaching out to our users.

We stopped creating WOW moments.

That’s the bad.

Now that we have Emily Sergent with us to take ownership of improving NPS, we’re coming back to our roots.

Like closing the loop.

For example, a couple of weeks ago, one of our NPS respondents included a special request in their open-ended feedback. They asked if they could have a dinosaur sent back to them.

So we asked our designer, Denis, to sketch out something quick…

This is the response we got back from said NPS respondent:

That’s a WOW moment in our book.

We’re trying to do more of this. Not only to close the loop but to close it in a more personal way. In fact, the entire process should be more personal - as in, coming from a person.

As Emily says,

“NPS surveys should come from an individual and include language indicating that all responses are read (and the team should read them all). Also, I think it’s important to ask the ‘Why?’ follow-up question, which can give you great actionable insights.”

You can’t expect high participation rates if you’re presenting your survey as some automated thing that no one monitors.

But, make it fun, make it personal, and make sure everyone feels heard, and your participation rates will rise - and the feedback you receive will probably improve in terms of quality.

Users want to tell you what they think. But only if you’re listening.

Lesson learned: Close the loop and make it personal (and fun).

Part 5: we had to learn to consider the source

One of the mistakes we made early on was to show our NPS survey to both our free users and our paying customers and lump all of their responses together.

After a few weeks, we realized that our paying customers had markedly different views of the product because they were paying for it. They had higher expectations, they used the product more (and Diminishing Marginal Utility is real), and their scores were about 10 points lower as a result.

That was an easy fix.

We split the NPS survey in two, one for free users and one for paying customers, which made the ratings much more informative.
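Mechanically, the split just means grouping responses by plan type before computing the score. A minimal sketch, with made-up field names and sample data:

```python
from collections import defaultdict

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical records; real data would come from your survey tool's export.
responses = [
    {"plan": "free", "score": 10},
    {"plan": "free", "score": 9},
    {"plan": "free", "score": 7},
    {"plan": "paid", "score": 9},
    {"plan": "paid", "score": 6},
    {"plan": "paid", "score": 8},
]

by_plan = defaultdict(list)
for r in responses:
    by_plan[r["plan"]].append(r["score"])

for plan, scores in sorted(by_plan.items()):
    print(f"{plan}: NPS {nps(scores)}")  # free: NPS 67, paid: NPS 0
```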

Free users average a score of 54.

Paying customers average 44.

Lesson learned: Different populations can have different expectations. Look for trends, and see if it makes sense to split up the answers for a clearer picture.

Part 6: timing is everything

Now for the ugly part.

If you look at our graphs from our first year of using NPS, you might think Hotjar is in trouble.

I thought so too.

Those numbers are bad - but they aren’t indicative of Hotjar’s popularity.

They are indicative of just how bad we were at implementing NPS.

We had the timing all wrong.

Initially, when we set up the NPS survey, we sent it to every single user. At the same time. And a lot of people answered.

And then those users were never shown the survey again.

Then we’d send the NPS survey to new users within the first 24 hours of sign-up (which, we realize now, is insane).

The new users never even had a chance to really experience what Hotjar does before we asked them if they’d recommend us.

That’s like some random person at a cocktail party shaking your hand and immediately asking, “Hey, can you put in a good word for me with your boss?”

No! That’s way too soon. And our numbers showed that.

Predictably, we got a lot of low scores this way. And we found that 20 percent of Detractors and 16 percent of Passives gave their low NPS rating because they felt it was too soon to be asked.

They were right, of course. The big hint was all the responses that came back saying, in one way or another, that it was simply too soon for them to judge.

Lesson learned: Give people enough time to get to know your product before you ask them how they like it.

To correct for this, we decided to show the NPS survey once for each user, two weeks after sign-up.

Better, right?

Not really.

The problem was: we have a 14-day trial. We didn’t know whether those two-week-old users would become paying customers or not, and we’d already found that we needed to separate the responses of those two populations.

And, let’s not forget, after just two weeks they were still all *new* users.

Not the best plan.

So we revamped the setup with a script that shows the survey at least two weeks after sign-up - but only once per user.

Basically, when someone logs into their account, we check the ‘age’ of the account, and if it’s older than 14 days, we trigger the survey. We have two surveys set up: one for free users and one for paying customers.
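As a rough sketch, the trigger logic might look like the following (attribute names like signed_up_at and nps_surveyed are our own inventions for illustration, not Hotjar’s actual schema):

```python
from datetime import datetime, timedelta, timezone

TRIAL_LENGTH = timedelta(days=14)

def maybe_trigger_nps(user):
    """On login, decide whether to show the NPS survey: only once, and only after the trial."""
    if user.nps_surveyed:                 # each user sees the survey once
        return None
    if datetime.now(timezone.utc) - user.signed_up_at < TRIAL_LENGTH:
        return None                       # still inside the 14-day window
    user.nps_surveyed = True
    # separate surveys so free and paying responses aren't lumped together
    return "paying_customer_survey" if user.is_paying else "free_user_survey"
```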

But we still haven’t quite found the sweet spot in terms of timing.

We do know that the first NPS survey needs to deploy after the trial has ended, and we’re thinking of sending subsequent NPS surveys every 6 months after that - because one-and-done does not a benchmark make.

Part 7: other improvements in the works

We have ambitious plans for improving our NPS scores - and the way we handle them.

We’re working on creating a review process with a specific group of people responsible for reading our NPS responses daily, categorizing and segmenting them, and flagging the ones that need replies.

Once a month, we’ll analyze the data for trends, conduct a deeper segmentation analysis to see if certain customer types share the same concerns, share the findings with key stakeholders (i.e., the departments the feedback is about) and, finally, check back to see if our improvements have actually affected the score.

We’re also planning to exploit one of NPS’s best features - its value as a leading indicator of churn - to spot high-value clients who signal they may be close to canceling so we can resolve their issues, and to find those who could use some additional coaching to find success with the product.

And, those NPS responses can also highlight our biggest fans who might be willing to share their positive experiences with a wider audience.

This is where NPS’s value can soar.

Those scores reveal opportunities to help customers in trouble, coach customers who aren’t getting the most from your product, and encourage your fans to become vocal advocates.

Key takeaways

  1. NPS needs an ‘owner’ to manage it.

  2. NPS is a good ‘leading indicator’ of churn, but it does not give you the full picture. Try to include open-ended follow-up questions.

  3. Close the loop and make it personal - not just to delight your customers, but to ensure their success (and spot opportunities).

  4. Different populations may feel differently about your product, so keep an eye out for population-specific trends.

  5. Timing matters. Give users time to experience your product before asking how they like it.

--

Full transparency: idea, structure, data, and recording for this post were provided by the author (Louis Grenier) while the piece was written by our copywriter Lauren.

Net Promoter, Net Promoter System, Net Promoter Score, NPS and the NPS-related emoticons are registered trademarks of Bain & Company, Inc., Fred Reichheld and Satmetrix Systems, Inc.

