
How our team uses ongoing feedback to design and build a customer-centric product

[Image: graphic representing Hotjar teamwork]
[Image: Hotjar snippet installation]

...except, for a while, the button was not working correctly, and nobody at Hotjar knew about it. The product and design team eventually found the issue while reviewing customer feedback and seeing a few complaints.

Openly inviting all our users and customers to share their feedback is how we keep building a product that works. But even the best feedback is wasted without a process for reviewing, measuring, and actioning it. 

In this article, I take you behind the scenes of the Hotjar product and design team to show you two tools they use to collect feedback and the steps they take to investigate and review it—so you can replicate the process for yourself straight away.

But first:

What is customer feedback for?

Websites and products make people feel things: admiration for the flawless-looking ones, satisfaction from a smooth experience, frustration when things just won’t function, plus a whole host of emotions in between. These reactions are useful indicators of what works, and what doesn’t, for your customers and users, but you won’t find out about any of it unless you give people a chance to tell you.

Enter customer feedback, which allows you to build empathy by having a continuous conversation with your users. To our Product Designer Daniel, collecting and reviewing feedback on an ongoing basis is a crucial aspect of the job:


Now, let's dive deeper into the process.

Step 1: choosing the most appropriate tools for the task

On any given day, our 6-person product and design team collects feedback from several sources to get a clear understanding of what to fix, update, and scale as they keep building Hotjar. 

For this blog post, I'm focusing on two specific website feedback tools the team has recently started using together: the Incoming Feedback widget and the Net Promoter Score® survey. The first allows any Hotjar user to leave in-the-moment feedback about the product; the second only collects answers from people after they’ve been with us for 30 days.

This is how one of our Product Designers, Craig, explains it:     


A) Incoming Feedback for in-the-moment feedback 

Incoming Feedback lets people share their love-to-hate feelings about a page or page element, plus take screenshots and leave comments about it:

On-page feedback widgets like Incoming Feedback help you collect in-the-moment feedback from your visitors

This is a typical example of the feedback that gets collected through the tool:

[Image: neutral Incoming Feedback response in Hotjar]

Our product managers and designers use Incoming Feedback because:

  • It’s a fast, friction-free way for our customers to share their knee-jerk reaction to something they’re experiencing in the moment
  • It gives the team a useful mix of qualitative and quantitative data points: the comments can be read individually to keep a pulse on in-the-moment sentiment, and the overall scores are aggregated and visualized as trends over time
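To illustrate that second point, here is a minimal sketch of how individual sentiment responses could be bucketed into weekly counts for trend-watching. The data shape is a hypothetical example for illustration; Hotjar's actual export format will differ.

```python
from collections import Counter
from datetime import date

# Hypothetical feedback entries as (date, sentiment) pairs -- Hotjar's
# real export format will differ; this only illustrates the idea of
# aggregating individual responses into a trend.
feedback = [
    (date(2018, 6, 4), "love"),
    (date(2018, 6, 5), "hate"),
    (date(2018, 6, 11), "neutral"),
    (date(2018, 6, 12), "love"),
]

# Bucket responses by ISO week number so sentiment can be charted over time
by_week = Counter()
for day, sentiment in feedback:
    week = day.isocalendar()[1]
    by_week[(week, sentiment)] += 1

for (week, sentiment), count in sorted(by_week.items()):
    print(f"week {week}: {sentiment} x{count}")
```

The same grouping works at any granularity (day, week, month) by changing the bucketing key.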

[sidenote: Incoming Feedback also works well outside of a product and design team: for example, if you're a content marketer who wants feedback on your content.]

B) Net Promoter Score® feedback from 30-day-old accounts

We use an NPS survey to ask customers to rate their likelihood of recommending Hotjar to a friend or colleague on a scale from 0 to 10; we also include a follow-up question to understand what else we can do to improve Hotjar:

[Image: Hotjar NPS poll question with a response of 9 to 10 (Promoters)]

Our product managers and designers use the NPS survey because:

  • It asks customers to reflect on their overall experience with our service over a longer period (30 days)
  • The answers help the team see trends over the course of weeks, months, and even years, and keep a pulse on the overall experience of using Hotjar 
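For reference, the NPS metric itself is derived with the standard formula: the percentage of promoters (scores of 9–10) minus the percentage of detractors (scores of 0–6). This short sketch shows the arithmetic; it is an illustration, not Hotjar code.

```python
def nps(scores):
    """Compute a Net Promoter Score from 0-10 survey responses.

    Standard NPS formula: % promoters (9-10) minus % detractors (0-6);
    passives (7-8) count toward the total but toward neither group.
    """
    if not scores:
        raise ValueError("no responses to score")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 5 promoters, 3 passives, 2 detractors out of 10 responses
print(nps([10, 9, 9, 10, 9, 7, 8, 7, 3, 0]))  # → 30
```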

Why Incoming Feedback + NPS survey together?

For a Product Manager, it is really important to spot small but meaningful opportunities for improvement. When used together, Incoming Feedback and an NPS survey tool are a powerful combo that helps our team do exactly that:


Reviewing feedback from different sources at once also adds context and clarity to the feedback itself. For example, a few people have previously used Incoming Feedback to submit a ‘hate’ emotion about our registration page without explaining why:

[Image: negative Incoming Feedback response]

This NPS score of 0 (...ouch) from another user sheds light on the problem, making it easier to decipher. It turns out our password criteria are too stringent and might end up annoying people:

[Image: NPS answer]

...we’ll get that fixed asap.


After the tools are set up, and feedback is being collected, we're ready for the next step: it’s review time.

Step 2: running a monthly feedback investigation

Every month, a member of the product and design team gets nominated as an investigator and spends a full morning going through Incoming Feedback and the NPS survey results, looking at data collected over the last 30 days.

The goal is aggregating feedback, spotting patterns in what people are saying, and getting ready to present any relevant findings to the entire product and design team.

Here are the actions they take, in order:

1. Begin the investigation with the negative feedback in Incoming Feedback
The investigator starts by filtering all available Incoming Feedback results from the last 30 days by ‘hate’. They read the comments one by one and add notes about recurring issues, bugs, broken elements, and even comments that don’t have a clear explanation yet to a dedicated Trello board (more on this below).

Typical examples of ‘hate’ feedback:
- “I can't recover my password. I don't receive the email"
- “Not allowing (me) to save the current survey and create something else...disappointing”
- "I cannot sign up and it doesn't tell me why"

[Image: negative Incoming Feedback responses in Hotjar]

2. Continue with neutral feedback in Incoming Feedback
Once all the ‘hate’ feedback is reviewed, the investigator moves on to ‘dislike’ and ‘neutral’, looking for subtle changes that could improve people’s experience.

Typical examples of ‘dislike’ and ‘neutral’ feedback:
- “No Español Language”
- “I can't log in right now. There is a javascript error in the console.”
- "For the last few weeks the preview for polls hasn't been displaying for me on Chrome"

[Image: neutral Incoming Feedback responses in Hotjar]

3. Finish the Incoming Feedback investigation with the positive feedback 
After uncovering obvious and urgent pain points, the investigator looks through the positive ‘happy’ and ‘love’ comments to get an understanding of where the team is succeeding. We don’t want to have people say “we love this aspect of the product” and then go ahead and change it to the opposite ;)

Typical examples of ‘happy’ and ‘love’ feedback:
- “Thanks a bunch for making the heatmaps shareable!”
- “easy setup, great experience”
- "so far so great - love the UI - very intuitive"

[Image: positive Incoming Feedback responses in Hotjar]

4. Move on to NPS, and start again from the detractor/passive feedback 
When the Incoming Feedback investigation is over, the process is repeated for the NPS survey results. Reviewing low scores first (detractor scores between 0 and 6, and passive scores between 7 and 8) lets the investigator see whether any new product- or functionality-related issues have appeared in the past 30 days, and/or whether existing trends are being reinforced.

Typical examples of detractor feedback:  
- "Please offer translations for other languages"
- "I'd like (more) insights into action steps I could take based on heatmap results. Targeted advice would be really helpful."
- "less complicated password rules. Too complex to follow your password rules. Let me enter what I want"

[Image: NPS answer]

5. Complete the investigation with NPS promoters
Finally, the investigator looks at the NPS promoters who gave a score of 9 or 10, again trying to spot positive trends or opportunities to ‘wow’ our users. This is an excellent way to round off the investigation—after going through all the negative feedback, our team likes to finish on a happy note!

Typical examples of promoter feedback:
- "The insights into user experience provided by the recordings and heatmaps have helped us improve the layout of our website"
- "We get so much invaluable data from HotJar, and being able to solicit user feedback from in-app surveys without engineering effort is crucial."
- "This tool closes deals by itself!"

[Image: NPS answers with a score of 10]
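The Incoming Feedback portion of the review order above (‘hate’ first, ‘love’ last, limited to the last 30 days) can be sketched as a simple filter-and-sort. The field names and in-memory data shape here are illustrative assumptions, not Hotjar's actual export format.

```python
from datetime import date, timedelta

# Review the most negative feedback first and the most positive last,
# mirroring the investigation order described above. Field names and
# data shape are illustrative, not Hotjar's actual export schema.
REVIEW_ORDER = ["hate", "dislike", "neutral", "happy", "love"]

def investigation_queue(entries, today, window_days=30):
    """Return the last N days of feedback, sorted negative-first."""
    cutoff = today - timedelta(days=window_days)
    recent = [e for e in entries if e["date"] >= cutoff]
    return sorted(recent, key=lambda e: REVIEW_ORDER.index(e["sentiment"]))

entries = [
    {"date": date(2018, 6, 20), "sentiment": "love", "comment": "easy setup"},
    {"date": date(2018, 6, 22), "sentiment": "hate", "comment": "I cannot sign up"},
    {"date": date(2018, 4, 1), "sentiment": "hate", "comment": "outside the window"},
]

queue = investigation_queue(entries, today=date(2018, 6, 30))
print([e["comment"] for e in queue])
```

The same negative-first ordering applies to the NPS half of the review, using score buckets (0–6, 7–8, 9–10) instead of sentiment labels.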

Step 3: running a monthly feedback review meeting

After the investigation is complete, the entire product and design team has a meeting to go through the feedback. This is them, getting ready (I should have probably mentioned we are a 100% remote team):

[Image: the Hotjar design team on a video call]

They all work together from this 7-column Trello board that contains the following headings (from left to right): Ice Box, Issues in NPS/IF, To Do, Won't do, Themes, Random Thoughts, Actioned.

[Image: the Incoming Feedback/NPS investigation Trello board]

This is how the session unfolds:

1. Prior to the meeting, the investigator adds cards to the ‘Issues in NPS/IF’ column.
When the meeting starts, the investigator opens the discussion by providing some context to the first card in the column.

2. The card gets moved to the ‘To do’ list if the team agrees that it needs to be actioned; the list will eventually include both long-term actions like ‘move this to the next planning session’ and urgent tasks like ‘create a bug fix immediately after this call’.
An owner gets assigned to the card and becomes responsible for its execution.

3. Cards that are important but not an immediate priority get added to the ‘Ice box’ list and temporarily frozen, as the name suggests.
A good example: in June 2018, somebody asked us to ‘adjust notification frequency to allow for real-time notifications’, a great idea for a future release, but not something the team can work on right now.
[Image: NPS/IF investigation Trello board headers (1)]

4. When cards present ideas that are out of scope or even go against existing regulations (say, a feature that is not GDPR-compliant), they get moved to the ‘Won’t do’ list.

5. When high-level themes and broader trends become apparent, they get added to the ‘Themes’ column for future review; similarly, new ideas and crazy schemes that inevitably pop up during the discussion get added to the ‘Random thoughts’ list.

6. At the end of the session, the team agrees on a list of actionable items they will add to their backlog. Once completed, these cards will be moved to the ‘Actioned’ column, which works as an archive of completed work.

[Image: NPS/IF investigation Trello board headers (2)]

Now, it’s your turn

If you haven't done this already, the first quick win for you is to set up Incoming Feedback on your website (it’s free) to start collecting customer feedback right now.

Once you have it up and running, follow the process I detailed above:

→ have someone from the team volunteer to run an investigation (remember to start from the ‘hate’ feedback and work your way up to the ‘love’)

→ make sure the investigator adds their findings to a Trello board (or a spreadsheet, slide deck, or whatever you prefer to use)

→ set up a team-wide meeting

→ discuss your findings and agree on which items to action

→ repeat on a regular basis.

When you get the hang of it, pair Incoming Feedback with an NPS survey so you can deepen the investigation by matching the results together.

Start collecting customer feedback today

Set up Incoming Feedback (it's free!) to see what people love and hate about your website, spot issues, and find new opportunities for growth.

[Image: customer onboarding poll]

PS: if you have any questions, reach out in the comments and one of our product designers will get in touch :)

Net Promoter, Net Promoter System, Net Promoter Score, NPS and the NPS-related emoticons are registered trademarks of Bain & Company, Inc., Fred Reichheld and Satmetrix Systems, Inc.
