
How our team uses ongoing feedback to design and build a customer-centric product

See the ‘copy to clipboard’ button? It helps thousands of new users a month copy the Hotjar code snippet and get started with our product.

Last updated: 18 Aug 2022

...except, for a while, the button wasn’t working correctly, and nobody at Hotjar knew about it. The product and design team eventually spotted the issue while reviewing customer feedback, after a few users complained about it.
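
To make that kind of silent failure concrete: a ‘copy to clipboard’ button typically wraps the browser Clipboard API, which reports failures through a rejected promise. If that rejection is swallowed, the page looks fine while the button does nothing. Here’s a minimal TypeScript sketch of the idea (not Hotjar’s actual implementation; showToast and reportError are hypothetical helpers standing in for whatever notification and error-tracking calls a real app would use):

```typescript
// Minimal sketch of a 'copy to clipboard' handler (not Hotjar's actual code).
async function copySnippet(snippet: string): Promise<void> {
  try {
    await navigator.clipboard.writeText(snippet);
    showToast('Copied to clipboard!');
  } catch (err) {
    // Without this branch, a failure is invisible to both the user and the
    // team: surfacing it turns a silent bug into one you hear about before
    // the complaints pile up in your feedback tools.
    showToast('Copy failed, please copy the snippet manually.');
    reportError('clipboard-copy-failed', err);
  }
}

// Hypothetical helpers standing in for a real app's UI-notification and
// error-tracking calls.
declare function showToast(message: string): void;
declare function reportError(tag: string, err: unknown): void;
```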

Openly inviting all our users and customers to share their feedback is how we keep building a product that works. But even the best feedback is wasted without a process for reviewing, measuring, and actioning it. 

In this article, I take you behind the scenes of the Hotjar product and design team to show you two tools they use to collect feedback and the steps they take to investigate and review it—so you can replicate the process for yourself straight away.

But first:

What is customer feedback for?

Websites and products make people feel things: admiration for the flawless-looking ones, satisfaction from a smooth experience, frustration when things just won’t function, plus a whole host of emotions in between. These reactions are useful indicators of what works, and doesn’t, for your customers and users, but you won’t find out about any of this unless you give people a chance to tell you.

Enter customer feedback, which allows you to build empathy by having a continuous conversation with your users. To our Product Designer Daniel, collecting and reviewing feedback on an ongoing basis is a crucial aspect of the job:

We don’t just throw stuff at people: we want to be in a conversation with them. So we put a design or a feature out there and that’s our conversation starter, but then we need to hear back to close the feedback loop. And then, based on what comes back, we need to speak again. And then another thing comes back, and we keep going. This continuous back and forth is what keeps the relationship going, and helps grow both the product and the business.

Daniel Parascandolo
Product Designer

Now, let's dive deeper into the process.

Step 1: choosing the most appropriate tools for the task

On any given day, our 6-person product and design team collects feedback from several sources to get a clear understanding of what to fix, update, and scale as they keep building Hotjar. 

For this blog post, I'm focusing on two specific website feedback tools the team has recently started using together: the Incoming Feedback widget and the Net Promoter Score® survey. The first allows any Hotjar user to leave in-the-moment feedback about the product; the second only collects answers from people after they’ve been with us for 30 days.

This is how one of our Product Designers, Craig, explains it:

We're looking at this feedback from a product perspective: what bugs or issues do we need to fix? What changes can we make to the interface to alleviate some of the problems users are having? What can we do to improve the sentiment and experience they're having with the product?

Craig Johnson
Product Designer

A) Incoming Feedback for in-the-moment feedback 

Incoming Feedback lets people share their love-to-hate feelings about a page or page element, plus take screenshots and leave comments about it.


Our product managers and designers use Incoming Feedback because:

  • It’s a fast, friction-free way for our customers to share their knee-jerk reaction to something they’re experiencing in the moment

  • It gives the team a useful mix of qualitative and quantitative data points: the comments can be read individually to keep a pulse on in-the-moment sentiment, and the overall scores get aggregated and visualized as trends over time (see the sketch below)
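
To illustrate the quantitative half of that: here’s a minimal sketch of how individual emotion ratings could be rolled up into a weekly trend. The five-point scale and the 1-5 score mapping are assumptions for illustration, not Hotjar’s internal data model:

```typescript
// Sketch: rolling in-the-moment feedback up into a weekly sentiment trend.
// The emotion scale and the 1-5 score mapping are illustrative assumptions.
type Emotion = 'hate' | 'dislike' | 'neutral' | 'happy' | 'love';

interface FeedbackEntry {
  emotion: Emotion;
  comment?: string;   // optional free-text comment left with the rating
  submittedAt: Date;
}

const SCORE: Record<Emotion, number> = {
  hate: 1, dislike: 2, neutral: 3, happy: 4, love: 5,
};

// Key each entry by the Monday of its week, so averages line up per week.
function weekKey(d: Date): string {
  const monday = new Date(d);
  monday.setDate(d.getDate() - ((d.getDay() + 6) % 7)); // back up to Monday
  return monday.toISOString().slice(0, 10);             // e.g. "2022-08-15"
}

// Average emotion score per week: a dip in the resulting trend line is a
// quick signal that a recent change made the experience worse.
function weeklySentiment(entries: FeedbackEntry[]): Map<string, number> {
  const buckets = new Map<string, number[]>();
  for (const e of entries) {
    const key = weekKey(e.submittedAt);
    const bucket = buckets.get(key) ?? [];
    bucket.push(SCORE[e.emotion]);
    buckets.set(key, bucket);
  }
  return new Map(
    [...buckets].map(([week, scores]) => [
      week,
      scores.reduce((a, b) => a + b, 0) / scores.length,
    ])
  );
}
```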

[sidenote: Incoming Feedback also works well outside of a product and design team, for example if you’re a content marketer who wants feedback on your content.]

B) Net Promoter Score® feedback from 30-day-old accounts

We use an NPS survey to ask customers to rate their likelihood of recommending Hotjar to a friend or colleague on a scale from 0 to 10; we also include a follow-up question to understand what else we can do to improve Hotjar.

Our product managers and designers use the NPS survey because:

  • It asks customers to reflect on their overall experience with our service over a longer period (30 days)

  • The answers help the team see trends over the course of weeks, months, and even years, and keep a pulse on the overall experience of using Hotjar 

Why Incoming Feedback + NPS survey together? For a Product Manager, it is really important to spot small but substantial opportunities for improvement. When used together, Incoming Feedback and an NPS software tool are a powerful combo that helps our team do exactly that:   

Using the Net Promoter System and Incoming Feedback together gives us a better feel for little things we can incrementally improve, which people won't necessarily submit as feature requests. Smaller observations like “this is bugging me” or “this hasn’t been working” usually don’t get reported, but they do come through in both types of feedback. So we look at them together to get a sense-check on how people are feeling and what's making them feel like that, in a way we couldn’t get with feature requests or bigger issues.

Sam Bower
Product Manager

Reviewing feedback from different sources at once also adds context and clarity to the feedback itself. For example, a few people have previously used Incoming Feedback to submit a ‘hate’ emotion about our registration page without explaining why.

An NPS rating of 0 (...ouch) from another user sheds light on the problem and makes it easier to decipher: it turns out our password criteria are too stringent and might end up annoying people.

...we’ll get that fixed asap.

***

After the tools are set up, and feedback is being collected, we're ready for the next step: it’s review time.

Step 2: running a monthly feedback investigation

Every month, a member of the product and design team gets nominated as an investigator and spends a full morning going through Incoming Feedback and the NPS survey results, looking at data collected over the last 30 days.

The goal is to aggregate feedback, spot patterns in what people are saying, and prepare to present any relevant findings to the entire product and design team.

Here are the actions they take, in order:

  1. Begin the investigation with the negative feedback in Incoming Feedback

    The investigator starts by filtering all available results from the last 30 days by ‘hate’. They read the comments one by one and add notes about recurring issues, bugs, broken elements, and even comments that don’t have a clear explanation yet to a dedicated Trello board (more on this below; this filtering pass is sketched in code after the list).

Typical examples of ‘hate’ feedback:

  • "I can't recover my password. I don't receive the email"
  • "Not allowing (me) to save the current survey and create something else...disappointing"
  • "I cannot sign up and it doesn't tell me why"

  2. Continue with the ‘dislike’ and ‘neutral’ feedback in Incoming Feedback

    Once all the ‘hate’ feedback is reviewed, the investigator moves on to ‘dislike’ and ‘neutral’, looking for evidence of subtle changes that could nudge people’s experience in a positive direction.

Typical examples of ‘dislike’ and ‘neutral’ feedback:

  • "No Español Language"
  • "I can't log in right now. There is a javascript error in the console."
  • "For the last few weeks the preview for polls hasn't been displaying for me on Chrome"

  3. Finish the Incoming Feedback investigation with the positive feedback

    After uncovering the obvious and urgent pain points, the investigator looks through the positive ‘happy’ and ‘love’ comments to understand where the team is succeeding. We don’t want people to say “we love this aspect of the product” and then go ahead and change it to the opposite ;)

Typical examples of ‘happy’ and ‘love’ feedback:

  • "Thanks a bunch for making the heatmaps shareable!"
  • "easy setup, great experience"
  • "so far so great - love the UI - very intuitive"

  4. Move on to NPS, and start again from the detractor/passive feedback

    When the Incoming Feedback investigation is over, the process is repeated for the NPS survey results. Reviewing low scores first (detractor scores between 0 and 6, and passive scores between 7 and 8) lets the investigator see whether any new product- or functionality-related issues have appeared in the past 30 days, and whether existing trends keep being reinforced (see the sketch after this list for how these scores roll up into a single NPS number).

Typical examples of detractor feedback:

  • "Please offer translations for other languages"
  • "I'd like (more) insights into action steps I could take based on heatmap results. Targeted advice would be really helpful."
  • "less complicated password rules. Too complex to follow your password rules. Let me enter what I want"

  5. Complete the investigation with NPS promoters

    Finally, the investigator looks at the NPS promoters who gave a score of 9 or 10, again trying to spot positive trends or opportunities to ‘wow’ our users. This is an excellent way to round off the investigation: after going through all the negative feedback, our team likes to finish on a happy note!

Typical examples of promoter feedback:

  • "The insights into user experience provided by the recordings and heatmaps have helped us improve the layout of our website"
  • "We get so much invaluable data from HotJar, and being able to solicit user feedback from in-app surveys without engineering effort is crucial."
  • "This tool closes deals by itself!"
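
The score bands in step 4 follow the standard Net Promoter System, and the headline NPS is simply the percentage of promoters minus the percentage of detractors. Here’s a quick sketch of that arithmetic (the function names and data shape are just for illustration):

```typescript
// Standard NPS arithmetic: detractors score 0-6, passives 7-8, promoters 9-10,
// and NPS = %promoters - %detractors (a whole number between -100 and +100).
type NpsCategory = 'detractor' | 'passive' | 'promoter';

function categorize(score: number): NpsCategory {
  if (score <= 6) return 'detractor';
  if (score <= 8) return 'passive';
  return 'promoter';
}

function netPromoterScore(scores: number[]): number {
  const promoters = scores.filter((s) => categorize(s) === 'promoter').length;
  const detractors = scores.filter((s) => categorize(s) === 'detractor').length;
  return Math.round(((promoters - detractors) / scores.length) * 100);
}

// Example: 5 promoters, 3 passives, and 2 detractors out of 10 responses
// gives (5 - 2) / 10 = +30.
netPromoterScore([10, 9, 9, 10, 9, 7, 8, 7, 3, 0]); // 30
```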

📅 A note about frequency: the monthly structure allows our investigator to review hundreds of feedback points in half a day’s work. If we found ourselves with thousands of comments, we would consider A) running investigations every two weeks and/or B) breaking the feedback into manageable chunks (for example, approaching it with a particular problem to solve or a thesis to prove/disprove, and only looking for pertinent information).
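
As promised in step 1, here’s what the mechanical part of the investigation looks like in code: keep the last 30 days of feedback and order it from most to least negative. This reuses the Emotion and FeedbackEntry shapes assumed in the earlier trend sketch:

```typescript
// Review queue for the monthly investigation: only the last 30 days of
// feedback, ordered from 'hate' up to 'love' to match the order the
// investigator works through it. Reuses the Emotion / FeedbackEntry
// shapes assumed in the earlier trend sketch.
const REVIEW_ORDER: readonly Emotion[] = ['hate', 'dislike', 'neutral', 'happy', 'love'];

function investigationQueue(
  entries: FeedbackEntry[],
  now: Date = new Date()
): FeedbackEntry[] {
  const cutoff = new Date(now.getTime() - 30 * 24 * 60 * 60 * 1000);
  return entries
    .filter((e) => e.submittedAt >= cutoff)
    .sort((a, b) => REVIEW_ORDER.indexOf(a.emotion) - REVIEW_ORDER.indexOf(b.emotion));
}
```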

Step 3: running a monthly feedback review meeting

After the investigation is complete, the entire product and design team gets together for a meeting to go through the feedback (I should probably mention we are a 100% remote team, so this happens on a video call).

They all work together from a 7-column Trello board with the following headings (from left to right): Ice Box, Issues in NPS/IF, To Do, Won't do, Themes, Random Thoughts, Actioned. (A minimal data model of this board is sketched after the steps below.)

This is how the session unfolds:

  1. Prior to the meeting, the investigator adds cards to the ‘Issues in NPS/IF’ column. When the meeting starts, the investigator opens the discussion by providing some context for the first card in the column.

  2. The card gets moved to the ‘To do’ list if the team agrees that it needs to be actioned; the list will eventually include both long-term actions like ‘move this to the next planning session’ and urgent tasks like ‘create a bug fix immediately after this call’. An owner gets assigned to the card and becomes responsible for its execution.

  3. Cards that are important but not an immediate priority get added to the ‘Ice box’ list and temporarily frozen, as the name suggests. A good example: in June 2018, somebody asked us to ‘adjust notification frequency to allow for real-time notifications’: a great idea for a future release, but not something the team can work on right now.

  4. When cards present ideas that are out of scope or even go against existing regulations (say, a feature that is not GDPR-compliant), they get moved to the ‘Won’t do’ list.

  5. When high-level themes and broader trends become apparent, they get added to the ‘Themes’ column for future review; similarly, new ideas and crazy schemes that inevitably pop up during the discussion get added to the ‘Random thoughts’ list.

  6. At the end of the session, the team agrees on a list of actionable items they will add to their backlog. Once completed, these cards are moved to the ‘Actioned’ column, which works as an archive of completed work.
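
If you’d rather replicate this board in code (or a spreadsheet) than in Trello, the whole workflow reduces to cards moving between a fixed set of lists. Here’s a minimal sketch of that model; the column names come from our board, but the types themselves are illustrative assumptions:

```typescript
// The review board as a data model. Column names come from the Trello board
// described above; the types themselves are illustrative assumptions.
type Column =
  | 'Ice Box'           // important, but frozen for now
  | 'Issues in NPS/IF'  // raw findings from the monthly investigation
  | 'To Do'             // agreed actions, long-term or urgent
  | "Won't do"          // out of scope or non-compliant ideas
  | 'Themes'            // high-level trends for future review
  | 'Random Thoughts'   // ideas that pop up during the discussion
  | 'Actioned';         // archive of completed work

interface Card {
  title: string;
  column: Column;
  owner?: string;       // assigned once the team agrees to action the card
}

// Moving a card between lists is the whole workflow, e.g. from
// 'Issues in NPS/IF' to 'To Do' once the team agrees it needs actioning.
function moveCard(card: Card, to: Column, owner?: string): Card {
  return { ...card, column: to, owner: owner ?? card.owner };
}
```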

🏆 Pro tip: whenever we review feedback, we know not to get carried away and action it all; we do not launch into developing a new feature every time a few people ask us to. It’s perfectly ok to use the ‘ice box’ system to stay focused on our roadmap priorities, but we do keep a record of the feedback, so we can revisit it in the future if/when needed.

Now, it’s your turn

If you haven't done this already, the first quick win for you is to set up Incoming Feedback on your website (it’s free) to start collecting customer feedback right now.

Once you have it up and running, follow the process I detailed above:

→ have someone from the team volunteer to run an investigation (remember to start from the ‘hate’ feedback and work your way up to the ‘love’)

→ make sure the investigator adds their findings to a Trello board (or a spreadsheet, slide deck, or whatever you prefer to use)

→ set up a team-wide meeting

→ discuss your findings and agree on which items to action

→ repeat on a regular basis.

When you get the hang of it, pair Incoming Feedback with an NPS survey so you can deepen the investigation by matching the results together.

Start collecting user behavior insights today 🔥

Grab a free Hotjar trial and uncover the drivers, barriers, and hooks behind your users’ behavior.
