The one insight that changed my career – David Darmanin
David Darmanin, CEO at Hotjar, shares the one insight that completely changed his career in design and optimization.
What David covers:
- His progression from small testing to truly unlocking impactful changes
- Why not all users are made equal
- The three types of users you should be aware of (Star Wars reference alert)
The one insight that changed my career - transcript
Hey everyone, David here. Very excited to be doing this lightning learning session today with you all. And yeah, let's kick off.
Today I'm gonna be talking about the one insight that completely changed my career in design and optimization. So if you're hoping that this is going to be something complicated or very sophisticated, you're gonna be disappointed. I'm a big fan of simple, big impact stuff in general.
And this is definitely going to be one of those. But before we start actually talking about that one insight, I do wanna give some background in terms of my evolution and how my career developed. So like many of you, you might identify with this.
In the beginning, when I started doing testing, I found that my focus was mainly around elements: their size, their colors, their position. And later on, I evolved to start focusing more on the layout, the composition of the page; back then multivariate testing was quite popular. And later on, I found that focusing on and learning about copy, persuasion, and accelerating testing velocity really had a big impact on the wins and my success in my role.
But finally, the big jump for me was really focusing on really understanding the user, their experience, how they're behaving, what they care about, and this truly unlocked much bigger wins for me. I was much more successful because of this.
But within that last stage, there is one insight which is really important, and that's what we're gonna zero in on.
And that insight is: not all users are made equal. Now, some of you might be thinking, hold on, this is not a big insight. But what I'm gonna share with you is not exactly what you might think I mean by this.
So here's how I classify users into three groups. The first I call the just browsing Stormtroopers. The second are the undecided Chewbaccas. And finally come the determined Jedis. All right, so these are the three groups, and we're gonna look at what the differences are between these three.
But before we do that, let's start with your sites, right. So your site is receiving all three types of these visitors, right? So they're arriving there, they're interacting with the site and what not. So what happens?
Well, the Jedis, they are the most passionate, determined people, and they are experienced. They know the product you're selling, or they know you, and they're just gonna buy. No matter how difficult it's going to be, they will buy it. And I've seen this in newly launched sites, startups, or sites that just have a horrible experience. They still convert.
However, the just browsing troopers, they're confused. They have no idea what they're doing. They don't even know where they are. They're very inefficient. They're never going to be convinced to buy from your site. No matter what, they're going to exit, they're going to abandon.
But what's interesting is the Chewbaccas. Why? Because some of them are going to convert and some of them are not going to convert. And this is the most interesting part. So on the top, I wrote "start with your customers". Actually, it's about starting with your Chewbaccas, right? So why do I say that? Because ultimately it's very natural to focus on: okay, how do I get these Stormtroopers and Chewbaccas that are exiting to buy? Whereas actually what we wanna focus on is the Chewbaccas that are actually converting. Because if we speak to them, what we're going to get from them is what I'd call qualified objections. Now, why do I say this?
Because if we look at the group of exiting people, there are a lot of Stormtroopers and not a lot of Chewbaccas. It's just the nature of the traffic volumes that we experience on our sites. So if we ask them, we're gonna hear more from the Stormtroopers, and there's no way of distinguishing between the two. Whereas if we speak to our customers, there are more Chewbaccas than there are Jedis, so we're gonna hear more from them. Plus, if we ask our Jedis what stopped them or what their objections were, they're likely to say: nothing, it was easy.
So here's a quick example of what you can do post-conversion with your customers, in the moment. Ask them a really simple, easy question which involves just one click. It can be a yes or a no: was that easy or not? How would you rate our experience? Maybe an NPS survey. Then follow up with a question that is targeted based on that previous answer. If it's a negative response, oh no, show empathy. If it was a good one: what did you like? And finally: what nearly stopped you from becoming a customer?
And by asking that question, this is where you're going to get the Chewbaccas, the undecided Chewbaccas, giving you that feedback.
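To make the flow concrete, here is a minimal sketch of that post-conversion micro-survey as branching logic. This is purely illustrative: the function name, question wording, and step names are assumptions for the example, not Hotjar's product or API.

```python
def next_question(step, previous_answer=None):
    """Return the next survey question for a customer who just converted.

    Flow: one easy one-click question, then a follow-up targeted at the
    previous answer, then the question that surfaces qualified objections.
    """
    if step == "rating":
        # One simple, one-click question right after conversion.
        return "Was that easy or not? (easy / hard)"
    if step == "follow_up":
        if previous_answer == "hard":
            # Negative response: show empathy and dig into what went wrong.
            return "Sorry to hear that! What made it hard?"
        # Positive response: learn what worked.
        return "Great! What did you like about the experience?"
    if step == "objections":
        # The key question for the undecided-but-converted Chewbaccas.
        return "What nearly stopped you from becoming a customer?"
    raise ValueError(f"unknown step: {step}")
```

The point of the structure is that the final "what nearly stopped you" question is only ever asked of people who did convert, which is exactly the group that can give you qualified objections.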
So that's what I had for you today. Hope this was valuable. I'm looking forward to hearing from you.
Why your customer satisfaction surveys are a lie - Els Aerts
Els Aerts from AGConsult shares tips and examples of questions you should or should not ask if you want to get valid survey results.
What Els covers:
- Questions that ask people to predict the future (when they obviously can't)
- Leading questions that may influence survey results
- Why questions biased towards the negative are useful
- Why open-ended questions are always useful
Why your customer satisfaction surveys are a lie - transcript
Hi, I'm Els from AGConsult and I'm here to tell you why your customer satisfaction surveys are a lie, and what you can do about it. In four easy steps.
Number one, don't ask people to predict the future. I mean, that's pretty much asking to be lied to, right?
The classic NPS question does exactly that. It asks people to predict their own future behavior: how likely are you to recommend our product, our company, to a friend or colleague? Don't do that. I hope I'm not telling you anything new when I say nobody can predict the future. So simply don't ask people to.
Number two, don't ask leading questions.
Questions that start off all positive, like "how happy are you?" or "how good was it?", trick people into answering in a more positive way.
Sure, it'll get your customer satisfaction rates up, but it's not really the truth. So instead of asking how happy or how good, counter that positive adjective with its negative counterpart. Always ask: how happy or how unhappy were you? How good or bad? How fast or slow? That way you balance out the positivity with negativity and you get neutral input.
Tip number three, only for the brave, bias towards the negative.
Now, I know I just told you not to ask leading questions and to always balance out your positive adjective (how happy you were) with a negative adjective (how unhappy you were). But when you're brave, I would ask you to bias your questions towards the negative. So not to ask how easy or hard, but to simply ask: how hard was it? How hard was it to complete your reservation?
'Cause that way, you put the user into the mindset of the problems they experienced earlier. It'll be easier for them to recall the points where they struggled, which means they will be able to give you much more useful feedback in the open follow-up question.
Finally, tip number four. Always, always ask an open question as follow up.
Rating scale questions are, well, basically only good for comparison purposes, preferably when you compare them with your own data from past surveys, or to create pretty charts for management. Because really, what does a seven out of 10 tell you? And how is it different from an eight out of 10? It doesn't tell you anything you need to improve.
If you truly want to create a better user experience, then you have to ask open questions. You have to get real input from your users. Preferably, you tailor the open question to the answer given in the rating scale question. For example, if a user scored you a three out of 10, that's not good, right? Acknowledge that, say you're sorry, and ask them what the problem was. That way you acknowledge the user's problem and you gain the input you need to fix it.
If somebody rates you an eight out of 10, say thank you and ask them what you could do to improve to make that a nine or even a 10. Now, if somebody rates you a 10, crack open the champagne and ask them what made it so memorable for them. That way, you get input on lots of different levels. You know what to really fix, what your problems are, you find out the improvement zones, and you get a grasp on what truly makes your customers happy and what you should be doing more of. That way you can truly find out everything you need to know to make your customers happy.
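The tailoring rule above can be sketched as a tiny branching function. The thresholds and question wording here are assumptions based on the examples in the talk (three out of 10 is bad, eight is good, ten is perfect), not a prescribed standard.

```python
def open_follow_up(score):
    """Pick the open follow-up question based on a 0-10 rating score.

    Low scores get an apology plus a what-went-wrong question, good scores
    get a how-to-improve question, and perfect scores get a what-made-it-
    memorable question.
    """
    if not 0 <= score <= 10:
        raise ValueError("score must be between 0 and 10")
    if score <= 6:
        # Acknowledge the problem and ask for the input needed to fix it.
        return "We're sorry to hear that. What was the problem?"
    if score <= 9:
        # Good but not perfect: ask what would push it to a 9 or 10.
        return "Thank you! What could we improve to make that a 9 or a 10?"
    # A perfect 10: find out what to do more of.
    return "Thank you! What made your experience so memorable?"
```

The design choice is simply that every branch ends in an open question, so the rating never stands alone as the only data point.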
Thank you very much.