Last updated Oct 20 2020

⚡ Ecommerce surveys

When your web analytics doesn’t tell the whole story - Spencer Wong

Spencer Wong, Head of Digital Experience at MADE.COM, explains how he used on-site surveys to go beyond MADE’s web analytics and uncover the real reason behind a low-converting product.

What Spencer covers:

  • Why web analytics data can be misleading on its own
  • How surveys complement web analytics to give you greater insight
  • Why you should collect data from both prospective and current customers


[Transcript]

Hi everybody, my name is Spencer Wong, I am Head of Digital Experience at MADE.COM. MADE.COM is an online designer furniture brand. We're based here in London and operating across the UK and Europe.

My lightning talk today will go through a very specific example, but hopefully some of the learnings are applicable more broadly. A common challenge we have is analyzing existing website functionality. This can be something that was implemented before you joined, so you don't know what research was done at the time, but it's very common to be asked about.

In my case, I was asked how Click & Collect was performing across the UK and France. Click & Collect is for small items that you can have delivered to a Click & Collect network (not our showrooms, but a network of shops where you can pick the order up after work and things like that). So it's convenient when not everybody's locked down. I was also asked whether we should offer it in other markets, so I was looking at that as well.

The first way I looked at this was to dive into web analytics and other data insights. I looked at the number of Click & Collect orders we had, the number of products that were eligible, and the funnel. These are just some screenshots of the funnel. I also looked at when we played around with the pricing, making it cheaper, and how that impacted uptake.

The main thing here, which will follow on to the next screen, is just the first part of the funnel, where users are given the option to choose home delivery or Click & Collect. When I looked at this and presented it, I saw very different behavior between the UK and France. You can see that in France, almost half of the users at that particular screen choose the Click & Collect option, compared with a much lower percentage in the UK.
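To make that comparison concrete, here is a minimal sketch in Python with made-up numbers (the session and selection counts are illustrative assumptions, not MADE's actual figures) showing how the share choosing Click & Collect at that funnel step could be computed per market:

```python
# Illustrative sketch (hypothetical numbers): share of sessions choosing
# Click & Collect at the delivery-option step, compared across markets.

# Sessions that reached the step where home delivery vs. Click & Collect is offered.
sessions_at_step = {"UK": 12_000, "FR": 8_000}

# Of those sessions, how many chose the Click & Collect option.
chose_click_and_collect = {"UK": 1_800, "FR": 3_900}

for market, total in sessions_at_step.items():
    share = chose_click_and_collect[market] / total
    print(f"{market}: {share:.1%} chose Click & Collect at the delivery step")
```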

When I presented this I said, “this is the website behavior I'm seeing,” and so on. Then the feedback from the stakeholder meeting was, “there must be something wrong in the UK. It looks like it's underperforming; is there an issue on the website, is it not offered as much, is there an issue with the number of Click & Collect points?”

I was then bombarded with all these additional questions that I couldn't really answer at the time. I was asked to go away and come back with a better understanding of what was wrong in the UK.

At that point I was a little bit frustrated, because I'd already spent a bunch of time looking at this and presented what I was seeing. In addition to the other analytics and deep dives I would have to do, I also wanted to run a poll (a survey), because I wanted to understand whether there was simply a difference between the markets themselves. So I asked a very simple question at the start of the survey: “for a small item, what is your preferred delivery method? Is it delivery to your home or flat, Click & Collect, or your office?” What was interesting is that what plays out in the onsite behavior is also what shows up in the survey.

In France, there's just a much higher preference for Click & Collect. When I started to talk to my French colleagues, they said that people there have less flexibility to work from home and be home for deliveries. France actually showed the highest preference for Click & Collect across all the different markets I surveyed.

The other interesting thing, and what's useful here, is that I can then also project the expected performance for some of these other markets. By relating the survey feedback to what I'm seeing onsite, I can project what I think the opportunity would be in some of those other markets as well.
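Here is a minimal sketch of how such a projection could work, assuming a simple preference-to-behaviour ratio carried over from the markets where Click & Collect is already live; all figures and the ratio approach itself are hypothetical assumptions, not MADE's actual model:

```python
# Illustrative sketch (hypothetical numbers): projecting Click & Collect uptake
# in new markets from survey preference, using the ratio of observed onsite
# uptake to stated preference in markets where the option is already offered.

observed_uptake = {"UK": 0.15, "FR": 0.48}      # share choosing C&C onsite
survey_preference = {"UK": 0.20, "FR": 0.55}    # share preferring C&C in the survey

# Average preference-to-behaviour ratio across the live markets.
ratios = [observed_uptake[m] / survey_preference[m] for m in observed_uptake]
avg_ratio = sum(ratios) / len(ratios)

# Survey preference in markets being considered (again, made-up figures).
new_market_preference = {"DE": 0.35, "NL": 0.30}

for market, pref in new_market_preference.items():
    projected = pref * avg_ratio
    print(f"{market}: projected ~{projected:.0%} of eligible orders via Click & Collect")
```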

Quite useful in that sense.

I did ask follow-on questions to understand why people preferred home delivery or why they preferred Click & Collect. It's not really relevant for this lightning talk, but obviously you can run a multi-part survey.

My quick summary of why I love the onsite survey: for one, it's super easy to set up. Then there's the speed and volume of responses: depending on your website traffic, you can get a lot very quickly; I had thousands of responses within a couple of days. That's much faster than a survey you send out to part of your email base, which takes longer to set up and longer to get responses from.

The onsite survey also captures feedback from both prospects and existing customers. People who may have landed on your site for the first time can answer, whereas an email survey is slightly biased toward people who are already very familiar with your brand.

It also really complements other forms of qualitative feedback. If you're doing user interviews or other kinds of user testing where you're gathering feedback, it helps complete the picture.

And like I said, it helps you understand potential future impact. I used it here to help project some numbers, but you can also use it to evaluate new features. I've used it to evaluate people's interest in financing options, and I've done a survey around sustainability and which areas of sustainability are most important to our customers.

I really love it and I encourage you guys to use onsite surveys wherever you can to help gather this kind of broad feedback really quickly.

Thanks.