Jeff Toister — The Service Culture Guide

What is a Good Survey Response Rate?

It's the most common question I get about surveys.

Customer service leaders are understandably eager to collect as much voice-of-customer feedback as they can. So my clients want to know, "What is a good response rate for our customer service survey?"

The answer may surprise you—there's no standard number. 

An 80 percent response rate might be bad in one situation, while a 5 percent response rate might be phenomenal in another.
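To see why, look at the absolute numbers behind each rate. Here's a minimal sketch comparing two hypothetical surveys (the invitation volumes are invented for illustration):

```python
# Response rate alone says little; the absolute number of responses matters.
# The invitation counts below are hypothetical, for illustration only.

surveys = {
    "Small B2B account review": {"invitations": 20, "response_rate": 0.80},
    "High-volume transactional survey": {"invitations": 50_000, "response_rate": 0.05},
}

for name, s in surveys.items():
    responses = round(s["invitations"] * s["response_rate"])
    print(f"{name}: {s['response_rate']:.0%} of {s['invitations']:,} "
          f"invitations = {responses:,} responses")

# Small B2B account review: 80% of 20 invitations = 16 responses
# High-volume transactional survey: 5% of 50,000 invitations = 2,500 responses
```

Sixteen responses may be too few to act on, while 2,500 could fuel meaningful improvements, even though the second survey's rate is far lower.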

In fact, I'm not overly concerned with the percentage of people who respond. My advice to clients is to use a different set of criteria for judging their survey responses.

Here's how to evaluate your own survey response rate the same way I do.

Three Response Rate Criteria

There are three criteria that you can use to determine if you're getting a good response to a customer service survey:

  • Usefulness

  • Representation

  • Reliability

Usefulness is the most important consideration.

Any response rate that provides useful customer feedback is good. That's not to say you can't do even better than your current rate, but the whole purpose of a customer service survey should be to yield useful data.

For example, let's say you implement a contact opt-in feature that allows you to follow up with customers who leave negative feedback. That survey could become tremendously useful if it allows you to contact angry customers, fix problems, and reduce churn.
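As a sketch of how that might work in practice (the response records and field names here are hypothetical), you could build a follow-up queue from low scores left by customers who opted in:

```python
# Build a follow-up queue from survey responses.
# The response records, field names, and threshold are hypothetical.

responses = [
    {"customer": "A. Rivera", "score": 2, "opted_in": True,  "email": "arivera@example.com"},
    {"customer": "B. Chen",   "score": 5, "opted_in": True,  "email": "bchen@example.com"},
    {"customer": "C. Okafor", "score": 1, "opted_in": False, "email": None},
]

NEGATIVE_THRESHOLD = 3  # e.g., treat 1-3 on a 1-5 CSAT scale as negative

follow_up_queue = [
    r for r in responses
    if r["score"] <= NEGATIVE_THRESHOLD and r["opted_in"]
]

for r in follow_up_queue:
    print(f"Contact {r['customer']} at {r['email']} (score: {r['score']})")
```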

Representation is another important way to gauge your response rate.

You want your survey to represent all of the customers you are trying to get feedback from. Imagine you implement a new self-help feature on your website. A representative survey in this case would ask for feedback from customers who successfully used self-help as well as customers who weren't successful and had to try another channel.

Sometimes you need to augment your survey with other data sources to make it more representative. The authors of The Effortless Experience discuss the self-help scenario in their book and suggest having live agents ask customers if they first tried using self-help.

This question can help identify people who didn't realize self-help was available and therefore wouldn't complete a survey on its effectiveness. It could also capture feedback from people who tried self-help, were unsuccessful, and didn't notice a survey invitation because their priority was contacting a live agent to solve the problem.
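One way to check representation is to compare who actually responded against the mix of customers you wanted to hear from. Here's a minimal sketch (the segment labels and counts are hypothetical):

```python
# Compare the respondent mix against the full customer base.
# The segment counts below are hypothetical, for illustration only.

customer_base = {"self-help success": 6_000, "self-help failure": 3_000, "never tried self-help": 1_000}
respondents   = {"self-help success": 450,   "self-help failure": 40,    "never tried self-help": 10}

base_total = sum(customer_base.values())
resp_total = sum(respondents.values())

for segment in customer_base:
    base_share = customer_base[segment] / base_total
    resp_share = respondents[segment] / resp_total
    # Flag segments whose share of respondents is less than half their
    # share of the customer base.
    flag = "  <-- under-represented" if resp_share < 0.5 * base_share else ""
    print(f"{segment}: {base_share:.0%} of customers, {resp_share:.0%} of respondents{flag}")
```

In this invented example, customers who failed at self-help are badly under-represented, which is exactly the feedback gap the agent-asked question is meant to close.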

My final criterion is reliability.

This means the survey can be relied upon to provide consistently accurate results. Here's a summary of some of the considerations from a recent post on the five characteristics of a powerful survey.

  1. Purpose. Have a clear reason for offering your survey.

  2. Format. Choose a format (CSAT, NPS, etc.) that matches your purpose.

  3. Questions. Avoid misleading questions.

Many surveys have problems in one or more of these areas. For instance, a 2016 study by Interaction Metrics discovered that 92 percent of surveys offered by the largest U.S. retailers asked leading questions that nudged customers to give a more positive answer.

For example, Ace Hardware had this question on its survey:

How satisfied were you with the speed of your checkout?

The problem with a question like this is that it assumes the customer was satisfied. This assumptive wording makes a positive answer more likely.

A more neutral question might ask, "How would you rate the speed of your checkout?"
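Sample size also affects reliability: the fewer responses you have, the less you can trust any single score. Here's a minimal sketch of the margin of error around a satisfaction score, using a normal approximation (the response counts are hypothetical):

```python
import math

# Margin of error for a "percent satisfied" score, normal approximation.
# The response counts and 80% satisfaction figure are hypothetical.

def margin_of_error(satisfied: int, total: int, z: float = 1.96) -> float:
    """95% margin of error for a proportion (z = 1.96)."""
    p = satisfied / total
    return z * math.sqrt(p * (1 - p) / total)

for total in (25, 100, 1_000):
    satisfied = round(total * 0.80)  # assume 80% of respondents are satisfied
    moe = margin_of_error(satisfied, total)
    print(f"n={total:>5}: 80% satisfied, +/- {moe:.1%}")

# n=   25: 80% satisfied, +/- 15.7%
# n=  100: 80% satisfied, +/- 7.8%
# n= 1000: 80% satisfied, +/- 2.5%
```

With only 25 responses, an 80 percent score could plausibly be anywhere from the mid-60s to the mid-90s, so month-to-month swings may be noise rather than real change.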


Resources

A survey response rate is good if it generates useful data, is representative of the customer base you want feedback from, and is reliable.

That doesn't mean you shouldn't strive to continuously improve your survey. Here are some resources to help you: