How to Increase Survey Response Rates by 370%

Andrew Gilliam, ITS Service Desk Consultant

Small changes can often lead to big results.

Andrew Gilliam is an ITS Service Desk Consultant at Western Kentucky University. He improved the response rate to customer service surveys by 370 percent simply by changing the wording of the survey invitation email.

I interviewed Gilliam to learn how he did it. He shares a lot of helpful, actionable advice in this short, 20-minute interview.

Topics we cover include:

  • Why you should survey both internal and external customers

  • What constitutes a "good" response rate

  • How to improve your survey invitation email

  • What types of customers typically complete surveys

  • Why you need feedback from angry, happy, and even neutral customers 

You can watch the full interview here. Follow Gilliam on Twitter at @ndytg or contact him via his website.


I Took Every Survey For a Week. The Results Weren't Good.

Customers are inundated with surveys.

We get them on receipts, via email, and in the mail. Shop somewhere and you're asked to take a survey. Don't shop somewhere, and a survey still appears. Visit a website and, ping, you get asked to take a survey.

I decided to take a week and do a small experiment. During that week, I would take every single survey I was asked to complete. The idea was to test three things:

  1. How many surveys would I be offered?

  2. Were any of the surveys well-designed?

  3. What was the experience like?

I was asked to complete 10 surveys during the week. That pencils out to over 500 surveys per year! No wonder customers experience survey fatigue.

Only one of the 10 surveys was well-designed. Every other survey had at least one glaring flaw, and most had multiple failures. More on that in a moment.

And what was my experience like? Most of the surveys backfired. The experience was so poor it made me like the company even less.

Person filling out a customer service survey to report a negative experience.

Surveys Are Too Difficult

When you ask a customer to take a survey, you're really asking the customer to do you a favor. A lot of the surveys I took made that favor really difficult.

Just accessing the surveys was a big challenge. 

My first survey request was on a receipt from the post office. The receipt had a QR code that I was able to quickly scan with my phone, but then the survey site itself was not optimized for mobile phones.

A survey from Dropbox wanted me to first read and acknowledge a confidentiality agreement before completing its survey.

Confidentiality agreement required to take the Dropbox survey.

The super odd thing was the confidentiality agreement had its own survey! This extra bit of aggravation got even more annoying when the survey required me to fill out the comments box to explain my rating of the confidentiality agreement.

Survey requiring a comment.

Back to the first Dropbox survey: I was 11 minutes in when I hit an infinite loop. None of the answers to a required question applied to me, and there was no “Not Applicable” option. I felt I had put in enough time at that point and just gave up.

The survey invitation from Vons, my local grocery store, was a real piece of work. It was a receipt invitation, but there was no QR code, so I had to manually enter the web address. Then I had to enter a string of numbers along with my email address!

Vons survey invitation page, which requires an email address.

I couldn't complete two surveys due to errors. An email invitation from Chewy linked to a web page that I couldn't get to load. The Human Resources Certification Institute sent me a survey on May 24 that closed on May 23. Completing that survey is pretty low on the list of things I would do if I had access to a time machine.

Poor Survey Design

Beyond being difficult, just one of the ten surveys was designed well enough to provide useful, actionable, and unbiased information.

Many surveys were too long, which often triggers low completion rates. The Dropbox survey advertised it would take 15 minutes. (Who has that kind of time?!) These companies' surveys could easily be redesigned to get better data and higher completion rates from just three questions.

Many were full of leading questions designed to boost scores. This AutoZone survey arranged the rating scale with the positive response first, which is a subtle way to boost ratings. Like many of the surveys I took, it offered no option to leave comments and explain why I gave the ratings I did.

AutoZone customer service survey.

The survey from Vons was an odd choose-your-own-adventure exercise, where I got to decide which topic(s) I wanted to be surveyed on.

Screenshot of multi-part customer service survey from Vons.

This created unnecessary friction and generated a little confusion since my biggest gripe on that particular visit was the large number of aisles blocked off by people stocking shelves. Is that a store issue, an employee issue, or a product issue? It’s a great example of where asking a customer to simply give a rating and then explain the rating would quickly get to the core of my dissatisfaction.

The One Good Example

The best survey was a Net Promoter Score (NPS) survey from Suunto. 

I received this survey invitation about six months after I registered a new watch on the Suunto website. NPS surveys measure a customer's intent to recommend, so giving me six months to use the watch before asking if I'd recommend it allows enough time for me to know what I like and don't like about the product.
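
For context, NPS is typically calculated by grouping 0-10 ratings into promoters (9-10), passives (7-8), and detractors (0-6), then subtracting the percentage of detractors from the percentage of promoters. Here's a minimal sketch of that calculation in Python, using made-up ratings purely for illustration:

  def nps(ratings):
      # Net Promoter Score: % promoters (9-10) minus % detractors (0-6)
      promoters = sum(1 for r in ratings if r >= 9)
      detractors = sum(1 for r in ratings if r <= 6)
      return round(100 * (promoters - detractors) / len(ratings))

  print(nps([10, 9, 8, 6, 10, 7, 3, 9]))  # prints 25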

Another positive was it asked just two questions: a rating and a comment. 

Suunto NPS survey.

Short surveys tend to have much higher completion rates than longer ones. Counterintuitively, you can almost always get more useful data from a short survey than from a long, tedious one. (More on that here.)

My one question about the Suunto survey was whether it was linked to my contact information. That link is necessary so someone from Suunto can follow up with unhappy customers to learn more about the issues they're experiencing. (More on that here.)

Resources to Create Better Surveys

Here are some resources to help you avoid these mistakes and create better surveys.

You can also get step-by-step instructions for creating a great survey program by taking my customer service survey course on LinkedIn Learning.


Report: Why Retail Customer Service is Dropping

A new report from the American Customer Satisfaction Index shows a drop in retail customer satisfaction. From department stores like Nordstrom to specialty stores like Bed Bath & Beyond, customers are less happy than they were a year ago.

How can this be possible in an era where customers are bombarded with survey requests and access to big data is at an all-time high?

The answers have to do with people. How people are staffed, managed, and the duties they are asked to perform all have an impact on customer satisfaction.

You can access the full report or read below to see the highlights and analysis. To kick things off, the chart below shows a comparison in overall satisfaction between 2017 and 2018 on a 100-point scale:

Retail customer satisfaction declined from 2017 to 2018.


Trend #1: Courtesy and Helpfulness of Staff

This one is down across the board.

Courtesy and helpfulness from retail employees has declined.


Staffing levels have a big impact on this category. Retailers routinely understaff stores in an effort to save money, but this leaves the few available employees running ragged trying to serve multiple customers and complete tasks like restocking and merchandising.

Another issue is the surveys that seemingly appear on every retail receipt. These should help retailers detect problems like unfriendly employees. But the dirty secret is many retailers don't actually use those surveys to improve. And many even manipulate the surveys to make the scores look better than they really are.

A 2016 report from Interaction Metrics found that 68 percent of retail customer satisfaction surveys were "total garbage."


Trend #2: Layout and Cleanliness of Store

There's a slight dip in this area.

Stores need to improve the cleanliness and layout.


Part of the challenge is staffing (see Trend #1). Stores struggle to stay clean and organized when there aren't enough employees to do the work.

Another is command structure. Many retail chains make store layout decisions at the corporate level, and don't do enough field testing to ensure the designs actually make sense. Last year, I did a comparison of my local Walgreens, Rite Aid, and CVS and noted important differences in the layout of each store.


Trend #3: Speed of Checkout Process

The checkout process was another area where satisfaction dropped across the board.

Checking out is too slow at retail stores.


Here again staffing plays a role. We've probably all wasted time wandering around a department store, searching for someone to ring us up. And that's precisely why so many people would rather shop online—it's much easier.

Customer satisfaction with speed isn't just about the actual amount of time it takes. People are heavily influenced by perception. So a pleasant experience with a friendly cashier that takes five minutes will feel like a breeze, while an unpleasant experience that also takes five minutes will feel like an eternity.

Retailers could help themselves by studying these factors that influence wait time perception.

Take Action

There are three easy ways retailers can check these trends in their own stores.

Talk to employees. I have no idea why managers don't spend more time doing this. Employees will almost always be forthcoming about the challenges they face if you ask them sincerely.

Walk your stores. Spend time walking through your stores like a customer. You'll often discover unexpected problems that your customers encounter every day.

Use surveys wisely. Customer feedback surveys can be valuable tools, but you should use them wisely or not use them at all. This short video will help you decide why you want to run a survey program.


Why You Need to Analyze Survey Comments

I'm putting the finishing touches on the second edition of my book, Getting Service Right. The book was originally called Service Failure, and I've now updated both the title and some of the research.

The cover is one of the most important sales tools for a book, so I worked with Anne Likes Red to come up with a few designs. I then launched a survey to ask readers for their feedback on three cover options. The survey was up for just a few days, and 135 people responded.

Here were the results:

Option A (28%)

Option B (52%)

Option C (20%)

Picking cover option B should be a no-brainer, right? After all, more than half of all survey respondents picked that option.

Without qualitative information, I might have made that mistake. Fortunately, I also included a comment field in the survey. When you analyze the comments to learn why someone chose a particular option, a new pattern emerges.


Searching for Themes

I recently hosted a webinar with Alyona Medelyan, CEO of the customer insight firm Thematic. Medelyan brought actual client data to reveal some interesting insights that a survey score alone wouldn’t show:

  • A cable company found customers with modem issues were impacting overall NPS by -2 points.

  • Another company discovered one variable that caused customers to spend $140 more per year.

  • An airline learned passengers were 4x angrier about missed connections than delayed flights.

The point Medelyan made is we usually get deeper, more actionable insights when we analyze the comments and not just the scores. So I applied this concept to my book cover survey and found two significant themes contained in the comments.

The first was that quite a few people chose B because they liked the way the subtitle appeared below the title better than the way it was shown in options A and C. So it wasn't just the color that drove people to option B.

The second theme was that quite a few people who selected option B mentioned they liked its title arrangement but preferred the color of option A. There were even a handful who picked B but mentioned they liked the color on option C best.

Suddenly option B isn't such a clear and convincing winner. Here's what happened when I revised the survey results to account for color choice alone:

Option A (40%)

Option B (39%)

Option C (21%)

Now I have two insights:

  • People prefer the blue cover shown in option A

  • People like the title arrangement in option B

Keep in mind I only made adjustments where respondents were explicit in their survey comments. If someone didn't explain why they chose B, they may have done it for the title arrangement, the color, or pure whimsy.
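
To make that adjustment concrete, here's a minimal sketch of the kind of re-tally involved. It assumes each response has been reduced to the option picked plus the color the respondent explicitly preferred in their comment (blank when the comment didn't mention color); the data is purely illustrative, not my actual survey export:

  from collections import Counter

  # (option picked, color explicitly preferred in the comment, "" if not stated)
  responses = [("B", "A"), ("B", ""), ("A", ""), ("B", "C"), ("C", "")]

  color_votes = Counter()
  for picked, stated_color in responses:
      # Count the explicitly stated color when there is one; otherwise assume
      # the pick also reflects the respondent's color preference.
      color_votes[stated_color or picked] += 1

  total = sum(color_votes.values())
  for option, count in color_votes.most_common():
      print(f"Option {option}: {round(100 * count / total)}%")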

Making a Final Decision

I did a similar survey with my last two book covers, and both times I ended up choosing elements from different options. I did the same thing this time.

Going with option B's title arrangement was a pretty easy decision. There were numerous comments describing option B as the preference without any support for the layout of options A and C.

I ultimately chose the blue color from option A. 

Several survey comments mentioned color theory, and my friend Jim even shared this helpful resource from Quick Sprout. According to the guide, the color blue symbolizes tranquility and peace and has more positive associations across various cultures than purple and green.

The kicker is the blue is my personal preference. I really like it, and it's important for an author to really like the cover of their book! Here's the final cover:

It was also important to consider how the cover will look when shown together with my other books on Amazon, in a bookstore, or at a trade show. Here's how it will look displayed next to my other books:

Take Action

You can gain so much more from a survey if you combine the fixed choices (ex: option A, B, or C) with comments. Try analyzing one of your own surveys to see what hidden insight is revealed.

You’ll find a lot of simple analysis techniques in the webinar with Alyona Medelyan from Thematic.

You can also get more help with your survey on this survey resource page.


How to Get Customer Feedback Without a Survey

Advertising disclosure: We are a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for us to earn fees by linking to Amazon.com and affiliated sites.

I frequently use subscriber feedback to improve my Customer Service Tip of the Week email newsletter. Yet I've never used a survey.

Customers are inundated with surveys, so it's important to think carefully before rolling out yet another one. With my newsletter, I've found I can get a lot of useful voice of customer feedback from several alternative sources.

Here are five ways I collect and use voice of customer feedback.

Business people sitting around a conference table analyzing survey data.

Issue Alerts

The weekly email will occasionally have a small issue such as a typo or a broken hyperlink. I try to proofread each email and test all the links, but problems occasionally do happen.

Typos are my kryptonite.

Thankfully, I can count on subscribers to let me know when there is an error. It's usually just a handful of people who email me about the problem, but that's all the feedback I need. Keep in mind most customers won't bother to tell you about small issues, but that doesn't mean they don't notice!

I have a process in place where I can flag a problem and fix it the next time I send out the same tip. In some cases, such as a broken hyperlink, I may re-send the email with the correction, although I try not to do this very often because I don't like swamping people's inboxes with extra emails.

Discussion question: What process do you have in place to allow your frontline agents to resolve or report problems?

 

Investigate Icebergs

A customer service iceberg is an issue that seems small and isolated on the surface, but is actually a sign of a much larger and more dangerous problem that's hidden from view.

Someone recently emailed me to let me know she had tried to sign up for the Customer Service Tip of the Week email, but never received a confirmation. This was a classic iceberg because it was easy to dismiss the problem as a one-off where maybe she just missed the email or the confirmation wound up in a spam folder.

I was tempted to just manually subscribe her to my list, but I decided to investigate. 

My research led me to a helpful exchange with a support agent at MailChimp, the company that powers my newsletter. With his help, I identified a technical setting in my account that would make my emails more recognizable to corporate email servers.

Here comes the kicker—my weekly subscription rate instantly doubled!

Some of those extra subscribers undoubtedly came from a marketing campaign, where I'm promising to send a PDF of my new book to anyone who is subscribed to the email by September 30, 2018.

But some of that huge increase was certainly due to this technical issue. And I never would have found it if I hadn't investigated the iceberg that came from just one email.

Discussion question: What do frontline employees do when they encounter a strange or unusual problem? Are they trained to search for and identify icebergs?

 

Invite Conversation

There are a few books that have absolutely changed the game for me. One was Kevin Kruse's book, Unlimited Clients.

A key piece of advice in the book was to invite conversation with your customers. The first version of the book had Kevin's phone number and email address right on the cover, and I can tell you from experience he actually responded!

So I took Kevin's advice and added a special invitation to the welcome email I sent to new subscribers. 

Excerpt from Customer Service Tip of the Week welcome email.

Subscribers have always been able to reply to any email and send a message directly to my personal email address. However, this invitation substantially increased the number of people who actually emailed me.

It's not everyone. (Thankfully—I don't know if I could keep up!) But a couple times a day I get an email from a new subscriber who tells me a little about themselves.

It helps me learn more about them and I often try to share something helpful in response. I've also learned those subscribers are more likely to share their feedback as they begin to receive the weekly tips.

Discussion Question: How can you invite individual customers to engage in a one-on-one conversation?

 

Catalog Unstructured Data

Something really amazing happens when you take all those individual conversations you have with customers and categorize them.

I went through hundreds of emails from subscribers and categorized the customer service challenges they shared with me. When I decided to put my weekly tips in a book, I put the top ten challenges in a chart and identified tips that could help with each one.

Going through several hundred emails may seem like a lot of work, but it really doesn't take that much time. I probably spent an hour or so. 

It goes even faster if you catalog feedback from individual customers as it comes in. A lot of customer service software platforms have a tagging feature that allows agents to do this on the fly. If your technology won't do it, you can have agents use a spreadsheet or even a piece of paper.
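
If you want something a bit more automated, a few lines of code can do the counting for you. Here's a minimal sketch, assuming you've already recorded the tags assigned to each piece of feedback; the tags below are just examples:

  from collections import Counter

  # Tags an agent assigned to each customer email (illustrative data)
  tagged_feedback = [
      ["angry customers", "long hours"],
      ["angry customers"],
      ["difficult coworker"],
      ["angry customers", "difficult coworker"],
  ]

  tag_counts = Counter(tag for tags in tagged_feedback for tag in tags)
  for tag, count in tag_counts.most_common(10):  # the top ten challenges
      print(f"{tag}: {count}")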

Discussion Question: How can you capture and analyze unstructured data?

 

Be a Customer

I learn a lot by subscribing to my own email.

This was a trick I learned from working in the catalog industry. Catalog companies would mail themselves a copy of each catalog so they could time how long it took to arrive and could verify each catalog arrived in good condition.

Subscribing to my own email allows me to do something similar.

For example, the Customer Service Tip of the Week goes out each Monday at 8:45 am Pacific time. One week, the email didn't arrive as expected. I double-checked the system and discovered I had set that particular email for 8:45 pm.

Oops! Fortunately, I was able to quickly change the send time and the email went out only a few minutes later than normal.

Discussion Question: What can you learn from being your own customer?

 

Take Action

This post is a bit longer than normal, so here are all the discussion questions in one spot:

  1. What process do you have in place to allow your frontline agents to resolve or report problems?

  2. What do frontline employees do when they encounter a strange or unusual problem?

  3. How can you invite individual customers to engage in a one-on-one conversation?

  4. How can you capture and analyze unstructured data?

  5. What can you learn from being your own customer?

All of these questions can yield terrific customer feedback without ever resorting to a survey! Best of all, the feedback you get from these sources can often be quickly used to make improvements.

You can get five more survey alternatives from this old post.

And, if you really want to use a survey, my course on LinkedIn Learning can guide you. Here's a short preview.


Why You Should Stop Surveying Your Customers

What if you discovered your business was doing something that more than 25 percent of your customers disliked?

That should get your attention, though some businesses engage in unfriendly practices that bring in significant revenue. Think of airline baggage fees, hotel resort fees, and cable equipment rental fees. 

Okay, but what if you learned an activity that more than 25 percent of your customers disliked delivered absolutely no value to your business?

You'd probably stop it immediately.

The customer service survey falls into that category for many companies. Customers don't like it and it delivers absolutely no value. Smart customer service leaders should either fix their broken surveys or stop doing them altogether. 

Read on to learn which path you should take.

A team of professionals analyzes a customer service survey.

Customer Service Survey Drawbacks

A 2017 study from Customer Thermometer asked 1,000 customers to give their opinions on surveys by, you guessed it, surveying them.

  • 25 percent dislike being surveyed after a purchase
  • 47 percent dislike being prompted for feedback on a website
  • 43 percent dislike being surveyed in exchange for a contest entry

The caveat is an inherent bias in the results. The chances of you filling out a survey about surveys when you really don't like surveys are pretty low. So we could reasonably expect the positive results to be inflated.

In fact, 45 percent of respondents reported they routinely ignored survey requests.

Okay, so far the data shows that surveys annoy a lot of customers and nearly half of customers don't complete surveys, so they aren't representative of your customer population.

It gets worse.

A 2016 study from Interaction Metrics concluded that 68 percent of surveys from leading retailers were "total garbage," meaning the surveys yielded no useful information.

The kicker is a 2017 study from Capgemini Consulting revealed that companies that improperly used Net Promoter Score (NPS) surveys saw no difference in customer perception compared to companies that did not track NPS or customer experience data.

The big question is whether it's worth the risk of annoying so many customers if your business is getting zero value out of your surveys.

 

How to Tell if Your Survey Generates Value

Think about the intention behind a customer service survey. This is what a survey plan should look like:

  • Generate meaningful insights
  • Use those insights to take action
  • Measurably improve the business through those actions

So you can assess the value by starting at the beginning. Does your survey generate any meaningful insights?

Here are just a few questions it might answer:

  • What makes your customers happy or unhappy?
  • What products, services, or locations are performing the best or worst?
  • What generates the most complaints?

Insight alone isn't enough. You'll need to actually take action. Examples include:

  • Fixing customer pain points
  • Reducing customer service waste (ex: repeat complaints)
  • Strengthening areas where customers are happy

Finally, you'll need to make sure those actions are generating measurable business results in some way. For instance:

  • Can you improve customer retention?
  • Can you serve customers more efficiently?
  • Can you grow revenue through more word-of-mouth advertising?

These are all examples and by no means an exhaustive list. The bottom line is your survey needs to be a conduit to improving the business or else it's a waste of time.

 

Take Action

I've assembled a customer service survey resource page to help you learn more about what makes a great survey. You'll find blog posts and helpful videos.

Take time to evaluate your survey. If it's not driving value, you'll have a big decision to make. Should you scrap it or fix it?


How to Find Trends in Your Survey Comments

The customer experience director proudly announced her company had just implemented a customer service survey. "That's great!" I said. "What are you doing with the data?"

There was an awkward silence. Finally, she replied, "Uh, we report the numbers in our regular executive meeting."

That was it. The entire purpose of the survey program was to add another meaningless number to the executive scorecard. The survey was doing nothing to help the company improve customer experience or service.

I dug a little deeper and discovered her survey had no comment section. In other words, customers could rate their experience but they couldn't explain why.

Comments are a critical element that tell you what your customers are thinking and what you need to do to improve. But having a comment section isn't enough.

You need to know how to analyze those comments. 

Why Survey Comments Matter

Let's take a moment to look at why survey comments matter. 

Imagine you manage a Discount Tire Store in San Diego. As of January 8, 2018, your store has a 4.5 star rating on Google from 83 reviews. (Side note: you can use Google My Business to attract more customers.)

Google listing for the Discount Tire store, showing its 4.5 star rating.

That's great news, but two big questions remain:

  • How did your store earn that rating? (You want to sustain it!)
  • What's preventing even better results? (You want to improve.)

The rating alone doesn't tell you very much. You need to look at the comments people write when they give those ratings to learn more.

The challenge is the comments are freeform. You'll need a way to quickly spot trends.

 

Analyze Survey Comments for Trends

The good news is you can do this by hand. It took me less than 30 minutes to do the analysis I'm going to show you.

Start with a check sheet. This is a piece of paper with a column for each possible rating on the survey. I did this digitally by creating a table in Mac Pages.

Check sheet with a column for each possible star rating.

Next, read each survey comment and try to spot any themes that stand out as the reason the customer gave that rating. Record those themes on your check sheet in the column that matches the star rating for that review.

For example, what themes do you see in this five star review?

Five star review from a Discount Tire customer.

I recorded the following themes on my check sheet:

Check sheet with themes recorded in the five star column.

Now repeat this for all of the reviews. Look for similar words or phrases that mean the same thing and put a check or star next to each theme that's repeated.

I noted a theme of "fast service" in the review above because the reviewer wrote, "got a full set of Yokohama tires in around an hour." I put a star next to "honest" and "fast service" after I read another review that said, "Discount Tire Store was trustworthy and fast. 4 new tires, in and out the door in an hour."

Once you've completed all of the reviews, tally up the themes that received the most mentions. Here are the top reasons people give a 5 star rating for this Discount Tire store:

  • Fast service: 72%
  • Good prices: 35%
  • Friendly employees: 23%
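
If you'd rather do the tally in a few lines of code than on paper, the same check sheet idea is easy to reproduce. Here's a minimal sketch, assuming each review has been reduced to its star rating plus the themes you spotted in the comment (the data below is illustrative, not the actual Discount Tire reviews):

  from collections import Counter, defaultdict

  # (star rating, themes spotted in the comment) -- illustrative examples
  reviews = [
      (5, ["fast service", "honest"]),
      (5, ["fast service", "good prices"]),
      (5, ["friendly employees"]),
      (1, ["long wait"]),
  ]

  check_sheet = defaultdict(Counter)  # one column of theme tallies per rating
  for stars, themes in reviews:
      check_sheet[stars].update(themes)

  five_star_total = sum(1 for stars, _ in reviews if stars == 5)
  for theme, count in check_sheet[5].most_common():
      print(f"{theme}: {round(100 * count / five_star_total)}% of 5-star reviews")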

There weren't many bad reviews. The few that had comments mentioned a long wait time, a lack of trustworthiness, or some damage done to the customer's vehicle.

You'll see a larger theme emerge if you look across all the reviews.

Some aggravation usually accompanies a trip to the tire store. Maybe you got a flat tire or perhaps you're trying to squeeze in car service on a very busy day. There's a good chance you're dreading the cost.

When customers are happy, their comments tend to describe some sort of relief. For instance, more than one customer mentioned arriving just before closing and feeling relieved to get great service from helpful and friendly employees.

 

Take Action!

The purpose of this exercise is to take action!

If I managed that Discount Tire store, I'd make sure employees understood they are in the relief business. (Perhaps they do, since their rating is so high!) Relief is one of the top emotions in customer support.

I'd also respond to negative reviews, like this one:

Negative review from a Discount Tire customer.

Responding to a negative survey is an opportunity to save the customer. For private surveys, you'll need a non-anonymous survey or a contact opt-in feature to do this.

Many public rating platforms like Google My Business, Yelp, and TripAdvisor allow you to respond to customer reviews. A polite and helpful response can signal other customers that you care about service quality.

And you might save that customer, too. One Discount Tire customer changed his 1 star review to a 5 star review after speaking with the manager who apologized and fixed the issue!

You can watch me do another check sheet in this short video on LinkedIn Learning. (Email subscribers, you'll need to view the blog online to see it. Simply click on the article title at the top of the page.)


What is a Good Survey Response Rate?

It's the most common question I get about surveys.

Customer service leaders are understandably concerned about getting a lot of voice of customer feedback. So my clients want to know, "What is a good response rate for our customer service survey?" 

The answer may surprise you—there's no standard number. 

An 80 percent response rate might be bad in some situations, while a 5 percent response rate might be phenomenal in others.

In fact, I'm not overly concerned with the percentage of people who respond. My advice to clients is to use a different set of criteria for judging their survey responses.

Here's how to evaluate your own survey response rate the same way I do.

Three Response Rate Criteria

There are three criteria that you can use to determine if you're getting a good response to a customer service survey:

  • Usefulness
  • Representation
  • Reliability

Usefulness is the most important consideration.

Any response rate that provides useful customer feedback is good. That's not to say you can't do even better than your current rate, but the whole purpose of a customer service survey should be to yield useful data.

For example, let's say you implement a contact opt-in feature that allows you to follow up with customers who leave negative feedback. That survey could become tremendously useful if it allows you to contact angry customers, fix problems, and reduce churn.

Representation is another important way to gauge your response rate.

You want your survey to represent all of the customers you are trying to get feedback from. Imagine you implement a new self-help feature on your website. A representative survey in this case would ask for feedback from customers who successfully used self-help as well as customers who weren't successful and had to try another channel.

Sometimes you need to augment your survey with other data sources to make it more representative. The authors of The Effortless Experience discuss the self-help scenario in their book and suggest having live agents ask customers if they first tried using self-help.

This question can help identify people who didn't realize self-help was available and therefore wouldn't complete a survey on its effectiveness. It could also capture feedback from people who tried self-help, were unsuccessful, and didn't notice a survey invitation because their priority was contacting a live agent to solve the problem.

My final criterion is reliability.

This means the survey can be relied upon to provide consistently accurate results. Here's a summary of considerations from a recent post on five characteristics of a powerful survey.

  1. Purpose. Have a clear reason for offering your survey.
  2. Format. Choose a format (CSAT, NPS, etc.) that matches your purpose.
  3. Questions. Avoid misleading questions.

Many surveys have problems in one or more of these areas. For instance, a 2016 study by Interaction Metrics discovered that 92 percent of surveys offered by the largest U.S. retailers asked leading questions that nudged customers to give a more positive answer.

For example, Ace Hardware had this question on its survey:

How satisfied were you with the speed of your checkout?

The problem with a question like this is it assumes the customer was satisfied. This assumptive wording makes a positive answer more likely.

A more neutral question might ask, "How would you rate the speed of your checkout?"

 

Resources

A survey response rate is good if it generates useful data, is representative of the customer base you want feedback from, and is reliable.

That doesn't mean you shouldn't strive to continuously improve your survey. Here are some resources to help you:

You'll need a Lynda.com or LinkedIn Premium subscription to view the full training video. You can get a 30-day Lynda.com trial here.


A Simple Way to Double Your B2C Survey Responses

Everyone wants a better survey response rate. The Center For Client Retention (TCFCR) recently shared some data about business-to-consumer (B2C) surveys that revealed an easy way to improve results.

TCFCR helps businesses conduct customer satisfaction research. The company's client focus is primarily Fortune 500 companies in business-to-business (B2B) and B2C segments.

There's a big need for these types of services given that a recent study from Interaction Metrics found 68 percent of surveys offered by America's largest retailers were "total garbage."

I provide similar services to small and mid-sized businesses, so I was curious to see what TCFCR's survey data might reveal.

One quick look and I immediately saw a way for businesses to double the response rate on their B2C surveys.

The Response Rate Secret

TCFCR pulled aggregate data from thousands of surveys across all of their clients for a 12-month period. The company compared response rates for "in the moment" surveys versus follow-up surveys sent via email. 

Here are the results:

Follow-up surveys had more than twice the average response rate!

An in the moment survey is offered at the time of service. It could be a link in an email response from a customer service rep, an after-call transfer to an automated survey, or a link in a chat dialog box.

A follow-up email survey is sent after the customer service interaction is complete.

TCFCR also found that sending a reminder email after the initial survey invitation typically generated an additional 5-point increase in response rates!

Some companies do follow-up surveys via telephone instead of email. TCFCR's data shows that those surveys get an average response rate of 12-15 percent, which is on par with in the moment surveys.

One thing to keep in mind is that this data is for B2C surveys only. TCFCR found that B2B surveys typically get a response rate that's half of what you'd expect from a B2C.

 

Increase Response Rates Even More

There are a few more things you can do to stack the deck in your favor.

One is to keep your surveys short. A 2011 study from SurveyMonkey found that survey completion rates drop 5-20 percent once a survey takes 7+ minutes to complete. The same study discovered that's usually around 10 questions.

Most surveys will gather adequate data with just three short questions.

Another way to improve response rates is through rules-based offering. A lot of customer service software platforms, such as Zendesk, have a built-in survey feature that allows you to adjust which customers receive a survey and when.

For instance, you might only send a follow-up survey once a support ticket is closed rather than after every single interaction. Or if you offer a subscription-based service, you might survey all customers when they reach the six month mark in their annual subscription, regardless of whether they've contacted your company for support.
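
The exact setup depends on your platform, but the logic behind these rules is simple. Here's a minimal sketch of the two rules above, using hypothetical ticket and subscription records rather than any particular platform's API:

  from datetime import date, timedelta

  def should_send_ticket_survey(ticket):
      # Rule 1: only survey once the support ticket is closed
      return ticket["status"] == "closed" and not ticket["surveyed"]

  def should_send_subscription_survey(subscription, today=None):
      # Rule 2: survey annual subscribers at the six-month mark
      today = today or date.today()
      six_months_in = subscription["start_date"] + timedelta(days=182)
      return today >= six_months_in and not subscription["surveyed"]

  ticket = {"status": "closed", "surveyed": False}
  print(should_send_ticket_survey(ticket))  # True -> queue the survey email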

You can learn more about response rates and other survey-related topics here.


The Powerful Survey Feature That Drives Customer Loyalty

Improving loyalty is a big reason companies survey customers.

The challenge is finding ways to actually accomplish that goal. Customer service leaders tell me confidentially that analyzing survey data is a struggle. Getting leaders to take meaningful action is another tough task.

There's one survey feature that can immediately improve your results. Seriously, you could implement it today and start reducing customer defections.

What is it? 

It's the contact opt-in. Here's a run-down on what it is, why it's essential, and how to implement it immediately.

What is a Contact Opt-In?

A contact opt-in is a feature at the end of your customer service survey that allows customers to opt-in for a follow-up contact.

The opt-in does three important things:

  • It allows you to follow up with an upset customer and save their business.
  • The survey itself remains anonymous, which is important to some customers.
  • The opt-in doesn't promise a contact, it just gives you the option.

Best of all, it's really simple. Here's a sample opt-in:

May we contact you if we have additional questions?

Just make sure you add fields to capture a customer's name and contact information if they say yes!
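
Here's a minimal sketch of how those responses might be handled, assuming a hypothetical record with the rating, the opt-in answer, and the contact fields captured when the customer says yes:

  # Illustrative survey responses with a contact opt-in at the end
  responses = [
      {"rating": 2, "opt_in": True, "name": "Pat", "phone": "555-0100"},
      {"rating": 9, "opt_in": False, "name": "", "phone": ""},
      {"rating": 1, "opt_in": True, "name": "Sam", "phone": "555-0101"},
  ]

  # Follow up with customers who opted in, angriest first
  follow_ups = sorted((r for r in responses if r["opt_in"]), key=lambda r: r["rating"])
  for r in follow_ups:
      print(f"Call {r['name']} at {r['phone']} (rating: {r['rating']})")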

 

Why are Follow-ups Essential?

There's a widely held perception among customers that surveys are meaningless.

That's because we're inundated with survey requests, but we rarely see any meaningful changes as a result of our feedback. Many customers are convinced their feedback is routinely ignored. (Spoiler alert: they're right.)

A follow-up tells customers you're listening. It demonstrates caring and empathy. Some customers have told me they were surprised and amazed to get a follow-up contact!

Now here's the best part: you might even be able to solve the problem and save the customer!

Data provided by the customer feedback analysis company, Thematic, shows that customers who give a "0" rating on Net Promoter Surveys have a lot more to say in the comment section than customers who give other ratings:

Data source: Thematic


“Detractors across dozens of companies we’ve worked with complain about the inability to contact the company about an issue they have, lack of communication, or difficulty finding information on how to fix an issue themselves”, says Alyona Medelyan, CEO at Thematic. “We have also observed that many customers leave their full name, phone number or reference number in a free-text comment. Detractors are three times more likely to leave contact details than others.”

This presents customer service leaders with two choices:

  • Ignore all that anger and wait for the customer to tell family, friends, and colleagues.
  • Contact the customer and try to iron things out.

 

How to Implement a Contact Opt-In

The process is very straightforward.

  1. Add a contact opt-in to the end of your survey.
  2. Review your survey for opt-ins (I recommend daily).
  3. Contact as many customers as possible, especially angry ones.

Through trial and error, I've found that a phone call often works better than email or other channels for following up. It's easier to have a dialogue if you catch the customer on the phone, and a surprising number of customers will call back if you leave a message with a phone number where they can reach you directly.

Here are a few other tips:

  • Empower your follow-up person (or team) to resolve as many issues as possible.
  • Use customer conversations to learn more about their situation.
  • Summarize feedback from customer follow-ups to identify broad trends.

 

Conclusion

Some leaders worry about the time required. If that's your focus, your head's probably not in the right place.

Here are three compelling reasons why you definitely have the time:

  1. Follow-up is optional. You don't have to contact every single customer.
  2. Saving customers can directly generate revenue and reduce servicing costs.
  3. Fixing chronic problems leads to fewer customer complaints in the long run.

Here are some additional resources to help you turn your survey into a feedback-generating, customer-saving, money-making machine: