How to Increase Survey Response Rates by 370%

Andrew Gilliam, ITS Service Desk Consultant

Small changes can often lead to big results.

Andrew Gilliam is an ITS Service Desk Consultant at Western Kentucky University. He improved the response rate to customer service surveys by 370 percent simply by changing the wording of the survey invitation email.

I interviewed Gilliam to learn how he did it. He shares a lot of helpful, actionable advice in this short, 20-minute interview.

Topics we cover include:

  • Why you should survey both internal and external customers

  • What constitutes a "good" response rate

  • How to improve your survey invitation email

  • What types of customers typically complete surveys

  • Why you need feedback from angry, happy, and even neutral customers 

You can watch the full interview here. Follow Gilliam on Twitter at @ndytg or contact him via his website.


I Took Every Survey For a Week. The Results Weren't Good.

Customers are inundated with surveys.

We get them on receipts, via email, and in the mail. Shop somewhere and you're asked to take a survey. Don't shop somewhere, and a survey still appears. Visit a website and, ping!, you're asked to take a survey.

I decided to take a week and do a small experiment. During that week, I would take every single survey I was asked to complete. The idea was to test three things:

  1. How many surveys would I be offered?

  2. Were any of the surveys well-designed?

  3. What was the experience like?

I was asked to complete 10 surveys during the week. That pencils out to over 500 surveys per year! No wonder customers experience survey fatigue.

Only one of the 10 surveys was well-designed. Every other survey had at least one glaring flaw, and most had multiple failures. More on that in a moment.

And what was my experience like? Most of the surveys backfired. The experience was so poor it made me like the company even less.

Person filling out a customer service survey to report a negative experience.

Surveys Are Too Difficult

When you ask a customer to take a survey, you're really asking the customer to do you a favor. A lot of the surveys I took made that favor really difficult.

Just accessing the surveys was a big challenge. 

My first survey request was on a receipt from the post office. The receipt had a QR code that I was able to quickly scan with my phone, but then the survey site itself was not optimized for mobile phones.

A survey from Dropbox wanted me to first read and acknowledge a confidentiality agreement before completing its survey.

Confidentiality agreement required to take the Dropbox survey.

The super odd thing was that the confidentiality agreement had its own survey! This extra bit of aggravation got even more annoying when that survey required me to fill out the comments box to explain my rating of the confidentiality agreement.

Survey requiring a comment.

Back to the first Dropbox survey: I was 11 minutes in when I hit an infinite loop. None of the answers to one question applied to me, and there was no "Not Applicable" option for this required question. I felt I had put in enough time at that point and just gave up.

The survey invitation from Vons, my local grocery store, was a real piece of work. It was a receipt invitation, but there was no QR code, so I had to manually enter the web address. Then I had to enter a string of numbers along with my email address!

Vons survey invitation page, which requires an email address.

I couldn't complete two surveys due to errors. An email invitation from Chewy linked to a web page that I couldn't get to load. The Human Resources Certification Institute sent me a survey on May 24 that closed on May 23. Completing that survey is pretty low on the list of things I would do if I had access to a time machine.

Poor Survey Design

Difficulty aside, just one of the ten surveys was designed well enough to provide useful, actionable, and unbiased information.

Many surveys were too long, which often triggers low completion rates. The Dropbox survey advertised it would take 15 minutes. (Who has that kind of time?!) These companies' surveys could easily be redesigned to get better data and higher completion rates from just three questions.

Many were full of leading questions designed to boost scores. This AutoZone survey arranged the rating scale with the positive response first, which is a subtle way to inflate ratings. And like many of the surveys I took, it offered no option to leave comments and explain why I gave the ratings I did.

AutoZone customer service survey.

The survey from Vons was an odd choose-your-own-adventure exercise, where I got to decide which topic(s) I wanted to be surveyed on.

Screenshot of multi-part customer service survey from Vons.

This created unnecessary friction and generated a little confusion since my biggest gripe on that particular visit was the large number of aisles blocked off by people stocking shelves. Is that a store issue, an employee issue, or a product issue? It’s a great example of where asking a customer to simply give a rating and then explain the rating would quickly get to the core of my dissatisfaction.

The One Good Example

The best survey was a Net Promoter Score (NPS) survey from Suunto. 

I received this survey invitation about six months after I registered a new watch on the Suunto website. NPS surveys measure a customer's intent to recommend, so giving me six months to use the watch before asking if I'd recommend it allows enough time for me to know what I like and don't like about the product.

Another positive was it asked just two questions: a rating and a comment. 

Suunto NPS survey.

Short surveys tend to have much higher completion rates than longer ones. Counterintuitively, you can almost always get more useful data from a short survey than from a long and tedious one. (More on that here.)

My one question about the Suunto survey was whether it was linked to my contact information. That link is necessary so someone from Suunto can follow up with unhappy customers to learn more about the issues they're experiencing. (More on that here.)
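
For readers who haven't worked with Net Promoter Score before, the score itself is simple to compute: respondents rate how likely they are to recommend you on a 0-10 scale, those answering 9 or 10 count as promoters, 0 through 6 count as detractors, and the score is the percentage of promoters minus the percentage of detractors. Here's a minimal sketch of that calculation in Python, using made-up ratings:

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 'likelihood to recommend' ratings.

    Promoters score 9-10, detractors score 0-6; passives (7-8) count
    toward the total but not the score. Result ranges from -100 to +100.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Made-up example responses
print(net_promoter_score([10, 9, 8, 7, 6, 10, 3, 9]))  # 25.0
```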

Resources to Create Better Surveys

Here are some resources to help you avoid these mistakes and create better surveys.

You can also get step-by-step instructions for creating a great survey program by taking my customer service survey course on LinkedIn Learning.


Report: Why Retail Customer Service is Dropping

A new report from the American Customer Satisfaction Index shows a drop in retail customer satisfaction. From department stores like Nordstrom to specialty stores like Bed Bath & Beyond, customers are less happy than they were a year ago.

How can this be possible in an era where customers are bombarded with survey requests and access to big data is at an all-time high?

The answers have to do with people. How people are staffed and managed, and the duties they are asked to perform, all have an impact on customer satisfaction.

You can access the full report or read below to see the highlights and analysis. To kick things off, the chart below shows a comparison in overall satisfaction between 2017 and 2018 on a 100-point scale:

Retail customer satisfaction declined from 2017 to 2018.

Trend #1: Courtesy and Helpfulness of Staff

This one is down across the board.

Courtesy and helpfulness from retail employees has declined.

Staffing levels have a big impact on this category. Retailers routinely understaff stores in an effort to save money, but this leaves the few available employees running ragged trying to serve multiple customers and complete tasks like restocking and merchandising.

Another issue is the surveys that seemingly appear on every retail receipt. These should help retailers detect problems like unfriendly employees. But the dirty secret is many retailers don't actually use those surveys to improve. And many even manipulate the surveys to make the scores look better than they really are.

A 2016 report from Interaction Metrics found that 68 percent of retail customer satisfaction surveys were "total garbage."


Trend #2: Layout and Cleanliness of Store

There's a slight dip in this area.

Stores need to improve the cleanliness and layout.

Part of the challenge is staffing (see Trend #1). Stores struggle to stay clean and organized when there aren't enough employees to do the work.

Another is command structure. Many retail chains make store layout decisions at the corporate level, and don't do enough field testing to ensure the designs actually make sense. Last year, I did a comparison of my local Walgreens, Rite Aid, and CVS and noted important differences in the layout of each store.


Trend #3: Speed of Checkout Process

The checkout process was another area where satisfaction dropped across the board.

Checking out is too slow at retail stores.

Here again staffing plays a role. We've probably all wasted time wandering around a department store, searching for someone to ring us up. And that's precisely why so many people would rather shop online—it's much easier.

Customer satisfaction with speed isn't just about the actual amount of time it takes. People are heavily influenced by perception. So a pleasant experience with a friendly cashier that takes five minutes will feel like a breeze, while an unpleasant experience that also takes five minutes will feel like an eternity.

Retailers could help themselves by studying these factors that influence wait time perception.

Take Action

There are three easy ways retailers can check these trends in their own stores.

Talk to employees. I have no idea why managers don't spend more time doing this. Employees will almost always be forthcoming about the challenges they face if you ask them sincerely.

Walk your stores. Spend time walking through your stores like a customer. You'll often discover unexpected problems that your customers encounter every day.

Use surveys wisely. Customer feedback surveys can be valuable tools, but you should use them wisely or not use them at all. This short video will help you decide why you want to run a survey program.


Why You Need to Analyze Survey Comments

I'm putting the finishing touches on the second edition of my book, Getting Service Right. The book was originally called Service Failure, and I've now updated both the title and some of the research.

The cover is one of the most important sales tools for a book, so I worked with Anne Likes Red to come up with a few designs. I then launched a survey to ask readers for their feedback on three cover options. The survey was up for just a few days, and 135 people responded.

Here were the results:

Option A (28%)

Option B (52%)

Option C (20%)

Picking cover option B should be a no-brainer, right? After all, more than half of all survey respondents picked that option.

Without qualitative information, I might have made that mistake. Fortunately, I also included a comment field in the survey. When you analyze the comments to learn why someone chose a particular option, a new pattern emerges.


Searching for Themes

I recently hosted a webinar with Alyona Medelyan, CEO of the customer insight firm Thematic. Medelyan brought actual client data to reveal some interesting insights that a survey score alone wouldn’t show:

  • A cable company found that customers with modem issues were dragging overall NPS down by 2 points.

  • Another company discovered one variable that caused customers to spend $140 more per year.

  • An airline learned passengers were 4x angrier about missed connections than delayed flights.

The point Medelyan made is we usually get deeper, more actionable insights when we analyze the comments and not just the scores. So I applied this concept to my book cover survey and found two significant themes contained in the comments.

The first was that quite a few people chose B because they liked how the subtitle appeared below the title better than the way it was shown in options A and C. So it wasn't just the color that drove people to option B.

The second theme was that quite a few people who selected option B mentioned they liked its title arrangement but preferred the color of option A. There were even a handful who picked B but mentioned they liked the color of option C best.

Suddenly option B isn't such a clear and convincing winner. Here's what happened when I revised the survey results to account for color choice alone:

Option A (40%)

Option B (39%)

Option C (21%)

Now I have two insights:

  • People prefer the blue cover shown in option A

  • People like the title arrangement in option B

Keep in mind I only made adjustments where respondents were explicit in their survey comments. If someone didn't explain why they chose B, they may have done it for the title arrangement, the color, or pure whimsy.
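
To make that recoding step concrete, here's a rough sketch (in Python, with invented responses) of the idea: keep each person's original vote unless their comment explicitly names a different option's color as their preference. My actual adjustments came from reading the comments, not keyword matching, so treat this only as an illustration.

```python
from collections import Counter

# Invented example responses: a vote plus a free-form comment
responses = [
    {"choice": "B", "comment": "Great title layout, but I prefer option A's blue."},
    {"choice": "B", "comment": ""},
    {"choice": "A", "comment": "The blue feels calm and trustworthy."},
    {"choice": "C", "comment": ""},
]

def color_vote(response):
    """Return the color option the respondent explicitly prefers, if stated."""
    comment = response["comment"].lower()
    for option in ("a", "b", "c"):
        # Only override when the comment explicitly names another option's color
        if f"option {option}'s" in comment or f"prefer option {option}" in comment:
            return option.upper()
    return response["choice"]  # no explicit override; keep the original vote

print(Counter(color_vote(r) for r in responses))  # Counter({'A': 2, 'B': 1, 'C': 1})
```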

Making a Final Decision

I did a similar survey with my last two book covers, and both times I ended up choosing elements from different options. I did the same thing this time.

Going with option B's title arrangement was a pretty easy decision. There were numerous comments praising option B's layout and none supporting the layouts of options A and C.

I ultimately chose the blue color from option A. 

Several survey comments mentioned color theory, and my friend Jim even shared this helpful resource from Quick Sprout. According to the guide, the color blue symbolizes tranquility and peace and has more positive associations across various cultures than purple or green.

The kicker is that the blue is also my personal preference. I really like it, and it's important for an author to really like the cover of their book! Here's the final cover:

It was also important to consider how the cover will look when shown together with my other books on Amazon, in a bookstore, or at a trade show. Here's how it will look displayed next to my other books:

Take Action

You can gain so much more from a survey if you combine the fixed choices (e.g., option A, B, or C) with comments. Try analyzing one of your own surveys to see what hidden insight is revealed.

You’ll find a lot of simple analysis techniques in the webinar with Alyona Medelyan from Thematic.

You can also get more help with your survey on this survey resource page.


How to Get Customer Feedback Without a Survey

Advertising disclosure: We are a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for us to earn fees by linking to Amazon.com and affiliated sites.

I frequently use subscriber feedback to improve my Customer Service Tip of the Week email newsletter. Yet I've never used a survey.

Customers are inundated with surveys, so it's important to think carefully before rolling out yet another one. With my newsletter, I've found I can get a lot of useful voice of customer feedback from several alternative sources.

Here are five ways I collect and use voice of customer feedback.

Business people sitting around a conference table analyzing survey data.

Issue Alerts

The weekly email will occasionally have a small issue such as a typo or a broken hyperlink. I try to proofread each email and test all the links, but problems occasionally do happen.

Typos are my kryptonite.

Thankfully, I can count on subscribers to let me know when there is an error. It's usually just a handful of people who email me about the problem, but that's all the feedback I need. Keep in mind most customers won't bother to tell you about small issues, but that doesn't mean they don't notice!

I have a process in place where I can flag a problem and fix it the next time I send out the same tip. In some cases, such as a broken hyperlink, I may re-send the email with the correction, although I try not to do this very often because I don't like swamping people's inboxes with extra emails.

Discussion question: What process do you have in place to allow your frontline agents to resolve or report problems?

 

Investigate Icebergs

A customer service iceberg is an issue that seems small and isolated on the surface, but is actually a sign of a much larger and more dangerous problem that's hidden from view.

Someone recently emailed me to let me know she had tried to sign up for the Customer Service Tip of the Week email but never received a confirmation. This was a classic iceberg because it was easy to dismiss the problem as a one-off: maybe she just missed the email, or the confirmation wound up in a spam folder.

I was tempted to just manually subscribe her to my list, but I decided to investigate. 

My research led me to a helpful exchange with a support agent at MailChimp, the company that powers my newsletter. With his help, I identified a technical setting in my account that would make my emails more recognizable to corporate email servers.

Here comes the kicker—my weekly subscription rate instantly doubled!

Some of those extra subscribers undoubtedly came from a marketing campaign, where I'm promising to send a PDF of my new book to anyone who is subscribed to the email by September 30, 2018.

But some of that huge increase was certainly due to this technical issue. And I never would have found it if I hadn't investigated the iceberg that came from just one email.

Discussion question: What do frontline employees do when they encounter a strange or unusual problem? Are they trained to search for and identify icebergs?

 

Invite Conversation

There are a few books that have absolutely changed the game for me. One was Kevin Kruse's book, Unlimited Clients.

A key piece of advice in the book was to invite conversation with your customers. The first version of the book had Kevin's phone number and email address right on the cover, and I can tell you from experience he actually responded!

So I took Kevin's advice and added a special invitation to the welcome email I sent to new subscribers. 

Excerpt from Customer Service Tip of the Week welcome email.

Subscribers have always been able to reply to any email and send a message directly to my personal email address. However, this invitation substantially increased the number of people who actually emailed me.

It's not everyone. (Thankfully—I don't know if I could keep up!) But a couple times a day I get an email from a new subscriber who tells me a little about themselves.

It helps me learn more about them and I often try to share something helpful in response. I've also learned those subscribers are more likely to share their feedback as they begin to receive the weekly tips.

Discussion Question: How can you invite individual customers to engage in a one-on-one conversation?

 

Catalog Unstructured Data

Something really amazing happens when you take all those individual conversations you have with customers and categorize them.

I went through hundreds of emails from subscribers and categorized the customer service challenges they shared with me. When I decided to put my weekly tips in a book, I put the top ten challenges in a chart and identified tips that could help with each one.

Going through several hundred emails may seem like a lot of work, but it really doesn't take that much time. I probably spent an hour or so. 

It goes even faster if you catalog feedback from individual customers as it comes in. A lot of customer service software platforms have a tagging feature that allows agents to do this on the fly. If your technology won't do it, you can have agents use a spreadsheet or even a piece of paper.
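
If your tools don't support tagging, even a small script over an exported spreadsheet can surface the top themes. Here's a minimal sketch, assuming a CSV export named feedback.csv with a comma-separated "tags" column (the file name and column name are hypothetical):

```python
import csv
from collections import Counter

tag_counts = Counter()
with open("feedback.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Each row's "tags" cell holds comma-separated tags, e.g. "angry customer, billing"
        for tag in (row.get("tags") or "").split(","):
            tag = tag.strip().lower()
            if tag:
                tag_counts[tag] += 1

# Print the ten most common customer service challenges
for tag, count in tag_counts.most_common(10):
    print(f"{tag}: {count}")
```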

Discussion Question: How can you capture and analyze unstructured data?

 

Be a Customer

I learn a lot by subscribing to my own email.

This was a trick I learned from working in the catalog industry. Catalog companies would mail themselves a copy of each catalog so they could time how long it took to arrive and could verify each catalog arrived in good condition.

Subscribing to my own email allows me to do something similar.

For example, the Customer Service Tip of the Week goes out each Monday at 8:45 am Pacific time. One week, the email didn't arrive as expected. I double-checked the system and discovered I had set that particular email for 8:45 pm.

Oops! Fortunately, I was able to quickly change the send time and the email went out only a few minutes later than normal.

Discussion Question: What can you learn from being your own customer?

 

Take Action

This post is a bit longer than normal, so here are all the discussion questions in one spot:

  1. What process do you have in place to allow your frontline agents to resolve or report problems?

  2. What do frontline employees do when they encounter a strange or unusual problem?

  3. How can you invite individual customers to engage in a one-on-one conversation?

  4. How can you capture and analyze unstructured data?

  5. What can you learn from being your own customer?

All of these questions can yield terrific customer feedback without ever resorting to a survey! Best of all, the feedback you get from these sources can often be quickly used to make improvements.

You can get five more survey alternatives from this old post.

And, if you really want to use a survey, my course on LinkedIn Learning can guide you. Here's a short preview.