Report: Why Retail Customer Service is Dropping

A new report from the American Customer Satisfaction Index shows a drop in retail customer satisfaction. From department stores like Nordstrom to specialty stores like Bed Bath & Beyond, customers are less happy than they were a year ago.

How can this be possible in an era where customers are bombarded with survey requests and access to big data is at an all-time high?

The answers have to do with people. How people are staffed and managed, and what duties they are asked to perform, all have an impact on customer satisfaction.

You can access the full report or read below to see the highlights and analysis. To kick things off, the chart below shows a comparison of overall satisfaction between 2017 and 2018 on a 100-point scale:

Retail customer satisfaction declined from 2017 to 2018.

Trend #1: Courtesy and Helpfulness of Staff

This one is down across the board.

Courtesy and helpfulness from retail employees have declined.

Staffing levels have a big impact on this category. Retailers routinely understaff stores in an effort to save money, but this leaves the few available employees running ragged trying to serve multiple customers and complete tasks like restocking and merchandising.

Another issue is the surveys that seemingly appear on every retail receipt. These should help retailers detect problems like unfriendly employees. But the dirty secret is many retailers don't actually use those surveys to improve. And many even manipulate the surveys to make the scores look better than they really are.

A 2016 report from Interaction Metrics found that 68 percent of retail customer satisfaction surveys were "total garbage."


Trend #2: Layout and Cleanliness of Store

There's a slight dip in this area.

Stores need to improve the cleanliness and layout.

Part of the challenge is staffing (see Trend #1). Stores struggle to stay clean and organized when there aren't enough employees to do the work.

Another issue is command structure. Many retail chains make store layout decisions at the corporate level and don't do enough field testing to ensure the designs actually make sense. Last year, I did a comparison of my local Walgreens, Rite Aid, and CVS stores and noted important differences in the layout of each one.


Trend #3: Speed of Checkout Process

The checkout process was another area where satisfaction dropped across the board.

Checking out is too slow at retail stores.

Here again staffing plays a role. We've probably all wasted time wandering around a department store, searching for someone to ring us up. And that's precisely why so many people would rather shop online—it's much easier.

Customer satisfaction with speed isn't just about the actual amount of time it takes. People are heavily influenced by perception. So a pleasant experience with a friendly cashier that takes five minutes will feel like a breeze, while an unpleasant experience that also takes five minutes will feel like an eternity.

Retailers could help themselves by studying these factors that influence wait time perception.

Take Action

There are three easy ways retailers can check these trends in their own stores.

Talk to employees. I have no idea why managers don't spend more time doing this. Employees will almost always be forthcoming about the challenges they face if you ask them sincerely.

Walk your stores. Spend time walking through your stores like a customer. You'll often discover unexpected problems that your customers encounter every day.

Use surveys wisely. Customer feedback surveys can be valuable tools, but you should use them wisely or not use them at all. This short video will help you decide why you want to run a survey program.

Why You Need to Analyze Survey Comments

I'm putting the finishing touches on the second edition of my book, Getting Service Right. The book was originally called Service Failure, and I've now updated both the title and some of the research.

The cover is one of the most important sales tools for a book, so I worked with Anne Likes Red to come up with a few designs. I then launched a survey to ask readers for their feedback on three cover options. The survey was up for just a few days and 135 people responded.

Here were the results:

Option A (28%)


Option B (52%)

Option C (20%)

Picking cover option B should be a no-brainer, right? After all, more than half of all survey respondents picked that option.

Without qualitative information, I might have made that mistake. Fortunately, I also included a comment field in the survey. When you analyze the comments to learn why someone chose a particular option, a new pattern emerges.


Searching for Themes

I recently hosted a webinar with Alyona Medelyan, CEO of the customer insight firm Thematic. Medelyan brought actual client data to reveal some interesting insights that a survey score alone wouldn’t show:

  • A cable company found customers with modem issues were impacting overall NPS by -2 points.

  • Another company discovered one variable that caused customers to spend $140 more per year.

  • An airline learned passengers were 4x angrier about missed connections than delayed flights.

The point Medelyan made is we usually get deeper, more actionable insights when we analyze the comments and not just the scores. So I applied this concept to my book cover survey and found two significant themes contained in the comments.

The first was that quite a few people chose B because they liked the way the subtitle was shown below the title better than the way it was shown in options A and C. So it wasn't just the color that drove people to option B.

The second theme was that quite a few people who selected option B mentioned they liked its title arrangement but preferred the color of option A. There were even a handful who picked B but mentioned they liked the color on option C best.

Suddenly option B isn't such a clear and convincing winner. Here's what happened when I revised the survey results to account for color choice alone:

Option A (40%)

Option B (39%)

Option C (21%)

Now I have two insights:

  • People prefer the blue cover shown in option A

  • People like the title arrangement in option B

Keep in mind I only made adjustments where respondents were explicit in their survey comments. If someone didn't explain why they chose B, they may have done it for the title arrangement, the color, or pure whimsy.
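For the curious, here's roughly how that kind of re-tally could be scripted if your survey tool exports each respondent's choice alongside their comment. This is only a sketch: the responses below are made up, and beyond option A being blue, the color keywords are placeholders (I made my actual adjustments by reading each comment).

```python
from collections import Counter

# Hypothetical survey export: (option chosen, comment). These rows are
# illustrative stand-ins, not the actual survey responses.
responses = [
    ("B", "Love the arrangement on B, but I prefer the blue of option A"),
    ("B", "The subtitle under the title reads much better on this one"),
    ("A", "The blue cover stands out"),
    ("C", "Really like the green"),
    ("B", ""),  # no comment, so the vote stays with option B
]

# Hypothetical color keywords per option. Only option A's blue is stated
# above; the other two are placeholders for the sake of the example.
COLOR_HINTS = {"A": "blue", "B": "purple", "C": "green"}

def color_vote(option, comment):
    """Re-assign a vote to a color only when the comment is explicit;
    otherwise keep the color of the option that was chosen."""
    text = comment.lower()
    for opt, hint in COLOR_HINTS.items():
        if hint in text:
            return opt
    return option

tally = Counter(color_vote(opt, comment) for opt, comment in responses)
total = sum(tally.values())
for opt in "ABC":
    print(f"Option {opt} color: {tally[opt] / total:.0%}")
```

It's crude keyword matching, so I'd still skim the comments myself, but it's a quick way to see whether the explicit color mentions change the ranking.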

Making a Final Decision

I did a similar survey with my last two book covers, and both times I ended up choosing elements from different options. I did the same thing this time.

Going with option B's title arrangement was a pretty easy decision. There were numerous comments describing option B's arrangement as the preference, and none supporting the layouts of options A or C.

I ultimately chose the blue color from option A. 

Several survey comments mentioned color theory, and my friend Jim even shared this helpful resource from Quick Sprout. According to the guide, the color blue symbolizes tranquility and peace and has more positive associations across various cultures than purple or green.

The kicker is the blue is my personal preference. I really like it, and it's important for an author to really like the cover of their book! Here's the final cover:

It was also important to consider how the cover will look when shown together with my other books on Amazon, in a bookstore, or at a trade show. Here's how it will look displayed next to my other books:

Take Action

You can gain so much more from a survey if you combine the fixed choices (ex: option A, B, or C) with comments. Try analyzing one of your own surveys to see what hidden insight is revealed.

You’ll find a lot of simple analysis techniques in the webinar with Alyona Medelyan from Thematic.

You can also get more help with your survey on this survey resource page.


How to Get Customer Feedback Without a Survey

Advertising disclosure: We are a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for us to earn fees by linking to Amazon.com and affiliated sites.

I frequently use subscriber feedback to improve my Customer Service Tip of the Week email newsletter. Yet I've never used a survey.

Customers are inundated with surveys, so it's important to think carefully before rolling out yet another one. With my newsletter, I've found I can get a lot of useful voice of customer feedback from several alternative sources.

Here are five ways I collect and use voice of customer feedback.

Business people sitting around a conference table analyzing survey data.

Issue Alerts

The weekly email will occasionally have a small issue such as a typo or a broken hyperlink. I try to proofread each email and test all the links, but problems occasionally do happen.

Typos are my kryptonite.

Thankfully, I can count on subscribers to let me know when there is an error. It's usually just a handful of people who email me about the problem, but that's all the feedback I need. Keep in mind most customers won't bother to tell you about small issues, but that doesn't mean they don't notice!

I have a process in place where I can flag a problem and fix it the next time I send out the same tip. In some cases, such as a broken hyperlink, I may re-send the email with the correction, although I try not to do this very often because I don't like swamping people's inboxes with extra emails.

Discussion question: What process do you have in place to allow your frontline agents to resolve or report problems?

 

Investigate Icebergs

A customer service iceberg is an issue that seems small and isolated on the surface, but is actually a sign of a much larger and more dangerous problem that's hidden from view.

Someone recently emailed me to let me know she had tried to sign up for the Customer Service Tip of the Week email but never received a confirmation. This was a classic iceberg because it was easy to dismiss the problem as a one-off: maybe she just missed the email, or the confirmation wound up in a spam folder.

I was tempted to just manually subscribe her to my list, but I decided to investigate. 

My research led me to a helpful exchange with a support agent at MailChimp, the company that powers my newsletter. With his help, I identified a technical setting in my account that would make my emails more recognizable to corporate email servers.

Here comes the kicker—my weekly subscription rate instantly doubled!

Some of those extra subscribers undoubtedly came from a marketing campaign, where I'm promising to send a PDF of my new book to anyone who is subscribed to the email by September 30, 2018.

But some of that huge increase was certainly due to this technical issue. And I never would have found it if I hadn't investigated the iceberg that came from just one email.

Discussion question: What do frontline employees do when they encounter a strange or unusual problem? Are they trained to search for and identify icebergs?

 

Invite Conversation

There are a few books that have absolutely changed the game for me. One was Kevin Kruse's book, Unlimited Clients.

A key piece of advice in the book was to invite conversation with your customers. The first version of the book had Kevin's phone number and email address right on the cover, and I can tell you from experience he actually responded!

So I took Kevin's advice and added a special invitation to the welcome email I sent to new subscribers. 

Excerpt from Customer Service Tip of the Week welcome email.

Subscribers have always been able to reply to any email and send a message directly to my personal email address. However, this invitation substantially increased the number of people who actually emailed me.

It's not everyone. (Thankfully—I don't know if I could keep up!) But a couple times a day I get an email from a new subscriber who tells me a little about themselves.

It helps me learn more about them and I often try to share something helpful in response. I've also learned those subscribers are more likely to share their feedback as they begin to receive the weekly tips.

Discussion Question: How can you invite individual customers to engage in a one-on-one conversation?

 

Catalog Unstructured Data

Something really amazing happens when you take all those individual conversations you have with customers and categorize them.

I went through hundreds of emails from subscribers and categorized the customer service challenges they shared with me. When I decided to put my weekly tips in a book, I put the top ten challenges in a chart and identified tips that could help with each one.

Going through several hundred emails may seem like a lot of work, but it really doesn't take that much time. I probably spent an hour or so. 

It goes even faster if you catalog feedback from individual customers as it comes in. A lot of customer service software platforms have a tagging feature that allows agents to do this on the fly. If your technology won't do it, you can have agents use a spreadsheet or even a piece of paper.
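If it helps to picture the cataloging, here's a minimal sketch of how a keyword-based tally might work, assuming the messages can be exported to a CSV file. The file name, column name, categories, and keywords are all hypothetical; they're just there to show the shape of the exercise.

```python
import csv
from collections import Counter

# Hypothetical categories and keywords; swap in the challenges your
# customers actually write about.
CATEGORIES = {
    "angry customers": ["angry", "upset", "irate"],
    "staffing and workload": ["short-staffed", "overwhelmed", "too busy"],
    "unclear policies": ["policy", "exception", "approval"],
}

def categorize(message):
    """Return every category whose keywords appear in the message."""
    text = message.lower()
    return [cat for cat, words in CATEGORIES.items() if any(w in text for w in words)]

tally = Counter()
with open("subscriber_emails.csv", newline="") as f:  # hypothetical export file
    for row in csv.DictReader(f):
        tally.update(categorize(row["message"]))  # assumes a "message" column

# The top ten challenges, most-mentioned first.
for category, count in tally.most_common(10):
    print(f"{category}: {count}")
```

Even a rough tally like this makes it obvious which challenges come up again and again.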

Discussion Question: How can you capture and analyze unstructured data?

 

Be a Customer

I learn a lot by subscribing to my own email.

This was a trick I learned from working in the catalog industry. Catalog companies would mail themselves a copy of each catalog so they could time how long it took to arrive and could verify each catalog arrived in good condition.

Subscribing to my own email allows me to do something similar.

For example, the Customer Service Tip of the Week goes out each Monday at 8:45 am Pacific time. One week, the email didn't arrive as expected. I double-checked the system and discovered I had set that particular email for 8:45 pm.

Oops! Fortunately, I was able to quickly change the send time and the email went out only a few minutes later than normal.

Discussion Question: What can you learn from being your own customer?

 

Take Action

This post is a bit longer than normal, so here are all the discussion questions in one spot:

  1. What process do you have in place to allow your frontline agents to resolve or report problems?

  2. What do frontline employees do when they encounter a strange or unusual problem?

  3. How can you invite individual customers to engage in a one-on-one conversation?

  4. How can you capture and analyze unstructured data?

  5. What can you learn from being your own customer?

All of these questions can yield terrific customer feedback without ever resorting to a survey! Best of all, the feedback you get from these sources can often be quickly used to make improvements.

You can get five more survey alternatives from this old post.

And, if you really want to use a survey, my course on LinkedIn Learning can guide you. Here's a short preview.


Why You Should Stop Surveying Your Customers

What if you discovered your business was doing something that more than 25 percent of your customers disliked?

That should get your attention, though some businesses engage in unfriendly practices that bring in significant revenue. Think of airline baggage fees, hotel resort fees, and cable equipment rental fees. 

Okay, but what if you learned an activity that more than 25 percent of your customers disliked delivered absolutely no value to your business?

You'd probably stop it immediately.

The customer service survey falls into that category for many companies. Customers don't like it and it delivers absolutely no value. Smart customer service leaders should either fix their broken surveys or stop doing them altogether. 

Read on to learn which path you should take.

A team of professionals analyzes a customer service survey.

Customer Service Survey Drawbacks

A 2017 study from Customer Thermometer asked 1,000 customers to give their opinions on surveys by, you guessed it, surveying them.

  • 25 percent dislike being surveyed after a purchase
  • 47 percent dislike being prompted for feedback on a website
  • 43 percent dislike being surveyed in exchange for a contest entry

The caveat is an inherent bias in the results. The chances of you filling out a survey about surveys when you really don't like surveys are pretty low. So we could reasonably expect the positive results to be inflated.

In fact, 45 percent of respondents reported they routinely ignored survey requests.

Okay, so far the data shows that surveys annoy a lot of customers and nearly half of customers don't complete surveys, which means the results aren't representative of your customer population.

It gets worse.

A 2016 study from Interaction Metrics concluded that 68 percent of surveys from leading retailers were "total garbage," meaning the surveys yielded no useful information.

The kicker is a 2017 study from Capgemini Consulting revealed that companies that improperly used Net Promoter Score (NPS) surveys saw no difference in customer perception compared to companies that did not track NPS or customer experience data.

The big question is whether it's worth the risk of annoying so many customers if your business is getting zero value out of your surveys.

 

How to Tell if Your Survey Generates Value

Think about the intention behind a customer service survey. This is what a survey plan should look like:

  • Generate meaningful insights
  • Use those insights to take action
  • Measurably improve the business through those actions

You can assess the value by starting at the beginning. Does your survey generate any meaningful insights?

Here are just a few questions it might answer:

  • What makes your customers happy or unhappy?
  • What products, services, or locations are performing the best or worst?
  • What generates the most complaints?

Insight alone isn't enough. You'll need to actually take action. Examples include:

  • Fixing customer pain points
  • Reducing customer service waste (ex: repeat complaints)
  • Strengthening areas where customers are happy

Finally, you'll need to make sure those actions are generating measurable business results in some way. For instance:

  • Can you improve customer retention?
  • Can you serve customers more efficiently?
  • Can you grow revenue through more word-of-mouth advertising?

These are all examples and by no means an exhaustive list. The bottom line is your survey needs to be a conduit to improving the business or else it's a waste of time.

 

Take Action

I've assembled a customer service survey resource page to help you learn more about what makes a great survey. You'll find blog posts and helpful videos.

Take time to evaluate your survey. If it's not driving value you'll have a big decision to make. Should you scrap it or fix it?


How to Find Trends in Your Survey Comments

The customer experience director proudly announced her company had just implemented a customer service survey. "That's great!" I said. "What are you doing with the data?"

There was an awkward silence. Finally, she replied, "Uh, we report the numbers in our regular executive meeting."

That was it. The entire purpose of the survey program was to add another meaningless number to the executive scorecard. The survey was doing nothing to help the company improve customer experience or service.

I dug a little deeper and discovered her survey had no comment section. In other words, customers could rate their experience but they couldn't explain why.

Comments are a critical element: they tell you what your customers are thinking and what you need to do to improve. But having a comment section isn't enough.

You need to know how to analyze those comments. 

Why Survey Comments Matter

Let's take a moment to look at why survey comments matter. 

Imagine you manage a Discount Tire Store in San Diego. As of January 8, 2018, your store has a 4.5 star rating on Google from 83 reviews. (Side note: you can use Google My Business to attract more customers.)

The store's Google rating: 4.5 stars from 83 reviews.

That's great news, but two big questions remain:

  • How did your store earn that rating? (You want to sustain it!)
  • What's preventing even better results? (You want to improve.)

The rating alone doesn't tell you very much. You need to look at the comments people write when they give those ratings to learn more.

The challenge is the comments are freeform. You'll need a way to quickly spot trends.

 

Analyze Survey Comments for Trends

The good news is you can do this by hand. It took me less than 30 minutes to do the analysis I'm going to show you.

Start with a check sheet. This is a piece of paper with a column for each possible rating on the survey. I did this digitally by creating a table in Mac Pages.

A blank check sheet with a column for each star rating.

Next, read each survey comment and try to spot any themes that stand out as the reason the customer gave that rating. Record those themes on your check sheet in the column that matches the star rating for that review.

For example, what themes do you see in this five star review?

A five-star review of the Discount Tire store.

I recorded the following themes on my check sheet:

The check sheet with themes recorded under the five-star column.

Now repeat this for all of the reviews. Look for similar words or phrases that mean the same thing and put a check or star next to each theme that's repeated.

I noted a theme of "fast service" in the review above because the reviewer wrote, "got a full set of Yokohama tires in around an hour." I put a star next to "honest" and "fast service" after I read another review that said, "Discount Tire Store was trustworthy and fast. 4 new tires, in and out the door in an hour."

Once you've completed all of the reviews, tally up the themes that received the most mentions. Here are the top reasons people give a 5 star rating for this Discount Tire store:

  • Fast service: 72%
  • Good prices: 35%
  • Friendly employees: 23%
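If you'd rather not keep the tally by hand, a few lines of code can approximate the check sheet. This is only a sketch: the review snippets and keyword lists below are made up, and simple keyword matching will miss phrasings a human reader would catch.

```python
from collections import Counter

# A few illustrative review excerpts; in practice, paste in the real comments.
reviews = [
    "Got a full set of Yokohama tires in around an hour.",
    "Discount Tire Store was trustworthy and fast. 4 new tires, in and out the door in an hour.",
    "Great prices and friendly staff.",
]

# Keyword lists standing in for the themes on the check sheet.
THEMES = {
    "fast service": ["fast", "quick", "hour", "in and out"],
    "good prices": ["price", "prices", "deal"],
    "friendly employees": ["friendly", "helpful", "courteous"],
    "honest": ["honest", "trustworthy"],
}

counts = Counter()
for review in reviews:
    text = review.lower()
    for theme, words in THEMES.items():
        if any(w in text for w in words):
            counts[theme] += 1  # count each theme at most once per review

for theme, count in counts.most_common():
    print(f"{theme}: {count / len(reviews):.0%} of reviews")
```

The percentages are just the share of reviews that mention each theme, which is the same math as the manual tally above.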

There weren't many bad reviews. The few that had comments mentioned a long wait time, a lack of trustworthiness, or some damage done to the customer's vehicle.

You'll see a larger theme emerge if you look across all the reviews.

Some aggravation usually accompanies a trip to the tire store. Maybe you got a flat tire or perhaps you're trying to squeeze in car service on a very busy day. There's a good chance you're dreading the cost.

When customers are happy, their comments tend to describe some sort of relief. For instance, more than one customer mentioned arriving just before closing and feeling relieved to get great service from helpful and friendly employees.

 

Take Action!

The purpose of this exercise is to take action!

If I managed that Discount Tire store, I'd make sure employees understood they are in the relief business. (Perhaps they do, since their rating is so high!) Relief is one of the top emotions in customer support.

I'd also respond to negative reviews, like this one:

A negative review of the Discount Tire store.

Responding to a negative survey is an opportunity to save the customer. For private surveys, you'll need a non-anonymous survey or a contact opt-in feature to do this.

Many public rating platforms like Google My Business, Yelp, and TripAdvisor allow you to respond to customer reviews. A polite and helpful response can signal other customers that you care about service quality.

And you might save that customer, too. One Discount Tire customer changed his 1 star review to a 5 star review after speaking with the manager who apologized and fixed the issue!

You can watch me do another check sheet in this short video on LinkedIn Learning. (Email subscribers, you'll need to view the blog online to see it. Simply click on the article title at the top of the page.)