How to Improve Training with Level One Feedback

Level one feedback is more commonly known as the survey you take at the end of a training program.

Some trainers derisively call these surveys "smile sheets" because they are often used for nothing more than confirming everyone had a great time. I must admit I haven't always put a lot of stock in them.

But I leaned heavily on level one feedback for a recent project.

My first full-length training video, Customer Service Foundations, launched on Lynda.com in 2014 and has garnered more than 2.4 million views. In late 2017, the company approached me with the opportunity to record an updated version of the course.

The revision included a tighter script, new scenes, and a complete re-shoot. Many of the revisions I made came directly from level one feedback. (You can see the finished course here.)

Here's what I did and how you can apply the same lessons to your next training project.

[Image: Three participants evaluating a training program, giving it a four, a five, and a three, respectively.]

A Quick Overview of Level Ones

The term "level one" comes from a training evaluation model attributed to Donald Kirkpatrick. It's one of four levels in the model:

  • Level 1 = Reaction
  • Level 2 = Learning
  • Level 3 = Behavior
  • Level 4 = Results

Kirkpatrick defines level one specifically as "the degree to which participants find the training favorable, engaging and relevant to their jobs." You can watch a short primer on the Kirkpatrick model here.

There's not a ton to be gained from level one evaluations in terms of actual learning. I've seen plenty of examples of participants who had a great time in training, only to go back to work and do absolutely nothing with it.

The real value is in product development. 

If participants like your training programs, find them engaging, and believe they are relevant, they are more likely to tell other people about their favorable experience. That becomes helpful word-of-mouth marketing.

So yes, a level one evaluation is really a customer service survey. 

 

Search Feedback for Themes

The starting point is to search participant feedback for themes, just like you would a customer service survey. I analyzed comments from thousands of survey results from this course. (You can read this primer on analyzing survey comments if you aren't sure how.) 
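
If you're working with a large export of comments, a small script can speed up that first pass. Here's a minimal Python sketch of one way to get a rough count, assuming the comments are a plain list of strings; the theme keywords are made up for illustration and would come from reading a sample of comments first.

```python
from collections import Counter

# Hypothetical keyword lists for candidate themes. Real themes emerge from
# reading a sample of comments first; the script just tallies the rest.
THEMES = {
    "too long": ["long", "lengthy", "too much time"],
    "too basic": ["basic", "overkill", "already knew"],
    "wants more detail": ["more detail", "more on", "wish there was"],
}

def tag_comment(comment: str) -> list[str]:
    """Return every theme whose keywords appear in the comment."""
    text = comment.lower()
    return [theme for theme, keywords in THEMES.items()
            if any(keyword in text for keyword in keywords)]

def count_themes(comments: list[str]) -> Counter:
    """Tally how often each candidate theme shows up across all comments."""
    counts = Counter()
    for comment in comments:
        counts.update(tag_comment(comment))
    return counts

# Made-up example comments:
comments = [
    "Great information, but very lengthy.",
    "Two hours is overkill if you already work in customer service.",
    "There could be a bit more on serving difficult customers.",
]
print(count_themes(comments).most_common())
```

A script like this only gives you a rough ranking. You still need to read the comments behind each theme to understand what participants actually mean.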

Overall, the feedback was very positive. People really liked the course, which helps explain its popularity. Some people did offer constructive feedback, and my analysis quickly revealed three clear themes:


Theme #1: The course is too long.
Sample comment: "great information, but, very lengthy and would not show completed in my tasks."

You have to be a bit of a detective when analyzing survey comments. That last part of the comment made me suspect the participant was more interested in getting credit for watching the entire course than they were in learning new skills.

It's good to follow up on surveys and have a conversation with a sample group of participants. You will often get a lot more insight this way.

I talked to a lot of people who were watching my videos and discovered many watched the entire 1-hour, 57-minute course from start to finish. The course is divided into short segments that are less than 5 minutes each, yet people just plowed all the way through.

I can see how that would be boring.

Here's a comment from a happy participant who used the training the way it was designed to be used:
"Great to have each segment short, so that you can take a little piece at a time."


Theme #2: Too basic
Sample comment: "If you have worked here or in customer services for any amount of time, 2 hours is an overkill. only took this class as it was mandatory."

The target audience for this course is new and inexperienced professionals. Even the title, Customer Service Foundations, implies this.

Some people, like this participant, were really upset because they were mandated to take a course they didn't feel they needed. Other comments revealed people didn't clearly understand that the course focused on the basics and wasn't intended to cover more advanced skills.

Here's a comment from a happy participant who understood the target audience:
"Highly Recommended for customer service representatives with little to no experience."


Theme #3: Wish there was more detail on ___ topic
Sample comment: "There could be a bit more on serving difficult customers."

This one was a real challenge for two reasons. First, different people wanted more detail on different topics. Second, you can only squeeze so much content into one course. I really had to think about this one.

Here is a comment from a happy participant:
"The level of detail and easily relatable material greatly exceeded my expectations."


Notice I compared happy and unhappy participants for each of these themes. This provided some important context that told me, in general, people who didn't like the course were either taking the wrong course or taking the right course the wrong way.

 

Turn Feedback into Action

It's essential to use participant feedback to improve your course. The challenge is to make improvements without breaking the elements that people really like.

For example, if I added more detail in all the areas people requested (theme #3), the course would be even longer, which probably wouldn't go over well for the people who felt it was already too long (theme #1).

Here's what I did:

Fix #1: Shortened the course

I was able to shorten the new course by 25 percent.

[Image: Run-time comparison of the new vs. old customer service training video.]

A few tactics helped me do this:

  • Shortened scripts by getting to the point faster in each segment
  • Eliminated non-essential content
  • Spun off more detailed content into stand-alone courses

 

Fix #2: Created a how-to video

The new course kicks off with a welcome video and then moves to a short how-to video that explains who should watch the course and how to use it. It also explains that there are other courses available for people who want to take a deeper dive into specific topics.

You can watch the video here.

I also created an "Additional Resources" guide that participants could download which contained resources to explore specific topics in greater detail. The resources included books, blogs, podcasts, and even other training videos.

 

Fix #3: Created an educational campaign

I've also created my own ongoing campaign to educate customer service leaders and employees on the best way to use these videos.

The campaign has included working with individual clients, sharing best practices with people I know are using the videos, and writing blog posts.

 

Take Action!

You can gain a lot from those level one training surveys if you think of your training participants as customers. Take a close look at their feedback and use it to make improvements.

To help you get started, I'm offering to review your current level one training evaluation at no cost or obligation. I really just want to help.

Here's how to take advantage:

  1. Email your evaluation form and your current evaluation plan to me at jeff [at] toistersolutions [dot] com.
  2. Send it by Friday, August 10, 2018.

That's it! I will review your evaluation form and plan and send you some feedback within one week.


Five Reasons Why You Should Evaluate Your Training Programs

There's one question I always ask project sponsors who request training. It's a bit of a showstopper because 90 percent of the time my client hasn't thought of the answer.

How will we evaluate the success of this program?

A good answer can drive results. For instance, let's say you want to train employees to better handle customer complaints. 

There are a whole host of questions you would need to ask before doing the training if you wanted to evaluate it:

  • What are customers complaining about?

  • What is a successful complaint resolution?

  • What are employees doing now?

  • What do we want employees to be doing instead?

  • What other factors besides training might influence complaint handling?

These questions can move you from generic training to a targeted intervention that actually reduces complaints and keeps customers happy.

Getting better results is just one reason why you should evaluate your training program. Here are five more.

[Image: Two professionals analyzing the results of a training evaluation report.]

Why You Should Evaluate Training

Reason #1: Learn whether it works. Training is not always effective. One company spent tens of thousands of dollars on leadership training. Participants gave the course high ratings on post-training surveys and some even described it as "life changing." Yet a closer analysis revealed participants were not actually becoming better leaders as a result of the training. Funding for the program was eventually cut because there were no results to justify the cost.

Reason #2: Develop credibility. Customer service representatives were skeptical about a procedure they were being trained to use. They weren't convinced it would work until the trainer shared evaluation data from a pilot class that showed their colleagues had dramatically improved results using the new procedure. This gave the training greater credibility and the participants agreed to try using the new process.

Reason #3: Improve your programs. A client recently hired me to develop a customized customer service training program. We did a pilot session and it received excellent reviews, but our evaluation also identified a number of places where the program could be improved. The result was a much better program once it was introduced to all of my client's employees.

Reason #4: Meet sponsor expectations. The CEO of a small company asked me to conduct training to help customer service reps convert more inquiries into sales. The current conversion rate was 33 percent and the CEO felt employees could achieve 35 percent after the training. A post-training evaluation revealed the conversion rate rose to 45 percent, which made the CEO extremely happy!

Reason #5: Get more funding. A client hired me to conduct customer service training with her staff. They had received numerous complaints and she knew they needed to improve. We were able to demonstrate the training helped significantly reduce complaints and dramatically improve service levels, which allowed my client to get her boss to approve funding for additional training programs.

 

Learn More

Here's a short video that explains more about the importance of evaluating training.

Evaluating training programs requires more than just a short survey at the end of the class. Trainers sometimes call those "smile sheets" because they are really a customer satisfaction survey and not a robust evaluation tool.

The good news is that evaluating training does not have to be overly difficult. It comes down to three steps:

  1. Set clear training goals

  2. Create a measurement plan before training

  3. Execute your plan after training


3 Deadly Evaluation Mistakes That Can Destroy Your Training

It's budgeting season for many companies, which means your training programs may be at risk. 

Many of my clients are looking for cheaper ways to deliver customer service training. They're facing pressure from executives to cut costs, but they don't have hard data to prove their training program is working.

Others are trying to get new funding for expansion, but they're having an equally tough time making their case.

Forget the lofty platitudes like "training is an investment" or "it will help our employees grow." You'll need to back up those statements with some real numbers if you want them to fly in the C-suite.

Here are three deadly evaluation mistakes to avoid if you want to make a solid case.

Mistake #1: No goals

If your training program lacks goals, you're sunk.

It's impossible to evaluate training if you haven't set any goals that provide a target to evaluate your program against. I don't mean fluffy goals like "inspire employees to WOW customers" or some other platitude. Trust me, most executives find these worthless.

I'm talking about concrete goals that are set using the SMART model (Specific, Measurable, Attainable, Relevant, and Time-Bound). Here are some examples:

  • Customer service employees will reduce monthly escalations 15% by 12/31.
  • We will reduce customer churn by 10% by 1/31.
  • The Support Team will improve customer satisfaction 5 points by 2/28.

Setting goals often results in another important activity.

You need to have baseline measurements in place before you set a goal. It's pretty hard to reduce monthly escalations by 15 percent if you don't know how many escalations you have now, or why they're happening. So, the goal-setting process often forces managers to start measuring how their department brings value to the business.

 

Mistake #2: No linkages

Many training programs fail to link the training to the goals. Here's how a typical organization approaches training evaluation.

  1. Survey participants after the class
  2. ???
  3. Customer service improves

That part in the middle is absolutely critical. 

In his book, Telling Training's Story, Robert Brinkerhoff outlines a simple method called a Training Impact Model for making that critical connection. You do it by working backwards from business goals to the training itself.

  1. Establish business goals (see Mistake #1).
  2. Determine results needed from employees to achieve the goals.
  3. List actions needed to accomplish desired results.
  4. Identify knowledge and skills needed for those actions.

Here's an example for reducing escalations:

  • Goal: Reduce monthly escalations 15% by 12/31
  • Results: Resolve issues to customers' satisfaction without escalation
  • Actions: Apply the LAURA technique
  • Knowledge & Skills: Active listening, expressing empathy

So, my training in this case should focus on developing active listening skills and empathy. I'll want to set clear learning objectives using the A-B-C-D model so I can easily evaluate whether training participants have actually learned the right skills.

I'll also want to develop a workshop plan to make sure employees aren't considered fully trained until I can observe them using the LAURA technique on the job.
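
If it helps to see that mapping written down, here's a minimal sketch of the escalation example as a simple data structure. The class and field names are my own and aren't part of Brinkerhoff's model; the point is that everything on the training agenda should trace back to the business goal.

```python
from dataclasses import dataclass, field

# A lightweight record of the backwards mapping for one training program.
@dataclass
class TrainingImpactModel:
    business_goal: str
    employee_results: list[str] = field(default_factory=list)
    critical_actions: list[str] = field(default_factory=list)
    knowledge_and_skills: list[str] = field(default_factory=list)

escalation_program = TrainingImpactModel(
    business_goal="Reduce monthly escalations 15% by 12/31",
    employee_results=["Resolve issues to customers' satisfaction without escalation"],
    critical_actions=["Apply the LAURA technique"],
    knowledge_and_skills=["Active listening", "Expressing empathy"],
)

# The learning objectives should come straight from the last list. Anything
# that doesn't trace back to the goal is a candidate to cut.
print(escalation_program.knowledge_and_skills)
```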

 

Mistake #3: No financials

You'd better have some numbers if you're going to a budget meeting.

Many trainers are uncomfortable working with financials, so they avoid them. Or worse, they spout bogus metrics like telling people that ROI equals "Return on Inspiration." (Sadly, that's a true story.) 

Your CFO will laugh at you if you refer to ROI as Return on Inspiration.

You'll need to come correct with some real financial figures instead. Fortunately, this isn't too difficult if you've established clear goals that are linked to business results.

Let's go back to the escalation goal we've used as an example. The sample goal was to reduce monthly escalations 15% by 12/31.

Connecting those escalations to financial results should be easy. First, calculate the average cost of an escalation. There are a few places you might look:

Revenue: Look at how much your average customer spends (per order, per year, etc.) and compare that to how much customers with escalations spend after they have an issue that's been escalated. The escalation customers almost certainly spend less. Just for fun, let's say it's $100 less per customer, per year.

Cost: Calculate the average servicing cost of an escalation. For instance, if the average escalated call takes 15 minutes and is handled by someone making $20 per hour, then each escalation costs $5.

Projected Savings: Now, determine how much more money customers would spend and how much money you'd save with 15 percent fewer escalations. Prepare a nice report (showing your work) and share it with key stakeholders like your CFO.

The summary might look like this:

  • Each escalation costs $105 ($100 in lost revenue, $5 servicing cost)
  • A 15% reduction in escalations would equal 180 fewer escalations per year (based on 100 escalations per month).
  • $105 x 180 = $18,900 projected annual savings
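
Here's the same math as a quick sketch, using the assumed figures from the example, so you can plug in your own numbers.

```python
# Projected savings from a 15% reduction in escalations.
# All inputs are the assumed figures from the example above.
lost_revenue_per_escalation = 100.00   # escalation customers spend $100 less per year
handling_minutes = 15                  # average length of an escalated call
hourly_wage = 20.00                    # cost of the person handling the call
servicing_cost = hourly_wage * handling_minutes / 60    # $5.00 per escalation

cost_per_escalation = lost_revenue_per_escalation + servicing_cost   # $105.00

monthly_escalations = 100
annual_escalations = monthly_escalations * 12
avoided_escalations = annual_escalations * 0.15          # 180 fewer per year

projected_annual_savings = cost_per_escalation * avoided_escalations
print(f"Projected annual savings: ${projected_annual_savings:,.0f}")  # $18,900
```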

 

Learn More

This short video provides five reasons why you should measure your training programs. 

It's part of the How to Measure Learning Effectiveness Course on lynda.com and LinkedIn Learning. You'll need a lynda.com or LinkedIn Premium subscription to view the course, but you can get a 10-day trial account on lynda.com.