How to Improve Training with Level One Feedback

Level one feedback is more commonly known as the survey you take at the end of a training program.

Some trainers derisively call these surveys "smile sheets" because they are often used for nothing more than confirming everyone had a great time. I must admit I haven't always put a lot of stock in them.

But I leaned heavily on level one feedback for a recent project.

My first full-length training video, Customer Service Foundations, launched on Lynda.com in 2014 and has garnered more than 2.4 million views. In late 2017, I was approached by the company and given the opportunity to update the course with a new version.

The revision included a tighter script, new scenes, and re-shooting the entire thing. Many of the revisions I made came directly from level one feedback. (You can see the finished course here.)

Here's what I did and how you can apply the same lessons to your next training project.

[Image: Three participants evaluating a training program, scoring it four, five, and three respectively.]

A Quick Overview of Level Ones

The term "level one" comes from a training evaluation model attributed to Donald Kirkpatrick. It's one of four levels in the model:

  • Level 1 = Reaction
  • Level 2 = Learning
  • Level 3 = Behavior
  • Level 4 = Results

Kirkpatrick defines level one specifically as "the degree to which participants find the training favorable, engaging and relevant to their jobs." You can watch a short primer on the Kirkpatrick model here.

There's not a ton to be gained from level one evaluations in terms of actual learning. I know of plenty of examples where participants had a great time in training, only to go back to work and do absolutely nothing with it.

The real value is in product development. 

If participants like your training programs, find them engaging, and believe they are relevant, they are more likely to tell other people about their favorable experience. That becomes helpful word-of-mouth marketing.

So yes, a level one evaluation is really a customer service survey. 


Search Feedback for Themes

The starting point is to search participant feedback for themes, just like you would a customer service survey. I analyzed comments from thousands of survey responses for this course. (You can read this primer on analyzing survey comments if you aren't sure how.)

Overall, the feedback was very positive. People really liked the course, which helps explain its popularity. Some people did offer constructive feedback, though, and my analysis quickly revealed three clear themes:


Theme #1: The course is too long.
Sample comment: "great information, but, very lengthy and would not show completed in my tasks."

You have to be a bit of a detective when analyzing survey comments. That last part of the comment made me suspect the participant was more interested in getting credit for watching the entire course than they were in learning new skills.

It's good to follow up on surveys and have a conversation with a sample group of participants. You will often get a lot more insight this way.

I talked to a lot of people who were watching my videos and discovered many watched the entire course, all 1 hour and 57 minutes of it, from start to finish. The course is divided into short segments of less than 5 minutes each, yet people just plowed all the way through.

I can see how that would be boring.

Here's a comment from a happy participant who used the training the way it was designed to be used:
"Great to have each segment short, so that you can take a little piece at a time."


Theme #2: Too basic
Sample comment: "If you have worked here or in customer services for any amount of time, 2 hours is an overkill. only took this class as it was mandatory."

The target audience for this course is new and inexperienced professionals. Even the title, Customer Service Foundations, implies this.

Some people, like this commenter, were really upset because they were required to take a course they didn't feel they needed. Other comments revealed people didn't clearly understand that the course focused on the basics and was not intended to cover more advanced skills.

Here's a comment from a happy participant who understood the target audience:
"Highly Recommended for customer service representatives with little to no experience."


Theme #3: Wish there was more detail on ___ topic
Sample comment: "There could be a bit more on serving difficult customers."

This one was a real challenge for two reasons. First, different people wanted more information on different topics. Second, you can only squeeze so much content into one course. I really had to think about this one.

Here is a comment from a happy participant:
"The level of detail and easily relatable material greatly exceeded my expectations."


Notice I compared happy and unhappy participants for each of these themes. This provided some important context that told me, in general, people who didn't like the course were either taking the wrong course or taking the right course the wrong way.
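If you have a large batch of comments, a first pass at this kind of theme analysis can be automated with simple keyword tagging before you read individual comments in depth. Here is a minimal sketch; the theme names and keyword lists are illustrative assumptions, not the actual categories or terms used in the analysis above.

```python
from collections import Counter

# Hypothetical keyword map: adjust themes and keywords to match
# the language your own participants actually use.
THEME_KEYWORDS = {
    "too long": ["long", "lengthy", "drawn out"],
    "too basic": ["basic", "overkill", "already knew"],
    "wants more detail": ["more on", "more detail", "wish"],
}

def tag_themes(comments):
    """Count how many comments match each theme's keywords."""
    counts = Counter()
    for comment in comments:
        text = comment.lower()
        for theme, keywords in THEME_KEYWORDS.items():
            if any(kw in text for kw in keywords):
                counts[theme] += 1
    return counts

comments = [
    "Great information, but very lengthy.",
    "If you have worked in customer service, 2 hours is overkill.",
    "There could be a bit more on serving difficult customers.",
]
print(tag_themes(comments))
```

A pass like this only surfaces candidate themes; you still need to read a sample of the matching comments (and talk to participants) to understand what people actually mean.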


Turn Feedback into Action

It's essential to use participant feedback to improve your course. The challenge is to make improvements without breaking the elements that people really like.

For example, if I added more detail in all the areas people requested (theme #3), the course would be even longer, which probably wouldn't go over well for the people who felt it was already too long (theme #1).

Here's what I did:

Fix #1: Shortened the course

I was able to shorten the new course by 25 percent.

[Image: Run-time comparison of the new vs old customer service training video.]

A few tactics helped me do this:

  • Shortened scripts by getting to the point faster in each segment
  • Eliminated non-essential content
  • Spun off more detailed content into stand-alone courses


Fix #2: Created a how-to video

The new course kicks off with a welcome video and then moves to a short how-to video that explains who should watch the course and how to use it. It also explains that there are other courses available for people who want to take a deeper dive into specific topics.

You can watch the video here.

I also created an "Additional Resources" guide that participants could download which contained resources to explore specific topics in greater detail. The resources included books, blogs, podcasts, and even other training videos.


Fix #3: Created educational campaign

I've also created my own ongoing campaign to educate customer service leaders and employees on the best way to use these videos.

The campaign has included working with individual clients, sharing best practices with people I know are using the videos, and writing blog posts on the topic.


Take Action!

You can gain a lot from those level one training surveys if you think of your training participants as customers. Take a close look at their feedback and use it to make improvements.

To help you get started, I'm offering to review your current level one training evaluation at no cost or obligation. I really just want to help.

Here's how to take advantage:

  1. Email your evaluation form and your current evaluation plan to me at jeff [at] toistersolutions [dot] com.
  2. Send it by Friday, August 10, 2018.

That's it! I will review your evaluation form and plan and send you some feedback within one week.