How do you evaluate customer service training?
Most trainers don't measure it beyond tracking attendance and giving people a smile survey at the end of every class. Employees attend training, then go back to work without any proof they learned something useful.
Some trainers use gut instinct.
For instance, a learner who asks fewer questions or displays greater confidence as the training goes along is considered trained. Never mind that some people who are completely incapable of doing the job ask no questions and display amazing amounts of confidence.
Still other trainers use quizzes. Participants are given tests to assess their knowledge of the content. The thinking is that a good quiz score indicates the person can do the job.
But can they?
I recently discussed the use of quizzes with Alexander Salas, Chief of Awesomeness at eLearning Launch, an online academy for instructional designers. We discussed the reasons why quizzes are often a poor way to measure training, and what trainers can do instead.
Here are a few questions Salas addressed in our conversation:
Why are quizzes a poor way to measure training?
What should we do instead of quizzing learners?
Why should companies avoid corporate universities?
What metrics should I use to evaluate training?
How do I justify my training programs to executives?
You can watch the full interview or read some of the highlights below.
Why are quizzes a poor way to measure training?
"In terms of workplace learning, you have to ask yourself why you are asking people to take a quiz," said Salas.
The goal of training is rarely for people to acquire and retain information. We usually give them information so they can use it to do a better job. That's where quizzes fall short—they don't show us whether an employee can do better work as a result of training.
Salas also discussed the challenge of retaining information learned in training. A quiz might assess a learner's knowledge today, but two weeks later they may have forgotten much of what they learned if it wasn't reinforced on the job in some way.
You can hear more on this topic at the 1:17 mark in the interview.
What should we be doing instead of quizzes to evaluate training?
Training should be evaluated by having participants demonstrate the performance you expect to see on the job. This can be done in a controlled training setting, or through on-the-job observations after training.
"Ideally, what you want to do is understand your purpose," said Salas. "What is your scope? Where are you ending?"
Salas argues that a quiz makes sense if your end goal is for people to have knowledge. Unless you're in academia, that's rarely the objective in the workplace.
In most cases, you really want people to be able to do something with that information. For example, if you want people to build better rapport with customers, then being able to identify rapport-building techniques on a multiple choice quiz is not enough.
Your evaluation plan should include participants demonstrating that they are able to build rapport, either in an in-class simulation or with real customers.
This evaluation process starts when you first design a training program.
Decide what a fully trained person looks like, and then work backwards to create a program to get people to that goal. That picture of a fully trained person should describe what the person should be able to demonstrate after the training is complete. (Here's a guide to help you do that.)
Go to 2:27 in the interview to hear more.
Why should companies avoid corporate universities?
Salas argues that too many companies mirror academia when they set up a training function. "What is school called in the business world? Training."
Training departments are often called corporate universities. Content is organized in a series of classes. Classes are often grouped into "certificate" programs to reward participants for completing a certain amount of content. Quizzes are used to assess learning, just like in school.
I once ran a corporate university that was set up this way. Being a results person, I studied whether taking a certain number of classes correlated with better job performance. There turned out to be no correlation at all.
Some people who attended every class were indeed successful, while others who attended every class were mediocre performers. There were other employees who never attended a class, yet were objectively high performers.
This insight caused me to scrap the corporate university approach.
What we did instead was focus on helping employees improve their job performance. We assessed employee skills gaps at an individual level, and created customized plans to help people grow.
Salas shares more at the 11:20 mark.
What metrics can we use to evaluate training?
"There's an evolution that you want professionals to go through," said Salas. "If they're beginners, and they're in customer service training programs, you want them to perform at a specific standard."
For example, if an employee is expected to respond to emails with a certain level of quality, there should be a clear standard that defines what a quality email looks like. Once that's defined, the employee's training should be evaluated by whether or not they can demonstrate the ability to write emails according to the quality standard.
More veteran employees might be evaluated a little differently. According to Salas, "You want them to progress to a level where they start creating their own improvements to the workflow, improvements to the way they do their work."
Using the same email example, you might evaluate an employee's learning by their contributions to updating or writing knowledge base articles that can help the entire team work faster and more accurately.
Go to 14:47 in the interview to hear more.
How do I justify my training programs to executives?
Salas suggests the process starts up front, when executives request training. "The question that you pose, when you get that request, is 'What do you want out of the training? Do you want performance, or do you want knowledge?'"
Trainers can then tailor the training program and evaluation strategy to meet the executive sponsor's expectations.
We talk more about this at 17:39 in the interview.
Additional Resources
Salas provides elearning consulting at Style Learn and runs an online academy for instructional designers at eLearning Launch.
He's also a good person to follow on LinkedIn for content and thought-provoking questions about training and instructional design.