How We Designed Call-Center Training in 3 Days

A US company that provides HR services to other corporations decided to outsource its call center operations to India. They hired a training vendor to teach the basics of US corporate benefits to the Indian employees. This group spent a significant amount of time designing a comprehensive training package featuring hundreds of PowerPoint slides that focused on the content. When they delivered the training, they discovered that the participants were not learning. The Training Director at the HR outsourcing company concluded that the lack of learning was due to mysterious cultural factors and hired me (since, as an Indian, I could figure out what went wrong) to revise the training package.

A Proposal

After a cursory examination of the hefty training package, I agreed with my client that Indians would have difficulty learning from it. I went further and suggested that human beings would have difficulty learning from it, since the package was obviously designed for some form of alien intelligence. Rather than revising the existing package, I suggested that it would be faster, cheaper, and better to design the training from scratch.

First Steps

We worked out metrics for defining successful training: Participants would answer typical customer questions related to corporate benefits clearly and accurately so that the number of repeat calls and errors would be significantly reduced.

I then asked the subject-matter experts to prepare a short introductory booklet on the topic around which I could build the training session. I let the SMEs fight it out among themselves and waited until they provided me with a crisp booklet. I did some minimal revision to make the booklet more readable.

I asked my client to send copies of the booklet to a pilot group of typical participants in India (including their trainers) and request them to read the booklet and become familiar with the content. I also had the local managers warn these participants that there would be a workshop related to the content of the booklet in a week’s time. This workshop would not present the content all over again, but would instead consist of a series of activities, quizzes, and performance tests requiring recall and application of the information.

Based on the content of the booklet and the training objectives, I selected a set of potentially useful textra games (games that require and reward participants’ interaction with text materials).

Open Book

The pilot was conducted with the help of the client’s video conferencing center with its sophisticated technology. The session lasted for 4 hours with me in the US and the 17 participants in India. The entire session was automatically recorded on videotape.

I started with a textra activity called Open Book: I asked each participant to work individually, spend 10 minutes going through the booklet again, and write 10 recall questions. Each question was written on one side of a card and the answer on the back. This task provided a face-saving opportunity to those who had not taken their reading assignment seriously. I instructed the participants to distribute the 10 questions across different parts of the booklet.

After 10 minutes, I divided the group into four teams of four participants. I appointed the 17th participant who got left out as the Game Warden to help me facilitate the training session. I asked each team to pool all their questions, remove duplicates, and come up with a final set of 10 questions. The Game Warden collected the selected questions from the four teams, shuffled them, read one question, and selected a participant from one of the teams. If this person independently gave the correct answer, the team scored two points. If this participant consulted with the team and then gave the correct answer, the team received one point. If the answer was incorrect, the team lost a point. Using this simple approach, the Game Warden conducted a quiz contest distributing the questions equally among the teams. I observed the activity and made some on-the-spot changes to improve its instructional and motivational effectiveness.
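The scoring rules above are simple enough to sketch in code. This is a minimal illustration with names of my own choosing, not anything from the original session:

```python
def score_answer(correct: bool, consulted_team: bool) -> int:
    """Points a team earns for one quiz question, per the Open Book rules:
    2 for a correct independent answer, 1 for a correct answer after
    consulting the team, -1 for an incorrect answer."""
    if not correct:
        return -1
    if consulted_team:
        return 1
    return 2

# A team's running total over three questions:
# (correct, consulted_team) for each question asked of that team
answers = [(True, False), (True, True), (False, False)]
total = sum(score_answer(c, h) for c, h in answers)
# total = 2 + 1 - 1 = 2
```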

Confused

During the Open Book game, I noticed that participants avoided asking questions from certain sections of the booklet. A quick inspection of the booklet revealed that these sections were difficult to understand. So I improvised a game called Confused. Once again, I asked each participant to independently write questions. But this time each participant wrote two questions, not for use in a quiz contest but for clarification from an SME. While the participants were busy, I rounded up the resident SME. After a suitable pause, I asked my Game Warden to collect the questions and mix them up. She then read one question at a time, skipping duplicates. The SME responded to each question. I asked participants to listen carefully and to take notes because there would be a follow-up activity. After about 15 minutes of this question-and-answer session, I asked each participant to write on a card one important principle from the SME’s answers that provided useful clarification. Participants exchanged the cards with each other several times (without reading what was on the card). I then asked the Game Warden to randomly select a few participants and have them read what was on their cards.

(Fast forward: After the session, we transcribed the questions and the SME’s responses. We edited them slightly and included them as an appendix to the next version of the booklet. However, we continue to play Confused with subsequent groups because it seems to provide reinforcement and ownership to the participants.)

Best Answers

We continued conducting the training session, using various other textra activities that became increasingly job related. Throughout the session we permitted participants to refer to the booklet whenever they wanted to. After all, our goal was efficient retrieval of information rather than memorizing the content.

Here’s how another textra game called Best Answers worked: I asked an open-ended question (similar to the type of question a customer might ask). Each participant wrote an appropriate answer on a card. The answers from each team were collected and given to the next team. (The answers from the last team were given to the first team.) Each team then reviewed the four different answers and selected the best one based on such factors as accuracy, relevance, clarity, and brevity. Teams read the selected answers and identified their authors. Finally, we polled the participants to select the best of the four best answers.
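The answer-passing step is a simple rotation: each team’s stack of answer cards moves to the next team, and the last team’s stack wraps around to the first. A minimal sketch (the function name and data are illustrative, not from the original session):

```python
def rotate_answers(team_stacks):
    """Pass each team's stack of answer cards to the next team;
    the last team's stack wraps around to the first team."""
    return [team_stacks[-1]] + team_stacks[:-1]

# Stacks of answer cards written by teams 1 through 4:
stacks = ["team1_cards", "team2_cards", "team3_cards", "team4_cards"]
received = rotate_answers(stacks)
# Team 1 now reviews team 4's cards, team 2 reviews team 1's, and so on.
```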

Follow Up

After the pilot, we quickly reviewed the videotape recording, made a few revisions to the activities, and assembled instructions for conducting various activities. We sent this facilitator’s guide to the two trainers who participated in the pilot and discussed it on a conference call. These trainers facilitated subsequent training sessions. Evaluation data at different levels provided evidence of the effectiveness of the training package. The ultimate value of the package was demonstrated when we offered to design an intermediate level training session. Our counterparts in India said, “Don’t bother. Just send us a booklet with the content and we will use the review games we have already learned.”

