Seven Keys to Unlock the Four Levels of Kirkpatrick Evaluation (Performance Improvement, 2006)

Seven Keys to Unlock the Four Levels of Evaluation

by Donald L. Kirkpatrick






First Key: Analyze Your Resources

To do this, you must answer the following questions:

• Does your job consist of only one function (evaluating training programs), or does it also include other, perhaps more important, duties and responsibilities such as planning the curriculum and teaching?

• How large a staff do you have for evaluation?

• How much of your budget can you spend on evaluating programs?

• How much help and cooperation can you get from other departments such as Human Resources or sales if you are evaluating sales training programs?

• How much support and help can you get from line managers if you are training their subordinates in programs such as Leadership Development for Supervisors?




Second Key: Involve Your Managers

If you are going to be effective in evaluating programs, you need to have your managers’ encouragement and support.


1. Ask for their input in deciding on subject content. George Odiorne, in one of his books, made the following statement: “If you want people to support your decisions, give them a feeling of ownership.”


2. Get your managers to establish an encouraging climate regarding the program.


3. Ask for their help in evaluating the program. Levels 3 (Behavior) and 4 (Results) require this help. You can evaluate Levels 1 (Reaction) and 2 (Learning) without involving managers because you have control over these two levels. But Levels 3 and 4 are typically influenced by factors beyond your control.




Third Key: Start at Level 1 (Reaction) and Continue Through Levels 2, 3, and 4 as Resources Permit


Some trainers or HPT professionals refer to Level 1 as “happiness ratings” or “smile sheets,” and I agree! That’s exactly what they are. They measure the reaction of the participants to the program. But those trainers also claim that these evaluations are not of much value. I disagree.


In business, industry, and government, the situation is slightly different: participants may not pay for the program, and the program’s existence doesn’t depend on their attendance. But you can be sure that they will be telling somebody, perhaps even their boss, whether they thought the program was worthwhile.


And when the reactions are positive, the chances of learning are improved.



Fourth Key: Evaluate Reaction

Here are the guidelines for evaluating Reaction:

1. Decide what you want to find out. Make a list of items to which you want the reaction of the participants (e.g., subject content, leader’s effectiveness, schedule, audiovisual aids, handouts, case studies, facilities, meals).

2. Design a form that will quantify reaction. The most common form consists of a five-point scale: either Excellent, Very Good, Good, Fair, and Poor; or Strongly Agree, Agree, Neutral, Disagree, and Strongly Disagree. The objective is to get as much information as possible in the shortest period of time. Participants are not eager to spend time writing the answers to questions.

3. Provide the opportunity for written comments. End your reaction sheet with the question, “What would have improved the program?”

4. Get 100% immediate response. When the participants leave the program, have them put their completed reaction sheets on the back table. Do not tell them to fill out the form and send it back later, and do not ask them to email their reactions. If you do either of these, you will not get enough responses to represent the entire group.

5. Be sure you get “honest” answers. Tell participants you want their honest answers and do not ask them to sign the form.

6. Establish an acceptable standard for their combined reaction, and tabulate the forms to see whether you achieved or exceeded that standard (a tallying sketch follows this list).
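As a rough illustration of guideline 6, here is a minimal Python sketch that converts the five-point ratings to numbers and checks the group average against an acceptable standard. The participant ratings and the 4.0 standard are hypothetical assumptions for illustration, not figures from the article.

```python
# Minimal sketch (hypothetical ratings and standard, not from the article):
# tally five-point reaction ratings and compare the average to a standard.

SCALE = {"Excellent": 5, "Very Good": 4, "Good": 3, "Fair": 2, "Poor": 1}
ACCEPTABLE_STANDARD = 4.0  # assumed standard; decide yours before the program runs

def average_reaction(ratings):
    """Convert verbal ratings to numbers and return their mean."""
    scores = [SCALE[r] for r in ratings]
    return sum(scores) / len(scores)

# Example: ratings collected from the sheets left on the back table.
ratings = ["Excellent", "Very Good", "Good", "Very Good", "Excellent"]
mean = average_reaction(ratings)
print(f"Average reaction: {mean:.2f}")
print("Standard met" if mean >= ACCEPTABLE_STANDARD else "Standard not met")
```

If you use the Strongly Agree to Strongly Disagree wording instead, only the scale mapping changes.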



Fifth Key: Evaluate Learning

Here are the guidelines for evaluating Learning:

1. Measure before and after knowledge, skills, and attitudes.

2. Use a form the participants can complete for evaluating knowledge and attitude change.

3. Use a performance test for evaluating skills.

4. Get 100% response.

5. For knowledge and attitudes, design a test that measures what you want them to know and the attitudes you want them to have at the end of the program. 

6. A question that usually arises about the pretest and posttest is whether the same form can be used or whether a “Form A” and “Form B” should be developed. There are too many problems when you try to develop a “Form A” and “Form B” that cover the same knowledge and attitudes, so use the same form (a before-and-after scoring sketch follows this list).
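A minimal sketch of the before-and-after measurement, assuming the same test form is scored on a 100-point basis. The participant names and scores below are hypothetical, not data from the article.

```python
# Minimal sketch (hypothetical scores, not from the article): score the same
# test given before and after the program and report the learning gain.

pretest = {"participant_1": 55, "participant_2": 70, "participant_3": 62}
posttest = {"participant_1": 80, "participant_2": 85, "participant_3": 78}

# Gain per participant, then the group average.
gains = {name: posttest[name] - pretest[name] for name in pretest}
for name, gain in gains.items():
    print(f"{name}: {pretest[name]} -> {posttest[name]} (gain {gain:+d})")

average_gain = sum(gains.values()) / len(gains)
print(f"Average gain: {average_gain:.1f} points")
```

The same before-and-after comparison applies to attitude scores; skills are better checked with a performance test, as guideline 3 notes.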



Sixth Key: Evaluate Behavior

Here are the guidelines for evaluating Behavior:

1. Measure on a before-and-after basis if practical. If this is not practical, the alternative is to measure after the program and ask, “What changes in behavior have occurred since you attended the program?”

2. Allow time for behavior change to take place. Change may occur 3 months or 6 months after the program, or maybe never. The best compromise seems to be to measure 3 months after the program.

3. Use a patterned interview or written survey asking the same questions of all respondents. One important question to include is, “Do you plan to change your behavior in the future?”

4. Decide who will be polled. For example, the following options are possible:

• The participants

• The bosses of the participants

• The subordinates of the participants

5. Because some participants will not yet have changed their behavior but will have answered positively the question, “Do you plan to change your behavior in the future?”, repeat the research after 3 more months (a follow-up tallying sketch follows this list).
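A minimal sketch of how the 3-month patterned-interview answers might be tallied. The survey fields and respondents are hypothetical assumptions, not from the article; the sketch simply lists who has already changed behavior and who should be re-surveyed after 3 more months.

```python
# Minimal sketch (hypothetical survey fields, not from the article): tally
# 3-month interview answers and flag who to re-survey in 3 more months.

responses = [
    {"name": "participant_1", "changed_behavior": True,  "plans_to_change": True},
    {"name": "participant_2", "changed_behavior": False, "plans_to_change": True},
    {"name": "participant_3", "changed_behavior": False, "plans_to_change": False},
]

changed = [r["name"] for r in responses if r["changed_behavior"]]
follow_up = [r["name"] for r in responses
             if not r["changed_behavior"] and r["plans_to_change"]]

print(f"Changed behavior at 3 months: {changed}")
print(f"Re-survey after 3 more months: {follow_up}")
```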



Seventh Key: Evaluate Results

Here are the guidelines for evaluating Results:

1. Measure on a before-and-after basis.

2. Allow time for results to develop—perhaps 6 months or a year.

3. Repeat at appropriate times.

4. Use a control group if practical. A “control” group consists of individuals who did not attend the program; the “experimental” group consists of the participants (a before-and-after comparison sketch follows this list).
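A minimal sketch of the control-group comparison, assuming a simple before-and-after results measure such as units sold per month. All figures are hypothetical, not from the article; the point is only that the program's contribution is estimated from the difference between the two groups' changes.

```python
# Minimal sketch (hypothetical figures, not from the article): compare the
# before-and-after change of the experimental group (attended the program)
# with that of the control group (did not attend).

experimental = {"before": 100, "after": 130}  # e.g., units sold per month
control = {"before": 100, "after": 110}

exp_change = experimental["after"] - experimental["before"]
ctl_change = control["after"] - control["before"]

print(f"Experimental group change: {exp_change:+d}")
print(f"Control group change:      {ctl_change:+d}")
print(f"Rough estimate of change attributable to the program: {exp_change - ctl_change:+d}")
```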








Reference: Kirkpatrick, D. L. (2006). Seven keys to unlock the four levels of evaluation. Performance Improvement. Article first published online: 10 August 2006. DOI: 10.1002/pfi.2006.4930450702. © 2006 International Society for Performance Improvement.
