
  • Number of slides: 41

Evaluating HRD Programs
Chapter 7
Werner & DeSimone (2006)

Learning Objectives
• Define evaluation and explain its role/purpose in HRD.
• Compare different models of evaluation.
• Discuss the various methods of data collection for HRD evaluation.
• Explain the role of research design in HRD evaluation.
• Describe the ethical issues involved in conducting HRD evaluation.
• Identify and explain the choices available for translating evaluation results into dollar terms.

Effectiveness
• The degree to which a training program (or other HRD program) achieves its intended purpose.
• Measures are relative to some starting point.
• Measures how well the desired goal is achieved.

Evaluation

HRD Evaluation
• HRD evaluation is “the systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.”

In Other Words…
Are we training:
• the right people
• the right “stuff”
• the right way
• with the right materials
• at the right time?

Evaluation Needs
• Descriptive and judgmental information is needed (objective and subjective data).
• Information is gathered according to a plan and in a desired format.
• Information is gathered to provide decision-making information.

Purposes of Evaluation
• Determine whether the program is meeting the intended objectives
• Identify strengths and weaknesses
• Determine the cost-benefit ratio
• Identify who benefited most or least
• Determine future participants
• Provide information for improving HRD programs

Purposes of Evaluation – 2
• Reinforce major points to be made
• Gather marketing information
• Determine whether the training program is appropriate
• Establish a management database

Evaluation Bottom Line
• Is HRD a revenue contributor or a revenue user?
• Is HRD credible to line and upper-level managers?
• Are the benefits of HRD readily evident to all?

How Often Are HRD Evaluations Conducted?
• Not often enough!
• Frequently, only end-of-course participant reactions are collected.
• Transfer to the workplace is evaluated less frequently.

Why HRD Evaluations Are Rare
• Reluctance to have HRD programs evaluated
• Evaluation requires expertise and resources
• Factors other than HRD can cause performance improvements, e.g., the economy, equipment, policies, etc.

Need for HRD Evaluation
• Shows the value of HRD
• Provides metrics for HRD efficiency
• Demonstrates a value-added approach for HRD
• Demonstrates accountability for HRD activities

Make or Buy Evaluation
• “I bought it, therefore it is good.”
• “Since it’s good, I don’t need to posttest.”
• Who says it’s appropriate? Effective? Timely? Transferable to the workplace?

Models and Frameworks of Evaluation
• Table 7-1 lists six frameworks for evaluation.
• The most popular is that of D. Kirkpatrick: Reaction, Learning, Job Behavior, Results.

Kirkpatrick’s Four Levels
• Reaction: focus on trainees’ reactions
• Learning: did they learn what they were supposed to?
• Job Behavior: was it used on the job?
• Results: did it improve the organization’s effectiveness?
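
As an illustration (not from the text), each level can be paired with an example evaluation question and possible data sources; the pairings in this minimal sketch are assumptions:

```python
# Illustrative pairing of Kirkpatrick's levels with example evaluation
# questions and data sources; the specific measures are assumptions.
KIRKPATRICK_LEVELS = {
    "Reaction":     ("Did trainees find the program useful and engaging?",
                     ["end-of-course reaction questionnaire"]),
    "Learning":     ("Did trainees acquire the intended knowledge and skills?",
                     ["pretest/posttest scores", "performance tests"]),
    "Job Behavior": ("Are trainees using what they learned on the job?",
                     ["supervisor observation", "behavior checklists"]),
    "Results":      ("Did organizational effectiveness improve?",
                     ["quality rates", "scrap/rework", "productivity"]),
}

for level, (question, sources) in KIRKPATRICK_LEVELS.items():
    print(f"{level}: {question} (e.g., {', '.join(sources)})")
```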

Issues Concerning Kirkpatrick’s Framework
• Most organizations don’t evaluate at all four levels.
• The framework focuses only on post-training evaluation.
• It doesn’t treat inter-stage improvements.
• WHAT ARE YOUR THOUGHTS?

Data Collection for HRD Evaluation
Possible methods:
• Interviews
• Questionnaires
• Direct observation
• Written tests
• Simulation/performance tests
• Archival performance information

Interviews
Advantages: flexible; opportunity for clarification; depth possible; personal contact.
Limitations: high reactive effects; high cost; face-to-face threat potential; labor intensive; trained interviewers needed.

Questionnaires
Advantages: low cost to administer; honesty increased; anonymity possible; respondent sets the pace; variety of options.
Limitations: possible inaccurate data; response conditions not controlled; respondents set varying paces; uncontrolled return rate.

Direct Observation
Advantages: nonthreatening; excellent way to measure behavior change.
Limitations: possibly disruptive; reactive effects are possible; may be unreliable; trained observers needed.

Written Tests
Advantages: low purchase cost; readily scored; quickly processed; easily administered; wide sampling possible.
Limitations: may be threatening; possibly no relation to job performance; measures only cognitive learning; relies on norms; concern for racial/ethnic bias.

Simulation/Performance Tests
Advantages: reliable; objective; close relation to job performance; includes cognitive, psychomotor, and affective domains.
Limitations: time consuming; simulations often difficult to create; high cost to develop and use.

Archival Performance Data
Advantages: reliable; objective; job-based; easy to review; minimal reactive effects.
Limitations: criteria for keeping/discarding records; information system discrepancies; indirect; not always usable; records prepared for other purposes.

Choosing Data Collection Methods
• Reliability: consistency of results, and freedom from collection-method bias and error.
• Validity: does the device measure what we want to measure?
• Practicality: does it make sense in terms of the resources used to get the data?
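
To make “reliability” concrete, test-retest consistency can be estimated as the correlation between two administrations of the same instrument. The sketch below uses made-up scores and is only an illustration, not a procedure from the chapter:

```python
# Minimal sketch: test-retest reliability estimated as the Pearson correlation
# between two administrations of the same instrument (made-up example scores).

def pearson_r(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

first_administration = [72, 85, 90, 64, 78, 88]
second_administration = [70, 83, 92, 61, 80, 85]

print(f"Test-retest reliability: {pearson_r(first_administration, second_administration):.2f}")
```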

Types of Data Used/Needed
• Individual performance
• Systemwide performance
• Economic

Individual Performance Data
• Individual knowledge
• Individual behaviors
• Examples: test scores; performance quantity, quality, and timeliness; attendance records; attitudes

Systemwide Performance Data
• Productivity
• Scrap/rework rates
• Customer satisfaction levels
• On-time performance levels
• Quality rates and improvement rates

Economic Data
• Profits
• Product liability claims
• Avoidance of penalties
• Market share
• Competitive position
• Return on investment (ROI)
• Financial utility calculations

Use of Self-Report Data
• Most common method
• Pre-training and post-training data
• Problems:
  - Mono-method bias (desire to be consistent between tests)
  - Socially desirable responses
  - Response shift bias (trainees adjust their expectations to the training)

Research Design
Specifies in advance:
• the expected results of the study
• the methods of data collection to be used
• how the data will be analyzed
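
One design often used for this purpose is the pretest-posttest control-group design (referenced again a few slides below). As a rough, hypothetical illustration with invented scores, the training effect can be estimated as the average gain of the trained group minus the average gain of an untrained control group:

```python
# Hypothetical pretest-posttest control-group analysis: the training effect is
# the mean gain of the trained group minus the mean gain of the control group.

def mean_gain(pre, post):
    return sum(after - before for before, after in zip(pre, post)) / len(pre)

trained_pre, trained_post = [60, 55, 70, 65], [78, 72, 85, 80]
control_pre, control_post = [62, 58, 68, 66], [64, 60, 70, 67]

effect = mean_gain(trained_pre, trained_post) - mean_gain(control_pre, control_post)
print(f"Estimated training effect (difference in gains): {effect:.1f} points")
```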

Assessing the Impact of HRD
• Money is the language of business. You MUST talk dollars, not HRD jargon.
• No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest, posttest control group data.”

HRD Program Assessment
• HRD programs and training are investments.
• Line managers often see HR and HRD as costs, i.e., revenue users, not revenue producers.
• You must prove your worth to the organization, or you’ll have to find another organization…

Two Basic Methods for Assessing Financial Impact
• Evaluation of training costs
• Utility analysis
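
Utility analysis is commonly expressed with a formula of the form ΔU = N × T × d_t × SD_y − C (dollar gain = number trained × duration of the effect × effect size × dollar value of one SD of job performance − training costs). The sketch below shows that arithmetic with hypothetical values; the numbers are not from the text:

```python
# Utility analysis sketch: delta_U = N * T * d_t * SD_y - C (hypothetical values).

def training_utility(n_trained, years_effect, effect_size, sd_dollars, total_cost):
    """Estimated dollar gain from a training program.

    n_trained    -- number of employees trained (N)
    years_effect -- expected duration of the training effect in years (T)
    effect_size  -- trained vs. untrained difference in SD units (d_t)
    sd_dollars   -- dollar value of one SD of job performance (SD_y)
    total_cost   -- total cost of training all participants (C)
    """
    return n_trained * years_effect * effect_size * sd_dollars - total_cost

print(training_utility(n_trained=50, years_effect=2, effect_size=0.5,
                       sd_dollars=10_000, total_cost=150_000))  # -> 350000.0
```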

Evaluation of Training Costs
• Cost-benefit analysis: compares the cost of training to benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
• Cost-effectiveness analysis: focuses on increases in quality, reduction in scrap/rework, productivity, etc.
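
Either analysis starts from a total training cost figure. As a hypothetical illustration, the categories below follow a common cost breakdown and the dollar amounts are invented (chosen only so the total matches the $32,564 training cost used in the ROI example that follows):

```python
# Hypothetical training cost tally; categories follow a common breakdown and
# the figures are invented to sum to the $32,564 used in the ROI example.
training_costs = {
    "direct": 9_000,                     # instructor fees, materials, travel, facilities
    "indirect": 2_500,                   # administrative and clerical support
    "development": 12_000,               # design and piloting of the program
    "overhead": 3_000,                   # share of general HRD department costs
    "participant_compensation": 6_064,   # trainee salaries while in training
}

total_cost = sum(training_costs.values())
print(f"Total training cost: ${total_cost:,}")  # -> $32,564
```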

Return on Investment
Return on investment = Results / Costs

Calculating Training Return on Investment

Operational area: Quality of panels
  How measured: percent of panels rejected
  Results before training: 2% rejected (1,440 panels per day)
  Results after training: 1.5% rejected (1,080 panels per day)
  Difference (+ or –): 0.5% (360 panels)
  Expressed in $: $720 per day; $172,800 per year

Operational area: Housekeeping
  How measured: visual inspection using a 20-item checklist
  Results before training: 10 defects (average)
  Results after training: 2 defects (average)
  Difference (+ or –): 8 defects
  Expressed in $: not measurable in $

Operational area: Preventable accidents
  How measured: number of accidents; direct cost of the accidents
  Results before training: 24 per year ($144,000 per year)
  Results after training: 16 per year ($96,000 per year)
  Difference (+ or –): 8 per year
  Expressed in $: $48,000 per year

Total savings: $220,800

ROI = Return / Investment = Operational results / Training costs = $220,800 / $32,564 = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
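
A quick check of the arithmetic in the example above, using the dollar figures shown:

```python
# Recomputing the ROI from the Robinson & Robinson example.
quality_savings = 172_800    # quality-of-panels savings per year
accident_savings = 48_000    # preventable-accident savings per year
training_costs = 32_564      # total training costs

operational_results = quality_savings + accident_savings   # $220,800
roi = operational_results / training_costs
print(f"Operational results: ${operational_results:,}")
print(f"ROI = {roi:.1f}")    # -> 6.8
```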

Measuring Benefits
• Change in quality per unit, measured in dollars
• Reduction in scrap/rework, measured in the dollar cost of labor and materials
• Reduction in preventable accidents, measured in dollars
• ROI = Benefits / Training costs

Ways to Improve HRD Assessment
• Walk the walk, talk the talk: MONEY
• Involve HRD in strategic planning
• Involve management in HRD planning and estimation efforts; gain mutual ownership
• Use credible and conservative estimates
• Share credit for successes and blame for failures

HRD Evaluation Steps
1. Analyze needs.
2. Determine an explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute the evaluation strategy.

Summary
• Training results must be measured against costs.
• Training must contribute to the “bottom line.”
• HRD must repeatedly justify itself as a revenue enhancer, not a revenue waster.