

  • Number of slides: 12

Lessons Learned from OVC Evaluations for Future Public Health Evaluations
Siân Curtis, Ph.D.
OVC Evaluation Dissemination Meeting, September 3rd, 2009, Washington, DC

Growing Emphasis on Evaluation
■ IOM PEPFAR evaluation report and reauthorization legislation
■ Global Fund 5-year impact evaluation and OR initiative
■ CGD "When Will We Ever Learn?" report
■ 3ie Initiative
■ IHP M&E Working Group – common evaluation framework initiative
■ USAID evaluation revitalization efforts

Ideal Impact Assessment

Challenges to Implementing Rigorous Impact Evaluation
§ Need to think about evaluation at the beginning, not at the end, but it is hard to attract attention at that point
§ Timing – projects are already underway and it is hard to incorporate a strong evaluation design
§ Scale – many projects are too small to expect to be able to demonstrate impact
§ Pressure for rapid results to inform programs now
§ Expectations of multiple stakeholders – scope, competing objectives, multiple/unclear research questions
§ Political will – need someone in a position of authority to buy in and advocate for evaluation

Methodological Constraints to Rigorous Impact Evaluation
■ Non-random placement of programs – intervention areas and control areas often not comparable
■ Suitable control areas may not exist – other programs in control areas or cross-over of interventions to control areas
■ Need/ability to control for other factors beyond the program that might affect outcomes (Victora, Black, and Bryce 2009)

OVC Evaluation Experience
■ Timing
  ■ Programs already underway – no baseline; post-test only design
  ■ Length and intensity of exposure – short duration of exposure (i.e., less than 2 years) but impact likely to be longer term
■ Scale
  ■ Coverage low in intervention areas – quality of beneficiary lists
  ■ Some programs small

OVC Evaluation Experience
■ Pressure for rapid results
  ■ Post-test only design; short program exposure
■ Multiple stakeholders
  ■ Supports data use
  ■ Managing expectations regarding scope and coverage of the study
■ Political will/leadership – needs to be strong to facilitate buy-in from all stakeholders

OVC Evaluation Experience
■ Program participation was non-random
  ■ Purposive selection of intervention areas
  ■ Self-selection into (some) programs
  ■ Controls different from beneficiaries
■ Control areas – contamination between program and control areas, i.e., some children in control areas reported receiving interventions

Additional Issues for OVC Evaluation
■ Multiple outcome domains – what to focus on?
■ Measurement tools for outcome domains vary in how widely tested they are and how well they work
■ Measurement and comparison of cost-effectiveness across multiple domains is new
■ Lack of standardized interventions/variable intensity and quality
■ Wide variation in the combination of interventions offered and the way the programs are implemented

Data Use
■ Critical to think about data use throughout the evaluation process, not just at the end
■ Engagement of stakeholders is critical to understanding the evaluation questions from different perspectives and to creating ownership and demand
■ Proactive and explicit data use activities help stakeholders understand and apply findings; recommendations coming from stakeholders are better received than those coming only from the research team

Conclusions
■ Continuing challenge to develop pragmatic evaluation designs that meet rigorous scientific standards within field realities – an ongoing area of research
■ Recognize the long-term benefits of evaluations for future programs – a "public good"
■ It takes time for programs to scale up and to have an effect – evaluations need to be ongoing
■ More work needed to test measures of OVC outcomes – often multi-dimensional
■ Attention to data use (both short and long term) needed throughout the process

MEASURE Evaluation is funded by the U.S. Agency for International Development through Cooperative Agreement GHA-A-00-08-00003-00 and is implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill, in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. The views expressed in this presentation do not necessarily reflect the views of USAID or the United States government. Visit us online at http://www.cpc.unc.edu/measure.