Pervasive methodological challenges in evaluation
SAMEA Conference, October 27, 2017
Jos Vaessen, Ph.D.
1. Dealing with causality
• "When Will We Ever Learn?" (CGD, 2006)
• "Broadening the Range of Designs and Methods for Impact Evaluations" (DFID, 2012)
• There is no such thing as the best method for causal analysis; some may be better than others depending, among other things, on the causal questions asked.
• Three cheers for case-based analysis!
Different important causal questions that case studies can help address (Source: Befani, 2016, p. 20)

Overall impact question: Did the intervention make a difference?
• Causal question: Can we attribute the marginal (net) effect to the intervention? What is the net effect of other factors?
• Causal theory: "counterfactual"
• Methods: e.g., (quasi-)experiments, statistical modeling

Specific impact question: How much of a difference (on average)? For whom? Under what circumstances?
• Causal question: What role did the intervention play in producing the outcome?
• Causal theory: "multiple conjunctural"
• Methods: e.g., pattern-matching, QCA

Specific impact question: How? Why so?
• Causal question: What explains the outcome?
• Causal theory: "generative" or "mechanism-based"
• Methods: e.g., process tracing, in-depth case study
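To make the "counterfactual" column of the table above concrete, here is a minimal, hypothetical sketch of a difference-in-differences calculation, one common quasi-experimental way to estimate an intervention's net (marginal) effect. The function name and all numbers are invented for illustration; they do not come from the presentation.

```python
# Hypothetical sketch of the "counterfactual" causal theory:
# a difference-in-differences estimate of an intervention's net effect.
# All group names and outcome values below are invented for demonstration.

def diff_in_diff(treated_before, treated_after, control_before, control_after):
    """Net effect = change in the treated group minus change in the control group."""
    return (treated_after - treated_before) - (control_after - control_before)

# Invented example: average outcome scores before and after the intervention.
net_effect = diff_in_diff(
    treated_before=50.0, treated_after=65.0,   # treated group improves by 15
    control_before=48.0, control_after=53.0,   # control group improves by 5
)
print(net_effect)  # 10.0 -> the part of the change attributable to the intervention
```

The control group's change (5 points) stands in for what would have happened to the treated group without the intervention, which is exactly the counterfactual reasoning the table contrasts with conjunctural and mechanism-based approaches.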
2. Generalizing evaluative findings
• How do we know that if something works in one place, it will also work in another? We don't...
• Yet there are patterns of regularity that we can uncover (Merton, 1968; Pawson, 2010).
• How do we design evaluations so that analyses of specific activities or sites can generate generalizable lessons for sector-level, country-level, thematic (etc.) strategies or programs?
• Sampling
• Convergence with theory/literature
• Convergence of findings
Evaluators as 'synthesizers': repositories of knowledge on effectiveness
• 3ie: http://www.3ieimpact.org/en/evidence/
• Campbell Collaboration: https://www.campbellcollaboration.org/
• UK What Works Network: https://www.gov.uk/guidance/what-works-network
• EPPI-Centre: https://eppi.ioe.ac.uk/cms/Default.aspx?tabid=3423
• What Works Clearinghouse: https://ies.ed.gov/ncee/Wwc/
• Coalition for Evidence-Based Policy: http://evidencebasedprograms.org/
• CLEAR (Clearinghouse for labor evaluations): https://clear.dol.gov/
• Results First Clearinghouse: http://www.pewtrusts.org/en/multimedia/data-visualizations/2015/results-first-clearinghouse-database
• Cochrane Library: http://www.cochranelibrary.com/cochrane-database-of-systematic-reviews/
3. Dealing with bias resulting from 'intervention-centric' thinking
• Policy interventions are embedded in complex socio-cultural, economic, and political realities.
• Looking at these realities through the lens of a policy intervention is what evaluation is about, yet this lens also has its limitations.
• Complexity science (including systems thinking) offers useful tools to address this bias.
Example of a 'conventional' program theory
Example of a systemic perspective of a policy intervention (Source: Derwisch & Loewe, 2015)
Social Network Analysis examples:
• WBG positioning in the health sector in Liberia
• Assessing working relationships between Ghanaian NGOs, all funded by the same donor, through an analysis of their progress reports
• Financial flows in the health sector in Liberia
• Knowledge leadership in the health sector in Liberia
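The "knowledge leadership" example on this slide can be sketched with a basic network measure: an actor's degree centrality, the share of other actors it is directly tied to. This is a minimal illustration, not the presentation's actual analysis; all organization names and ties below are invented, and a real study would build the network from progress-report or survey data.

```python
# Hedged sketch of a social network analysis idea: measuring which actor
# is most central in a network of working relationships. Organization
# names and ties are hypothetical, invented for this illustration.

from collections import defaultdict

def degree_centrality(edges):
    """Fraction of the other nodes that each node is directly connected to."""
    neighbors = defaultdict(set)
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    n = len(neighbors)
    return {node: len(links) / (n - 1) for node, links in neighbors.items()}

# Invented working-relationship ties among donor-funded organizations.
ties = [("WBG", "NGO_A"), ("WBG", "NGO_B"), ("WBG", "MinHealth"),
        ("NGO_A", "NGO_B"), ("MinHealth", "NGO_C")]

centrality = degree_centrality(ties)
most_central = max(centrality, key=centrality.get)
print(most_central, centrality[most_central])  # WBG 0.75
```

In practice one would use richer measures (betweenness, brokerage, weighted ties) and a library such as NetworkX, but the logic is the same: positioning claims like "WBG is a knowledge leader" become testable statements about an actor's location in the network.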
Thank you for your attention!