Rethinking Impact Evaluation for Development
IDS Bulletin 45.6
Editors: Befani, B., Barnett, C. and Stern, E.
This IDS Bulletin presents a 'rallying cry' for impact evaluation to rise to the challenges of a post-MDG/post-2015 world. It is the first of two issues that follow a workshop entitled 'Impact, Innovation and Learning: Towards a Research and Practice Agenda for the Future', held at IDS in March 2013.
Convening a distinguished group of scholars and practitioners, this event situated development evaluation in general, and impact evaluation in particular, within today's complex and changing international development context. It aimed to sketch out a research and practice agenda to meet growing demands for evidence about successful programmes and projects. Such evidence must serve both accountability and learning purposes, and be accessible to recipients and donors alike; meeting that demand requires more than innovation in research methods.
Methodological innovation is nonetheless tightly linked to these new requirements: the methods with the strongest current reputations are not necessarily best suited to addressing the multiplicity of development outcomes, or the complex pathways towards long-term impact. This is fertile ground for a new research and practice agenda: one that can better enable impact evaluation to meet the new purposes of development cooperation; one that can innovate in methodological design and practice to address increasingly complex challenges; and one that will help us better understand and improve evaluation systems.
The success of such an emerging agenda rests on whether we can make better use of evaluative evidence to have a real impact on the lives of the poorest and most marginalised.
Table of Contents
Introduction - Rethinking Impact Evaluation for Development (free to access)
Barbara Befani, Chris Barnett and Elliot Stern
Have Development Evaluators Been Fighting the Last War... And If So, What is to be Done?
Process Tracing and Contribution Analysis: A Combined Approach to Generative Causal Inference for Impact Evaluation
Barbara Befani and John Mayne
The Triviality of Measuring Ultimate Outcomes: Acknowledging the Span of Direct Influence
Giel Ton, Sietze Vellema and Lan Ge
Things you Wanted to Know about Bias in Evaluations but Never Dared to Think
Laura Camfield, Maren Duvendack and Richard Palmer-Jones
Making M&E More 'Impact-oriented': Illustrations from the UN
Jos Vaessen, Oscar Garcia and Juha I. Uitto
Some Thoughts on Development Evaluation Processes
Ole Winckler Andersen
Developing a Research Agenda for Impact Evaluation in Development
Patricia J. Rogers and Greet Peersman