The international development community has been put on notice. The Center for Global Development asserts, ‘For decades development agencies have disbursed billions of dollars … Yet the shocking fact is that we have relatively little knowledge about the net impact of most of these programs’ (Savedoff and Levine 2006; CGD 2006).
The criticism is accompanied by a proposed minimum standard of knowledge: ‘To determine what works … it is necessary to collect data to estimate what would have happened without the program … [only thus is it] possible to measure the impact that can be attributed to the specific program’. The criticism also carries a note of despair, and it calls for an independent evaluation entity to ensure rigour in the evaluation of development programmes.
This article reconsiders the veracity of this ‘shocking fact’ as it applies to the Inter-American Development Bank (IADB), a multilateral bank that lends to Latin American and Caribbean countries, and asks whether the Bank’s independent evaluation office, the Office of Evaluation and Oversight (OVE), has made any difference. The article also contributes to the broader discussion of the international development community’s alleged lack of evaluative rigour. Primarily, it documents OVE’s experience in carrying out impact evaluations, the asserted minimum standard of knowledge.
The story’s relevance, however, is not limited to the evaluation offices of other multilateral and bilateral organisations in the development community. The challenge OVE faced, carrying out ex post evaluations of projects that were not designed for impact evaluation and that did not collect outcome data, is probably the most common challenge evaluators confront. In addition, OVE’s experience adds to the growing evidence questioning the validity of the usual arguments against impact evaluation: that it is too difficult, that it is too expensive, that too few governments will agree, and that there is no institutional mandate. The challenges faced and overcome by OVE thus contribute to an understanding of real-world approaches to impact evaluation.
This article comes from IDS Bulletin 39.1 (2008), ‘You Can Get It If You Really Want’: Impact Evaluation Experience of the Office of Evaluation and Oversight of the Inter-American Development Bank.