CDI is a community of practice that works to broaden the range of evaluation designs and methods and ways of assessing what quality and rigour means when working in conditions of complexity. CDI helps to design, backstop and implement evaluations that reflect on the outcomes and effectiveness of development interventions.
At CDI, we value the appropriateness of evaluation design choices to the specific situation and stakeholder learning needs: what questions, in what context, about what intervention components, for what purpose, and within what resource constraints.
We value ways in which the perspectives of many stakeholders, including the poor and most marginalised in society, are reflected in evidence, and how their lived experience can drive understanding and be amplified through the process of evaluating change (in design, data collection, analysis, and use of the evaluation). Different perspectives and their framings can enrich our understanding of impact and empower people to seek more equitable change.
We value evaluations as a way to improve social accountability to citizens—knowledge and evidence shape power relations. The democratic gap between those making decisions and those affected by interventions is set to widen. We value ways of using evidence of impact, including within deliberative processes, that increase accountability to those too often left behind.
Debates around impact evaluation have received renewed interest in recent years. Alongside growing pressure on politicians and policymakers to demonstrate results and value for money, conventional evaluation approaches are critiqued for not supporting the learning and adaptive management that much programming needs. In large part this is because they are too narrowly focused or unable to explain how interventions interact with other factors to generate outcomes in complex systems, often through emergent and uncertain causal pathways.
Celebrating 10 years of CDI
As we celebrate 10 years of CDI we are excited to be launching a series of actions on the practice of bricolage. We invite you to reflect with core CDI partners IDS, Itad and UEA, and our broader CDI community of practice, on the ways in which evaluators are mixing and matching methods to design and implement credible and useful evaluation. Evaluators often adopt only certain parts of methods, and skip or substitute recommended steps to suit their purposes. The evaluator may replace existing tools with methods and tools with which they are more familiar, or they may combine a patchwork of relevant tools for different parts of an evaluation or throughout the cycle of designing, planning, monitoring, and evaluating a project.
Past actions
On 3rd October 2023, as part of the UKES 2023 Annual Conference, Marina Apgar, Tom Aston (independent), Giovanna Voltolina (Itad) and Melanie Punton (Itad) used the framework for bricolage from CDI Practice Paper 24 to reflect on experiences with ensuring rigour in bricolage through two Itad evaluations.
During the AEA 2023 Conference Marina Apgar and colleagues from the Inclusive Rigour network explored experiences with bricolage through combining narrative and participatory methods in the practice of inclusive rigour.
In a series of blog posts, the authors of earlier CDI Practice Papers looked back and reflected on how they adapted or combined their methods in their evaluation practice.
Tom Aston delivered a keynote presentation at the Danish Evaluation Society on the recent evolution and increasing hybridisation of evaluation methods, and the benefits of bricolage.
Do you have experience with methodological bricolage you’d like to share? We’d love to hear from you, so please get in touch with the Centre for Development Impact at [email protected]. If you’d like to share your story through a CDI webinar, see our past events.
Sign up to our newsletter and watch this space for new actions and ways to get involved, such as an upcoming call for papers for a special issue on Bricolage in Practice.
The Centre for Development Impact (CDI) organised a series of one-hour webinars about various impact evaluation methods within a theory-based evaluation approach. The webinars give a flavour of the various professional courses we organise.
The development sector proclaims that it values dignity. Yet it often breaks this promise, with people leaving encounters with charities feeling bruised and unseen. In this seminar, Tom Wein examines dignity as a core value around the world, drawing on…
This is the second blog in our reflective series. In our first blog we introduced the Full Spectrum Coalition (FSC) evidence and learning group, the challenge it responds to and the need to move beyond the performative dance that gets…
How might evaluation research respond to the complex and emergent nature of holistic community-led development? What does an equitable living partnership between evaluators and researchers, funders and programme implementers look and feel like? What are the highs and lows of…
Watch now: https://www.youtube.com/watch?v=Hbminon-7x8
Questions about rigour, validity and credibility are central concerns of all evaluation practice. So too should be how we pay attention to meaningful participation to enable greater equity, especially when embracing complexity and seeking to achieve systems…
Are we making a difference? If so, how, and for whom? These are questions we often ask about our policies, programmes and projects. Yet, they are often extremely challenging to answer. Understanding the ‘black box’ from ‘what we do’ to…
This paper reflects on the changes induced by modifications in the Rainforest Alliance certification system that require wage transparency from plantation owners, a comparison with the local living wage benchmark, and a wage improvement plan. The paper synthesises the findings…
Evaluation practitioners in the international development sector have given considerable attention in recent years to process tracing as a method for evaluating impact, including discussion of how to assess the relative importance of causal factors. Despite the increasing interest, there…
The RCT debate is often heated: it seems you can only either love or hate them. This seminar aims to spark a cooler discussion about the pros and cons of RCTs, using the practical examples of two studies on taxation…
With the financial support from the Dutch Ministry of Development Cooperation (DGIS), the 2SCALE programme started in June 2012, and is one of the largest incubators of inclusive agribusiness in sub-Saharan Africa. 2SCALE provides a range of support services to…
In 2012, DFID (now FCDO) put together the Private Enterprise Programme Ethiopia (PEPE) to respond to the second growth and transformation plan of the government of Ethiopia. Within the focus areas of this growth and transformation plan, PEPE prioritised sectors…
IDS is collaborating with the Partnership Resource Centre (PrC) in the Netherlands to evaluate scaling processes in inclusive business development. PrC is a strategic partner of 2SCALE - an incubator program funded by the Dutch Ministry of Foreign Affairs, which manages…
For this DFID-funded project, we will be providing a synthesis of the types of monitoring, evaluation and data being used by investors and the private sector in agricultural supply chains in a developing country context. Focussing on six value chains…