
Case study

Specialist short course:
Participatory Monitoring and Evaluation for Learning

This specialist short course shares frontier methods for quality implementation of participatory processes at scale, together with reflexive practice, to engage participants in a new and exciting area of research and practice: participatory evaluation research.

Dates
June 2018, September 2019, July 2021
Approximate cost
£1500
Format
Five-day, in-person course at IDS (2018 & 2019). Five-week online course (2021).
Location
Delivered at IDS in 2018 and 2019. Online in 2021.

Objective

The course aims to engage participants in thinking about design options and methodological opportunities for improving monitoring and evaluation (M&E) frameworks to move towards a participatory learning practice.

During the course we share a suite of innovative and cutting-edge participatory methods to help participants and the programmes they are involved in to learn about how the changes they desire to support (the impact sought) unfold through the experiences of those engaged directly in the change processes.

Relevance

Increasingly, donors and NGOs are using complexity-aware, learning-based approaches to design and drive their M&E systems. Despite much conceptual movement towards more learning-based approaches to M&E and adaptive programming, much practice continues to rely on linear, indicator-driven methods that fail to capture learning about how change is unfolding, or to offer insights into how development interventions are contributing to or inhibiting it.

Participatory approaches are widely recognised for their ability to engage stakeholders deeply and build meaningful ownership; however, they have been heavily critiqued for being too localised. This was true of the early participatory M&E approaches espoused in community development in the 1990s. Methods are now available for quality implementation of participatory processes at scale (with large numbers of people and across broader geographical areas), as well as for understanding how learning-focused M&E can catalyse change processes and scale their outcomes.

Design

This short course builds on the deep historical experience with participatory approaches and methods that IDS has pioneered. It shares new frontier methods for quality implementation of participatory processes at scale together with reflexive practice to engage in a new and exciting area of research and practice – participatory evaluation research.

Both the online and in-person versions of the course combine lectures, plenary discussions, facilitated small group work, peer learning and self-directed learning. These activities are supported by a reading list and case study material. The online course includes 12 live facilitated sessions as well as pre-recorded lectures.

Since 2019, the course has been co-convened with Steff Deprez of Voices that Count. Steff brings a wealth of practical experience in social impact measurement, as well as cutting-edge thinking connected to monitoring and evaluation in Europe and beyond.

Delivery

The course is delivered by:

  • Marina Apgar, IDS Research Fellow
  • Steff Deprez, Development practitioner and co-Founder of Voices that Count
  • Mieke Snijder

Guest speakers and facilitators have included:

  • Sophie Pinwell, Clear Horizon Consulting
  • Dee Jupp, Associate Consultant, Empatika
  • Tiffany Fairey, Research Fellow, King’s College London

Participants

Mid- and senior-level development professionals working in government, NGO or community organisations who have some M&E and learning experience and have a particular interest in building more participatory, complexity-aware and adaptive processes.

Past participants have included:

  • Research and Learning Manager, Global Integrity, US
  • Director of Programmes, Swift Foundation, Peru
  • Monitoring Evaluation and Learning Manager, Save the Children, Jordan
  • Programme Manager, International Development Law Organisation, Kenya.

Participant numbers

  • Approximately 21 per course

  • 64 over the three years

Impact

Since we began running this course in 2018, we have helped to build the capacity of 64 individuals to more effectively design and improve monitoring and evaluation systems that support participatory and adaptive practice.

Each course has enabled participants to develop their:

  • Understanding of where participatory and learning based approaches to M&E fit within broader approaches to evaluation
  • Knowledge of specific participatory methods and their application in M&E processes
  • Ability to critically interrogate and analyse methods
  • Ability to integrate methods in M&E design.

Each participant also developed a coherent plan to support personal or organisational goals relating to building more participatory and complexity-aware processes.

A follow-up survey of participants in 2022 revealed wider impacts of the course. Respondents said the course had led to:

  • An expansion of their professional career and attainment of their career goals;
  • Organisational uptake of methods such as Outcome Mapping, Outcome Harvesting and Participatory Causal Mapping;
  • Strengthened organisational policies and frameworks for M&E; and
  • An opening-up of new markets for their organisations.

The course has also had ripple effects: one participant shared their learning from the course in a ‘training for trainers’ session aimed at strengthening knowledge of participatory evaluation methods among partner civil society organisations in Cambodia.

Participant feedback included:

“The course provided a supportive learning space, informed teaching and a rich diversity of participants all willing to share and listen – a wonderful opportunity to learn and reflect. Thank you.”
“It was a wonderful learning experience, well-shaped and considered, offering moments to be challenged, to introspect and to plan. I leave with much to bring to my work and my colleagues.”
“It was extremely validating, inspirational and gave us lots of practical ideas.”


Programmes and centres: Centre for Development Impact
Research themes: Evaluation, Participation
