Case study

Embedded learning:
Research, Evidence and Learning Component of ‘Making All Voices Count’ development programme

A comprehensive international initiative that optimised learning for adaptation within this award-winning programme over its life-cycle, so as to maximise its relevance and impact. The learning contributed to a notable shift in attitude in the governance aid field, from uncritical tech optimism to a more mature tech-realist stance.

Dates
June 2013 to November 2017
Approximate cost
£5,000,000
Format
Research, grant management, evidence generation, analysis, external communication, international learning events and capacity strengthening of programme staff.
Location
Online and in person in Ghana, Indonesia, Kenya, Philippines, South Africa, Tanzania, the Netherlands and the UK.

Objective

To fill evidence gaps about the use of technologies to enhance citizen voice and government responsiveness; bring a reflective learning perspective into tech innovation and scaling projects supported by Making All Voices Count; and optimise learning for adaptation within the programme over its life-cycle so as to maximise its relevance and impact.

Relevance

The rapid advent of technology applications to support citizen voice and government response, from about 2010 onwards, brought a corresponding rush among official and philanthropic donors alike to fund this sector. Many of the tech practitioners involved had no experience or knowledge of governance; many of the governance practitioners and researchers were engaging for the first time with new tech applications. In 2013 the sector was awash with tech optimism, but very little evidence or self-critical practice existed. The Research, Evidence and Learning Component of Making All Voices Count (MAVC) was designed to address this. MAVC was established to provide operational support to the Open Government Partnership (launched in 2011); Open Government Partnership member countries and networks were major programme stakeholders and key among the learners addressed by the Research, Evidence and Learning Component.

Design

Delivering MAVC’s Research, Evidence and Learning Component reflected IDS’s expertise in conducting, managing and quality-assuring research on an international scale and in a cross-sectoral, multi-stakeholder setting.

The Component’s design centred on:

  • Generating evidence and analysis by conducting, commissioning and funding research on the tech innovation and scaling-up work supported by the MAVC development programme.
  • Communicating learning to key stakeholders, globally.
  • Strengthening the capacity of MAVC staff.

This involved:

  • Issuing and managing 47 research grants
  • Undertaking 14 research projects to distil learning from programme operations using a range of research approaches from quantitative analysis to action research
  • Publishing more than 90 outputs (journal articles, research reports, practice papers, programme learning reports, research summaries and videos)
  • Delivering four participatory and interactive international learning events to diverse learners (tech innovators, governance advocacy groups, grassroots organisations, local and national government officials, and researchers).


Delivery

A team of programme managers, grant administrators and communications specialists ran the learning programme on a day-to-day basis.

IDS researchers and IDS Associates delivered the research and learning activities. Consultants and collaborators we worked with included:

  • Chris Michael, Collaborations for Change
  • Brendan Halloran, International Budget Partnership
  • Joy Aceron, G-Watch, Philippines
  • Professor Jonathan Fox, Accountability Research Center, American University
  • Tiago Peixoto, World Bank
  • Blair Glencourse, Accountability Lab
  • Lily Tsai, Massachusetts Institute of Technology
  • Rakesh Rajani, Twaweza
  • Lena Dencik, University of Cardiff
  • Indra de Lanerolle, Wits University
  • Jess Thindwa, World Bank
  • Joe Powell, Open Government Partnership.

Participant information

Typical participants attending the international learning events:

  • Independent tech developer
  • Local government official
  • National government planning officer
  • Accountability activist (in an NGO)
  • Policy researcher
  • Alternative media practitioner
  • Grassroots development worker

Participant numbers

  • More than 200 organisations and 500 people participated in the research work.

  • Approximately 180 individuals took part in the four international learning events.

  • Approximately 45 Making All Voices Count staff took part in the research and learning activities.

Impact

The Research, Evidence and Learning Component played a critical role in bringing forth lessons from earlier generations of accountable governance research and practice to reinforce the message that context is key, and in drawing MAVC stakeholders’ attention to relevant evidence inside and outside the programme.

Researchers and practitioners who implemented MAVC research grants and participated in learning events developed self-critical reflective perspectives on their work and the ability to generate actionable evidence from it.

MAVC staff learnt to use evidence from the programme in deciding what to fund and how best to support it.

The Research, Evidence and Learning Component of MAVC contributed to shifting the prevailing attitude in the governance aid field from one of uncritical tech optimism at the programme’s start, to a more mature tech-realist stance by the programme’s end. (For details, see the programme’s synthesis report: Appropriating technology for accountability: messages from Making All Voices Count.)

In 2017, MAVC won the prestigious Market Research Society’s President’s Medal for its extraordinary contribution to society through research.

Being new to this world, having spent many years in the private sector, I found every moment an inspiration. The learning sessions started to contextualise the scale of the problem we are addressing and I quickly realised that there are no ‘silver bullets’. The session on ‘Voices’ remains one of my highlights and opened my mind to the diverse methods, mechanisms and platforms that citizens use to express themselves. Engaging the innovators about their solutions and witnessing the passion with which they are embracing the challenges that lie ahead felt like jet-fuel was being injected into the Yowzit project and it inspired me to fast track a few actions on my return home to South Africa.
- Pramod Mohanlal, grantee, Yowzit Software

Key learning points for me were two-fold: (1) For the first time, I saw different ways of getting audience participation by making use of different techniques to ensure that all voices present, count. (2) The active engagement through the community-based field trip was a first for me, and it made me see how easily research design formulation from the view-point of the ‘classroom’ could be quite disconnected from those whose voices should count.
- Ome Mejabi, ICT Specialist, University of Ilorin

The key learning points for me came from the discussions I had with participants from varying fields and regions. Networking, as they call it.
- Asim Fayaz, GIC Winner, Bahawalpur Service Delivery Unit
