Opinion

New DFID guide on research uptake: 7 things to love and 7 to worry about

Published on 26 April 2016

James Georgalakis

Director of Evidence and Impact

The research uptake guide for DFID-funded research programmes has been updated. Amidst the controversy over the anti-advocacy clause and the ongoing debate around how to strengthen evidence-based development policy and programming, I wanted to share what I like about the new guide and what concerns me.

Things to love

1. Easy to understand and use

DFID is to be applauded for producing such a clear and accessible guide for researchers. It is easy to follow and provides practical steps that can be applied to a wide range of research programmes.

2. Emphasis on stakeholder mapping

There is good advice on stakeholder mapping and a clear explanation of why this is important.

3. Networking and multiple perspectives matter

The importance of evidence-informed discussions is really emphasised, and there is sound advice on facilitating discussions that include a range of perspectives, including research other than your own. This touches on the crucial area of engaging with networks or building new ones and tapping into wider bodies of knowledge. All good stuff.

4. Capacity building is key

It is good to see how much DFID continue to value building the capacity of research teams. Some nice examples are given of how to approach this.

5. It is about more than just policy briefs

There is the expected focus on research communications, with an emphasis on synthesis. It is good to see policy briefs knocked off their perch as the be-all and end-all of research products. The key advice seems to be ‘keep it short’!

6. Research impact takes a long time

It was welcome to see DFID recognise that research impacts often materialise over a far longer timeframe than the research programme itself. Success indicators therefore seem mainly to focus on the accessibility of the evidence and the facilitation of evidence-informed discussions. This more realistic conceptualisation of research programme outcomes is welcome. Perhaps it is time to start calling research uptake ‘knowledge exchange’, as many other donors do?

7. There’s more to read

There is a great set of references at the end and a simple-to-use checklist.

Things to be concerned about


1. Where is the engaged excellence?

DFID may have inadvertently committed the classic mistake of suggesting there is some kind of trade-off between research rigour and engagement. In the section on cost, they say: “It is always better to generate high quality research and communicate it in a limited way than to produce low quality research and communicate it widely.” However, this strikes me as a false choice that may give some carte blanche to dismiss impact work almost entirely. Instead we need to encourage research programmes to link quality with engagement and promote the co-production and co-communication of development research. At IDS we call this Engaged Excellence.

2. Where is the co-production?

Narrowly aligning research design to the needs of potential users feels like a missed opportunity to promote the co-production of research. The emphasis is on making your research useful for decision makers. This is not bad advice in itself, but it fails to capture the potential for engagement with the wider constituency affected by the research agenda, not just policy actors. Ironically, this may undermine some programmes’ capability to produce practical, scalable solutions to development challenges.

3. Decision makers are not the only stakeholders

Likewise, recommendations to seek ongoing engagement with policy makers fail to identify the benefits of strengthened relationships and networks spanning wider civil society movements, policy spaces and practitioners.

4. Scaring researchers away from policy engagement

The section on influencing is confusing, which is hardly surprising given the ongoing debates and uncertainty around the anti-advocacy clause due to be introduced in May. Just as we see universities’ research council grants excluded from the clause but not government department funding (like DFID’s), the guide states that researchers should not be lobbying for specific policy or practice changes. However, it goes on to say that you can make recommendations if they are based on the evidence.

Meanwhile, researchers are encouraged to engage in governmental consultations and other policy forums where we know policy actors will seek direct advice and recommendations. The guide also suggests that researchers exploit policy windows to maximise engagement. DFID admit that the line between fostering discussion and influencing can be difficult to define. Confused? You should be! What we need are more incentives and training to enable social scientists to engage rigorous evidence with policy processes, not unclear guidance that may put them off.

5. Development research needs Southern capacity building

External capacity building (mostly the capacity of users and intermediaries) gets a mention, but it comes with a warning that this is a hard thing to do. Perhaps this is indicative of the fact that DFID is currently refocusing its attention on more traditional research programmes and away from this area. In a development research context, so much more could and should have been included on both the building of Southern research capacity and the strengthening of capacity for accessing, appraising and using evidence in policy and practice.

6. Research uptake is not synonymous with research communications

Overall, the guide creates the impression that research uptake is synonymous with research communications. Perhaps there needed to be more emphasis on the broader knowledge mobilisation (or ‘exchange’, if you prefer) process that includes strengthening networks, stimulating demand and building the capacity of knowledge intermediaries. Communications is really important, but it is not the whole story.

7. We need to strengthen the links between research, policy and practice

It is good to see DFID briefly outline a range of different types of impact, not just instrumental impacts on policy and practice. However, the monitoring and evaluation section still feels slightly disappointing and incomplete. Having recommended a stakeholder map at the start of the programme, they could have reflected on the Network Map model (pdf), which analyses the connections between stakeholders and their attitudes to the research questions, and then imagined what this map might look like at the end of the programme.

This is what is most absent from this guide: a clear message that changes to connectivity, networks, capacities and attitudes are valid objectives of a research programme in their own right, and may be the most likely pathways to research uptake and impact.

Disclaimer
The views expressed in this opinion piece are those of the author/s and do not necessarily reflect the views or policies of IDS.
