How does research and knowledge impact on attitudes, behaviours, policy and practice? Whether you are a researcher, development studies student, NGO campaigner, policy wonk or research donor, here are some links that I’d recommend on this subject. Not because I agree with them necessarily but because they do a good job at picking at this issue and stimulating important debates relevant to those funding, producing and using research in their work.
However, if you think you are going to discover the secret of repeating the trick of turning evidence into action on these web pages [spoiler alert] prepare to be disappointed. Damn you complexity!
The naysayers and the disciples of impact
James Lloyd, Director of the Strategic Society Centre, writing for the fantastic LSE Impact Blog, tells us it's all a big fat waste of time. He says it's unrealistic to expect research to drive policy change. Don't tell our donors that, Jimbo!
So here is the antidote to this challenging view from Oxfam's Irene Guijt, who, fresh back from the ESRC Impact Awards, feels inspired and makes a great case for both academic-led and INGO-led research impact. She was directly responding to Duncan Green's own critique of what he believes is a slightly hopeless development studies sector. Needless to say, we [the so-called impact community] took the bait and drove even more traffic to his blog.
Tips from the research impact coal face
Enough of this blogosphere posturing. What do people operating at the coal face of research-to-policy work really find to be the key issues? Some nice vlogs pulled together by the ESRC DFID Impact Initiative give us a good idea of what it's actually like doing this stuff.
When you dig a little deeper you begin to find the best reflections on impact really do come from those who have direct experiences to share across a diverse range of research areas and geographies. I particularly liked Leveraging Agriculture and Nutrition in South Asia's development of a research uptake self-assessment tool to aid learning around impact that can then be directly applied to their work. Simple yet effective.
Normally I am a bit suspicious of toolkits – so many have appeared. In fact, the research impact agenda has spawned a whole sector of agencies and consultants offering services around research communications and engagement. Technical approaches alone don't produce impact (i.e. teach researchers how to write policy briefs and everything will be OK), but clear guidance carefully delivered to groups of researchers can make a real difference. Louise Shaxson describes just how her team at ODI helped the DFID-ESRC Growth Research Programme revamp their impact plans.
Scientists and post-truth policy making
However, step back out of the cosy development research bubble of impact plans, stakeholder mapping workshops, policy briefs and donor impact toolkits and you have to face scary things like: alleged post-truth eras, fake news, and media and politicians with no time for experts. I liked Andy Miah's sensible advice in The Conversation for hard-pressed scientists on how they should think about research communications in these turbulent times.
Show me the impact
Of course, once we have talked about what we actually mean by impact, how we might get there, and some of the real-world challenges we may encounter along the way, researchers and donors [donors especially] tend to obsess mostly about how to prove it really happened at all. This in turn brings us to ideas around how flows of information are becoming increasingly fragmented and unpredictable, with larger and more diverse groups of actors influencing policy and public debate. Paradoxically, the growing awareness of this complexity is placing ever-greater pressure on scientists to 'have the answer' and to respond more effectively to policy agendas in ways that demonstrate their tangible impact.
If you fancy dipping into this debate a bit deeper than a few blogs, I recommend The Politics of Evidence and Results by Rosalind Eyben et al, which looks at these demands for shorter-term measurable research impacts and plausible linear connections between individual research projects and changes in policy and practice.
Stop auditing and start learning
Digital tools in particular have been embraced to try and evidence impact. I liked this offering from the good old LSE Impact Blog. What Nicolas Robinson-García et al seem to be getting at is that rather than trying to measure how much engagement some research received, we should be looking at who is engaging and what this means for both our conceptual frameworks and the real day-to-day business of trying to improve development processes with research. Stop auditing and start learning, they say. Sounds good to me.
These posts are just a tiny fraction of the good stuff out there. So please comment and tweet your own favourite articles, blogs, multimedia and rants that try to explain this thing we call research impact.