Making your research accessible

Alex Gwyther (UKCDS) on supporting better research impact evaluation

30 June 2014

Funding research programmes is a core part of what many development organisations do: UKCDS members spend approximately £400 million a year on research covering everything from solar panel technology to economic analysis of labour policies. However, it can be incredibly difficult to assess the impact of this investment and tease out the contribution actually made by research. UKCDS has written a set of guidelines to help those looking to navigate the mire of research impact evaluation.

Take, for example, the work of the International Development Research Centre (IDRC) of Canada. In the 1980s it supported research in fishery economics in South East Asia. By 1996, when this funding stopped, IDRC was supporting more than 80 researchers in 14 teams at research institutions and government fisheries agencies in Indonesia, Malaysia, Thailand, the Philippines and Vietnam. These individuals became leaders in fisheries, in government and in research centres, bringing with them a culture of evidence-based policy-making around sustainable ecosystem management.

IDRC could see changes positively impacting on lives in Asia, but could not trace a direct causal link back to the research. To advocate for change, the fisheries researchers had used knowledge generated by IDRC-funded research, but they had also worked with other people, brought in other evidence and engaged politically over a period of time after the research was completed. All of this was completely outside the purview and control of IDRC.

In October 2012 UKCDS, DFID and IDRC convened a workshop to help themselves and other funders explore and better understand the options for, and challenges of, evaluating the impact of their research programmes. Having the evidence to know what works and what doesn’t is crucial to ensuring future funding is invested well, represents value for money and maximises international development outcomes. Collecting that evidence, however, is notoriously difficult. The pathway from research to impact can be very long, and rarely follows a neat sequence of events. Whether research influences practice is often unpredictable – research findings are only one of the many factors that affect practice – and it often relies on ‘windows of opportunity’ when decision makers want new ideas.

This set of papers on ‘Evaluating the impact of research programmes’ was written to give an overview of the workshop’s discussion and act as a useful resource for those embarking on impact evaluation. It is not a comprehensive account but draws on the expertise of the participants to provide an understanding of the motivations and challenges around evaluation, as well as profiling the different methods, where they work well and what their limitations are. Different contexts will require different methodologies, and we hope this resource might act as a guide in identifying the right approach.

The available documents include:

UKCDS is a collaborative of 14 UK government departments and major funders with interests in research for international development. As part of our work on stimulating collaboration and ensuring the best research is funded and used to benefit development, we regularly bring our members together with other key organisations to share lessons learnt and best practice on research uptake and impact. We are always on the lookout for opportunities to enhance existing or future activities and to share the wealth of knowledge of our membership, which is unique in itself – involving science and development ministries, major funding agencies and an independent research funder, all working together.

If you have any questions or comments on the evaluating impact resources, or UKCDS activities in general, please don’t hesitate to get in touch via