Research to Action

The Global Guide to Research Impact


From community-based livestock services to equitable knowledge ecosystems

By John Young 13/12/2018

New INASP Executive Director John Young shares how his journey from delivering livestock services to promoting the use of appropriate evidence in policy has demonstrated the importance of supporting the whole research and knowledge system, and his hopes for INASP going forward. (This blog was first published on INASP’s site.)

As I write this, I am just starting my fourth week at INASP. It has been great so far! One week with the team learning more about INASP’s fantastic work, one week in Bangkok for the final Think Tank Initiative Knowledge Exchange Conference, and one week meeting other people in Oxford involved in work on evidence for policy, and preparing for my first board meeting this week. So it seems like a good time to introduce myself.

I have spent about half of my professional life so far living in developing and emerging countries, developing and testing new approaches to livestock service delivery. For example, I worked on community-based animal health care services in Kenya and a wider range of decentralized livestock services in Indonesia. Much of this work involved using evidence to support wider replication of the approaches that worked – trying to get the results of action research into policy and practice. The key lesson I took from this work was that, for innovative practices to spread, they need to meet a specific need; be socially, culturally, and economically attractive to local communities; and align with prevalent policies and practices. But it is possible to change prevalent policies and practices by engaging with policymakers and other stakeholders with the right kind of evidence at the right time and in the right way. A good example is the International Livestock Research Institute’s work on the smallholder dairy sector in Kenya, which colleagues and I evaluated in 2006.

It was seeing the potential of using evidence to inform policy and practice that encouraged me to join ODI, where I spent the second half of my career to date. ODI provided the opportunity to work more systematically on how to promote greater production and use of research-based evidence in development policy and practice, which I developed into ODI’s Research and Policy in Development Programme (RAPID). Our early work focused on understanding where, when, and why research-based evidence contributed to policy and practice, and developing a simple analytical and practical framework to help researchers maximize the usefulness and use of their work.

This led to work with research groups and research organizations keen to increase the impact of their work, and to the development of the RAPID Outcome Mapping Approach to policy engagement and policy influence. At RAPID, we explored how to promote greater demand for research-based evidence among policymakers, what kinds of evidence are likely to be used, and how to assess the impact of research-based projects on policy and practice. While the focus was predominantly on developing and emerging countries, we also explored evidence-informed policy in the UK. Assessing research impact has always particularly interested me, and I carried out many evaluations, including of the Centre for International Forestry Research’s influence on REDD+ policies and programmes.

While there are many examples of policies that take research-based and other forms of evidence into account – the Tanzania Essential Health Interventions Project, for example – there are many others that do not. Or, worse, they use false evidence to justify decisions or claims that have already been made, like the ‘dodgy dossier’ justifying the invasion of Iraq in 2003 or, also in the UK, the ‘Brexit Bus’. Similarly, there are many examples of research programmes that engage with policymakers and other stakeholders throughout, but also many others that only think about engagement afterwards. The challenge of ensuring that policy and practice are informed by rigorous evidence is a global one.

The problem, I think, comes down to three Cs: lack of commitment, lack of capacity, and lack of cash. Incentives in research and policy organizations often discourage collaboration between researchers and policymakers, and in many developed countries there is a general slide away from evidence-informed policy back towards ideological policy increasingly based on populist, nationalist public opinion. This is elegantly described in Lee McIntyre’s book ‘Post-Truth’. Reduced donor funding for tertiary education in the 1990s and 2000s, and the rapid, under-resourced expansion of universities in developing countries over the last 10 years, have undermined the quality of tertiary education: there are not enough researchers and policymakers with the necessary critical thinking skills. Finally, research donors in both developed and developing countries are frequently reluctant to invest sufficiently in ‘all the other stuff’ that is necessary for the research they fund to be used. I will be writing more about these issues over the next few weeks.

INASP has over 25 years of experience strengthening the capacity of individuals and institutions to produce, share, and use research and knowledge in support of national development, and a long-term commitment to supporting the production of high-quality research and the appropriate use of evidence in policymaking. By working in partnership with actors across the whole knowledge system and supporting organizational-level change, INASP helps policymakers to use evidence and researchers to produce and publish more. Its commitment to fostering critical thinking skills in students and to increasing equity, especially gender equity, is helping to develop equitable knowledge systems that can tackle some of these challenges.

And this is why I am so excited to have joined INASP. There is a huge demand for this work across the developing world, and our challenge over the next 25 years is how to work with partners to scale this up. There is also, I think, a huge opportunity for INASP to work with others in the developed world to ensure that policies that affect livelihoods in developing countries are informed by research-based evidence too, and to strengthen the global movement for evidence-informed policy in an increasingly post-truth world.


Topics: communication, development, evidence-informed policy making, inasp, knowledge management, policy influence, research communication, research uptake

John Young

John joined INASP in November 2018. Before that he headed the RAPID Programme at ODI for seven years, focusing on research, advisory, and public affairs work at the interface between research and policy. He was involved in projects on decentralisation and rural services, information systems, strengthening southern research capacity, and research communication, and he developed RAPID into a global leader on the research–policy interface, including its key approach to policy influence (ROMA). He previously spent five years in Indonesia managing the DFID Decentralised Livestock Services in the Eastern Regions of Indonesia (DELIVERI) Project and was ITDG's Country Director in Kenya.



Copyright © 2023 Research to Action. All rights reserved.