Research to Action

The Global Guide to Research Impact


4 tips for planning your policy research M&E

By Tiina Pasanen 25/02/2016

There’s no one-size-fits-all framework for monitoring and evaluating research projects – but here’s a start

It was once enough to gauge the ‘success’ of a research project – even one aiming to influence policy – by counting the number of articles published in peer-reviewed journals, and possibly citations or downloads.

But the scope and scale of policy research projects have broadened – moving away from single research studies towards multi-component, multi-partner and multi-sector endeavours. These projects often have numerous aims beyond producing high-quality publications: to have impact beyond academia, engage different external stakeholders and/or build partners’ capacity, to name just a few examples.

Evaluating outputs alone is no longer enough to assess a project’s impact, because outputs capture only a small proportion of what these policy research projects aim to achieve.

Designing a monitoring and evaluation (M&E) framework for these large-scale policy research projects can be a challenge – and it’s this vital planning that the Research and Policy in Development (RAPID) team at the Overseas Development Institute is often asked to help with.

Based on years of experience working with NGOs, policy-makers and other research institutes, and building on previous RAPID work, the Methods Lab has put together new guidance on how to design M&E frameworks for complicated and complex projects.

Here, we share four key things to bear in mind as you embark on designing an M&E framework for your policy research project:

1. Go beyond outputs and uptake

Outputs – and to some extent uptake – are usually well captured in research M&E frameworks. But thinking through the diverse ways that uptake, outcomes and ‘impact beyond academia’ can take place, and how context can be monitored, is often more challenging.

The framework RAPID has been using for almost a decade (based on Ingie Hovland’s five-level framework, with the addition of a sixth level – ‘context’) tracks changes more closely and pays attention to the often neglected elements of strategy and management.

Though often overlooked, these two levels are hugely important and influential – especially in big, multi-year research programmes with multi-million budgets. Strategy and management are usually assessed intuitively by managers, but it is worth systematically and regularly (e.g. annually) reflecting on and assessing whether the project direction is still valid and whether the decision-making and governance systems in use are functioning and fit for purpose.
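To make this concrete, here is a minimal sketch – in Python, and not an official RAPID or Methods Lab tool – of how the six levels could be laid out as a simple annual-reflection checklist. The exact level names and grouping are assumptions based on the description above, and the prompts are purely illustrative.

```python
# A hypothetical sketch of the six assessment levels as an annual-reflection
# checklist. Level names/grouping are assumed from the text; prompts are illustrative.
from dataclasses import dataclass, field


@dataclass
class Level:
    name: str
    prompts: list[str] = field(default_factory=list)
    notes: str = ""  # filled in at each review point


FRAMEWORK = [
    Level("Strategy and direction", ["Is the project direction still valid?"]),
    Level("Management", ["Are decision-making and governance systems fit for purpose?"]),
    Level("Outputs", ["What has been produced, and to what quality?"]),
    Level("Uptake", ["Who is accessing, citing or sharing the work?"]),
    Level("Outcomes and impact", ["What changes beyond academia can plausibly be linked to the work?"]),
    Level("Context", ["What has changed in the policy environment since the last review?"]),
]


def annual_review(framework: list[Level]) -> None:
    """Print a checklist to structure a yearly reflection session."""
    for level in framework:
        print(f"== {level.name} ==")
        for prompt in level.prompts:
            print(f" - {prompt}")


if __name__ == "__main__":
    annual_review(FRAMEWORK)
```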

2. Start with questions and prioritise

As with any good evaluation plan, the design of an M&E framework should start with purposes and questions, not with methods, indicators or logframes. Beginning with indicators can lock you into what will be measured before you have thought through what you want to know (M&E questions) and why (purposes).

Questions have the power to help direct ‘sense-making’ and can support programme design. It is also better to prioritise and focus on one or two main M&E questions, and then support these with secondary questions that don’t all necessarily need to be answered every year or at each assessment point.
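As a rough illustration of putting questions before indicators, the sketch below separates one or two primary M&E questions, revisited at every assessment point, from secondary questions that are only assessed periodically. All question wording, purposes and frequencies here are hypothetical.

```python
# A hypothetical sketch of a question-led M&E plan: primary questions are
# assessed at every point, secondary questions only every few years.
from dataclasses import dataclass


@dataclass
class MEQuestion:
    text: str
    purpose: str                 # why we want to know this (learning, accountability, ...)
    priority: str                # "primary" or "secondary"
    assess_every_n_years: int = 1


QUESTIONS = [
    MEQuestion("To what extent are target policy actors engaging with the research?",
               purpose="learning and adaptation", priority="primary"),
    MEQuestion("Is partners' capacity to generate and use evidence changing?",
               purpose="accountability to the funder", priority="secondary",
               assess_every_n_years=2),
]


def due_this_year(questions: list[MEQuestion], project_year: int) -> list[MEQuestion]:
    """Return the questions scheduled for assessment in a given project year."""
    return [q for q in questions
            if q.priority == "primary" or project_year % q.assess_every_n_years == 0]


print([q.text for q in due_this_year(QUESTIONS, project_year=2)])
```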

3. There’s no one-size-fits-all solution (sorry)

Though many of the policy research projects and programmes we work with have similar characteristics, their nature and the contexts in which they operate differ significantly. Even within RAPID, we all use the M&E framework in diverse ways, highlighting different elements and steps.

Our guidance note doesn’t attempt to provide comprehensive instruction on all aspects of developing an M&E system (for example, how to collect, manage, analyse and use data); instead, it should be viewed not as set in stone but as flexible guidance to start and structure your thinking.

4. Be realistic with M&E plans and activities

Ultimately, many of the choices about the scope, intensity and timing of M&E activities will depend largely on the resources available – personnel, time and funds – as well as the capacity, experience and skills of the people dedicated to, and involved in, the M&E work.

It is better to be realistic and practical about what can be done and how much time people can truly spend on M&E activities, rather than trying to do everything possible but ending up doing it in a hasty or unsystematic manner.
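One practical way to test that realism is a back-of-the-envelope check of whether planned M&E activities fit within the person-days people can actually give them. The sketch below uses entirely hypothetical roles, time shares and estimates.

```python
# A back-of-the-envelope capacity check; all figures are hypothetical.
STAFF_TIME_SHARE = {                  # fraction of each person's time earmarked for M&E
    "project officer": 0.10,
    "M&E lead": 0.40,
    "partner focal points (x3)": 3 * 0.05,
}
WORKING_DAYS_PER_YEAR = 220

available_days = sum(STAFF_TIME_SHARE.values()) * WORKING_DAYS_PER_YEAR

PLANNED_ACTIVITIES = {                # estimated person-days per year, illustrative only
    "output and uptake tracking": 30,
    "annual outcome reflection workshop": 25,
    "context monitoring": 15,
    "strategy and management review": 10,
}
planned_days = sum(PLANNED_ACTIVITIES.values())

print(f"Available: {available_days:.0f} person-days; planned: {planned_days} person-days")
if planned_days > available_days:
    print("The plan exceeds capacity – scale back, phase or simplify activities.")
else:
    print("The plan fits within the time available.")
```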

This post was originally published here.

Tiina Pasanen

Tiina Pasanen is a researcher in the Research and Policy in Development (RAPID) programme at the Overseas Development Institute.



