Research to Action

The Global Guide to Research Impact



Is understanding influence the solution to the impact puzzle?

By James Harvey 28/01/2013

The question is: how effective are indicators as a yardstick of impact? I have been troubled by this question for some time. It is entirely possible that the impact of a project or research programme will not become evident until many years after the initiative has been wrapped up. In fact, many impacts may not be immediately obvious, or directly attributable to the project at all. Some may remain invisible until large-scale qualitative research is commissioned, and the likelihood of that happening is hostage to budgetary constraints and to what is feasible within the overall project lifespan.

It is for this reason that we need a more nuanced debate on impact, one that releases us from the constraints of our logframes and opens the door to a more holistic, long-term perspective built on a deeper understanding of the role of influence.

UK academics will by now be all too familiar with the impending Research Excellence Framework (REF), which adds an impact component to the grading of research excellence across all participating higher education institutions (HEIs). The addition of this component represents a refinement of research assessment and a new focus on the public goods that emerge from research. The 20% weighting given to impact in the REF has sparked vigorous debate across the academic community, and one problem that keeps cropping up in these debates is that no one is entirely sure what impact actually is or, indeed, how to go about measuring it.

Impact, as a concept, is clearly important, but everyone appears to have their own conception of it. This matters greatly for how practitioners from all walks of life handle the data, metrics and indicators used to monitor and evaluate projects. The issue is not what data exists, but how it is interpreted, given how widely opinion (and cognition) can differ on what impact looks like. The same problem presents itself when trying to measure “success”: everyone has a different view of what the right indicators would be.

But our differing perspectives could actually be key to solving the impact riddle: the more perspectives we have, the greater our understanding will be. We do, however, need to recognise the limitations of our datasets and indicators, as they can narrow our focus and divert our attention from the bigger picture. In other words, while indicators and metrics are essential as an evidence or performance base, they tend to reinforce some perspectives at the expense of others that are equally valid for understanding how effects interrelate. Effects and outcomes will not always be measurable, or even visible – a dynamic that has thrown more than one logical framework into disarray.

The central issue here is that we need to rediscover our relationship with impact and dream anew. We need to step back from the quantitative and consider, however briefly, the possibility that impact is a phenomenon with a far longer timeline (or lifespan) than can be encompassed by indicators and metrics alone. Should we even attempt to audit impact in any definitive sense without first subjecting our performance indicators to scrutiny?

Understanding effects, outcomes and the linkages between them requires open-minded, critical thinking. Influence, as an effect, can form part of a broader, long-term view of a situation, but it is difficult to measure no matter how good one’s data is. Its significance for impact is also perceived differently by different audiences, and the word itself may not be understood by all. The debates that arose in the run-up to REF 2014 show just how slippery the concept can be.

Boiled down to its simplest terms, impact is an outcome: a result of an activity rather than an activity in itself. This may sound obvious, but the subtleties of the impact–influence nexus are not always plainly visible or easily understood. While it may feel adequate to frame impact in terms of direct or indirect effects, this often fails to account for how influence has been (and continues to be) achieved, or for how a project or programme is interpreted and understood by audiences, beneficiaries and practitioners through varying lenses of culture, politics, economics and tradition.

Achieving influence is therefore a sophisticated craft, and one that has not been adequately framed by the ongoing debate surrounding impact. Without an understanding of its relationship to outcomes, no framework, set of indicators or metrics can give us the full picture. Tangible results are essential for evaluating success, but we should avoid becoming preoccupied with them.

Related posts

EBPDN: Refreshing recommended resources - 31/10/2019
Building momentum to advance citizen evidence in policymaking - 03/09/2019
Bringing researchers and knowledge brokers together for greater impact - 29/05/2019


Topics: REF, Research Excellence Framework, research impact

James Harvey

James Harvey is the communications manager at the Forest Peoples Programme (FPP). FPP works with indigenous communities to create legal and political spaces for them to secure their rights, control their lands and decide their own futures. Free, prior and informed consent is the foundation upon which all of FPP's work is built.



