Knowing your audience

The best of the ‘Cup of Tea’ webinar series

11/07/2017

Research to Action recently held its first webinar series. The 30-minute webinars aimed to unpack a particular aspect of research uptake over a cup of tea, creating an accessible and informal space for participants to learn from the perspectives and experiences of one of Research to Action’s expert contributors. The conversations covered the four traditional building blocks of research uptake: stakeholder engagement, capacity building, communications, and monitoring, evaluation, and learning.

Below are details of the webinar recordings, slides, and text summaries available online, as well as our favourite words of wisdom from each webinar. Stay tuned for podcasts of each session on our AudioBoom account.

Episode 1 ‘An introduction to the research uptake and impact landscape’: A cup of tea with Saskia Gent, Insights for Impact

Saskia Gent surveyed the research uptake landscape and described 2009 as a pivotal year for the uptake and impact agenda, when DFID introduced the expectation that 10% of funding would be ring-fenced for uptake activities at all DFID-funded research centres. Saskia went on to give her five top tips for achieving research impact:

  1. Plan early and well
  2. Request resources in proposals
  3. Invest in relationships
  4. Be agile
  5. Be optimistic!

Saskia’s top resource recommendation was the Impact Lab, run by the Impact Initiative at the University of Sussex.

Poll results:

Episode 2 ‘Balancing competition, collaboration and impact in international development funding calls’: A cup of tea with Yaso Kunaratnam, UKCDS

Taking an unusual and often overlooked angle on stakeholder engagement, Yaso Kunaratnam discussed how funders of international development research can work better together, and with grantees, to foster research uptake and impact. Yaso’s advice to funders included:

  1. Hold back money for post-award activities
  2. Look at linking multiple calls to maximise impact
  3. Shift to a demand-led approach by scoping country needs first and being aware of the political context of the specific research topic
  4. Provide targeted support for independent knowledge brokers to look across bodies of evidence
  5. Support interdisciplinary research
  6. Provide better guidance on capacity building

You can read Yaso’s full report, ‘Striking the balance between competition, collaboration and impact (CCI) in international development funding calls & programmes’, online.

Poll results:

Episode 3 ‘Capacity Building and the DRUSSA programme’: A cup of tea with Diana Coates, Organisation Systems Design

Diana Coates shared lessons learned and her experiences from the five-year, DFID-funded DRUSSA programme, which aimed to build research uptake capacity across Sub-Saharan African universities. Diana’s top tips for building effective capacity to enable research uptake were:

  1. Interventions must be multi-level (at the individual, organisational, and institutional level) and must look at the facilitating environment around the research.
  2. Interventions should focus on meeting the expectations of the users of research, whether they are master’s students, the funders of the research, or the wider users of the research.
  3. Interventions should be mindful that audiences access information in different ways. Research uptake is about engagement and not merely presentation.

You can find all of the online courses, reports, and publications that DRUSSA produced through its collaborating institutions on the programme website, for example the Science Communication Introduction course run by CREST.

Poll results:

Episode 4 ‘Writing clubs in Sri Lanka’: A cup of tea with Dilshani Dissanayake, University of Colombo

Dilshani Dissanayake shared both her experiences of participating in an AuthorAID-facilitated writing club at the University of Colombo in Sri Lanka and the results of the initiative. Dilshani explained some of the challenges of communicating medical research, which included:

  1. Research communications training is inadequate and not formalised in higher education institutions.
  2. Researchers can be isolated in institutions and not exposed to training.
  3. Some researchers worry about language barriers, as English is the main language used to communicate research to the world.

Read more about how research writing clubs ensure sustainable skills development in this article on the INASP website.

Poll results:

Episode 5 ‘Monitoring, Evaluation and Learning at ODI RAPID’: A cup of tea with John Young, ODI RAPID

John Young spoke about the common mistakes that people make when undertaking monitoring and evaluation (M&E), and gave an overview of the M&E approach that the RAPID programme at ODI uses. John explained a number of tools that can be used to assess policy influence at different levels:

  1. Strategy and direction: log frames, Theories of Change, Impact Pathways
  2. Management: quality audits, horizontal evaluation, after-action reviews
  3. Outputs: peer review, evaluating websites, evaluating networks
  4. Uptake: impact logs, citation analysis, user surveys
  5. Outcomes and impacts: stories of change, most significant change, episode studies, performance stories
  6. Context: bellwether surveys, media monitoring, timelines

To find out more about M&E methods and approaches, read the Methods Lab’s Evaluation toolkit.

Poll results: