The Programme for the Enhancement of Research Information (PERI) was INASP’s flagship programme for ten years (2002-13). The programme formally ended in March 2013 but many aspects of it have been continued under new programmes, in particular the Strengthening Research and Knowledge Systems (SRKS) programme and the VakaYiko Consortium. The following interview with Alex Ademokun provides reflections on Evidence-Informed Policy Making (EIPM) through the PERI programme.
What was the PERI programme, and how is Evidence-Informed Policy Making (EIPM) considered within it?
The PERI programme was about supporting the research communication cycle, covering access, availability, dissemination and use. The EIPM element of the programme started with the premise that if research is to have a role in development then people who make development decisions should be able to access and use research evidence. The EIPM programme focused on supporting the capacity of decision makers to do this.
The PERI programme was succeeded by two new programmes – Strengthening Research and Knowledge Systems (SRKS), which started in 2012, and VakaYiko, which started in 2013.
Through SRKS, INASP works with a global network of partners to strengthen access to research – wherever it’s published – and to improve the publication and visibility of research from the South. We work with librarians, publishers, IT professionals, researchers and editors to improve online access, develop research communication skills, and strengthen editorial practice.
The VakaYiko consortium is the main EIPM project. VakaYiko works primarily in Ghana, South Africa and Zimbabwe supporting policy makers to recognise, articulate and act on research information needs. The consortium is led by INASP and comprises ODI, the Ghana Information Network for Knowledge Sharing (GINKS), the Human Sciences Research Council (HSRC) of South Africa and the Zimbabwe Evidence Informed Policy Network (ZEIPNET).
What are the three most important achievements of the Evidence-Informed Policy Making (EIPM)/PERI programme?
The three things that stand out for me when I review the programme are:
1. Working with institutional memory.
In developing our approaches we did not focus on building capacity at the highest level within the decision-making process. We recognised that building institutional memory – the potential for any improvements to stay in an organisation beyond staff turnover – meant building the capacity of mid-level staff who do the day-to-day job of searching for information and interpreting and summarising it for policy makers. Senior policy makers are more often than not subject to changes in the political mood. If you are able to build institutional memory then it can be drawn upon even when people leave.
2. The importance of supporting individuals.
We now talk about ‘levels’ of capacity building interventions – individual, organisational or systemic. These are helpful in thinking about projects but they are of course somewhat artificial. Change, whether organisational or systemic, is driven by individuals. It is important to identify those people who are interested in strengthening research use and who have the remit to do so, and to support them where we can.
3. This is going to sound clichéd but it is innovation: to adapt or try things out first, be iterative and create a space to learn and build relationships.
We have worked with many of our partners for a few years now, and it takes time to build relationships and trust – for our partners to recognise that we have something to offer. In parallel we are also learning. As a result, we hope we are better partners and are better placed to achieve more, together with the partners we support.
Can you give an example of the kind of innovation that you are talking about?
The approaches used in VakaYiko – working with the civil service centre in Ghana to develop a curriculum in skills for research uptake, running knowledge cafes in Zimbabwe to provide a space where the evidence base of key policies can be publicly discussed, or testing tools that improve research use in South Africa – come directly from work our partners have done. It wasn’t that we at INASP went off and dreamt it up in a retreat somewhere; it came from talking to people, trying it first, seeing what happened, adapting it a bit and using it to inform the next phase of our work. That’s not a testament to our work; it’s a testament to the originality and creativity of the people we work with, which is important. The approach in Ghana came out of work we did with a similar school in Tanzania. The work with knowledge cafes builds on previous work that ZEIPNET has carried out in Zimbabwe.
How are you able to sell this innovative approach to donors who are likely to be looking for tangible deliverables that are easily measured?
I think part of it is simply trust, history and experience, I suppose. INASP has done this over the last 20 years, thinking of how you build long term capacity by working with local institutions and individuals. Not trying to build new structures or bypass existing ones but instead trying to take approaches that strengthen what already exists. Over a period like that you build trust and relationships with donors and hopefully they recognise your experience and the validity of your approaches.
Evidence-informed policy making (EIPM) has grown out of the idea of evidence-based policy making (EBPM). Where do you think that whole concept is going to be in 10-15 years? Are we still going to be talking about EIPM?
We will probably call it something else, but the ideas will be the same – the underlying principle that decisions based on the best available evidence will be better decisions. The quality of evidence, and what constitutes evidence – that’s a philosophical conversation you’re always going to have – but the underlying principle will remain, although we may call it something else.
It may become the new normal, something that is just accepted and pushed for, but I think we will always come back to this conversation around the hierarchy of evidence, whose evidence counts, and indigenous knowledge. These are somewhat academic issues, because decisions will never be made solely on the basis of evidence, but they will keep coming back, over and over again. What it means in practice, I can’t really predict. But I think that most people, when pushed, will accept the underlying principle that a decision based on the best available evidence is probably better than one that isn’t, at its simplest.
You touch on indigenous knowledge there, which can be a bit of a hot potato – what are your views on it?
I hesitate to be too prescriptive because I recognise that policy makers are balancing a whole host of different things – expediency, timing, priorities etc. – and occasionally part of that balancing act will be indigenous knowledge. I sometimes worry that when we get into these conversations about the hierarchy of evidence, we have missed the point. Someone has to make a decision. To me the real issue is whether that person has the skills to balance all those inputs – as opposed to which particular input is the most important philosophically. Where you have access to a whole range of evidence, including rigorous, reproducible scientific evidence, then that is great, and part of the skill I mention is being able to weigh that against all other competing factors. However, you may find yourself in a situation where the only available evidence at the point of decision making is indigenous, and then you have to make a judgement call. Or, in the case of climate change, there is rigorous scientific evidence that human activities exacerbate changes in the climate, but that then has to play into local cultures and values. This is not a north/south thing; even on an issue with overwhelming scientific evidence and potential global significance, policy making struggles with culture, values, economic pressure, global competitiveness etc. in all parts of the world. You can’t wish it away. Whether you want to or not, it’s there.
It’s a lively debate and it’s one I’d like to have in a pub, but the people who are making the decisions, do they have the skills to separate those various factors? I think that is arguably more important.
I know there’s no such thing as a perfect system but can you give an example of where Evidence-Informed Policy Making (EIPM) has been seen to work or a process has been set up that has been really quite effective?
I guess the first point is: what do we mean by ‘work’? By ‘work’ I don’t mean that using evidence will give you the conclusion you want. EIPM isn’t about reaching a decision that the scientists or the researchers or the academics think you should reach; it’s about reaching a decision where you’ve considered the research evidence and weighed it up against other factors – culture, economics, political expediency etc. I think sometimes we get naive about it, but, whether you’re in the UK or Zimbabwe or Ghana, politicians are politicians; they have to win votes, and where you are in the election cycle counts, so evidence will be viewed by policy makers with that in mind. If we can strengthen the processes that allow research to feed into decision making and allow evidence and its consequences to be weighed up, that’s the best you can hope for.
In an ideal world, it should work when you have people with the skills working in an environment that supports and demands evidence. There’s a lot of talk around capacity. To me, capacity is one aspect, but if you’re in an environment that doesn’t make use of that capacity then it’s wasted. In an ideal world you would have a balance of people with skills, processes that demand evidence, and senior policy makers who value evidence.
One criticism of EIPM is that it is normative. I don’t see that as a bad thing and we should keep working towards it.
What legacy would you like to leave with your work?
Gosh, I’m too young for a legacy! I think one of the key things for organisations though, and I think it should be for all development organisations, is that you should want to not be needed. I think that’s very important. Or you want to adapt into something new because you’re not needed for the same things.