Measuring the impact of research is notoriously difficult, partly because its effects on policy are hard to substantiate and partly because of the dominance of peer review as the ‘gold standard’. Using simple tools to track citations is one way of measuring the impact of research tangibly. Citations are useful not only because they yield statistics and calculated indices, but also because they show the areas where future marketing can be focused to maximise the impact of research.
The RAE did not define impact but gave a broad definition of four-star research as ‘quality that is world-leading in terms of originality, significance and rigour. This standard will be achieved by a research output that is, or is likely to become, a primary reference point of the field or subfield.’ The new REF specifically mentions impact as a research criterion and defines it as ‘an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia.’ A blog post on the LSE site explores the issues surrounding the new impact focus, and a further article discusses the shift in attention to sources of impact outside university circles.
A recently published handbook by the Impact of Social Sciences Group, based at LSE, takes a more basic understanding of the word, defining impact as ‘a recorded or otherwise auditable account of influence from academic research on another act or organisation’. In a nutshell, the handbook gives a comprehensive account of why citations are useful and how to use three different types of web-based tools:
- ISI Web of Knowledge or Scopus
- Google Scholar or Book Search (and Scirus)
- Harzing’s Publish or Perish (HPoP)
The handbook concludes that HPoP is the best tool for tracking citations. It eliminates ISI Web of Knowledge and Scopus as not inclusive enough: they focus too heavily on American academic journals while excluding books. The Google resources, on the other hand, index grey or black material and include more recent working papers, thereby remedying the time delay and narrow scope of the more traditional bibliometric systems. However, HPoP is still preferable because it improves on the raw Google search method, removing duplications and compensating for Google’s undisclosed indexing methods, which makes it more reliable. It also gives citations per year as well as the overall citation count. This blog post fully explains how to use HPoP.
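To make the statistics mentioned above concrete, here is a minimal sketch (not HPoP’s actual implementation, and the figures are invented for illustration) of how two of the metrics such tools report, the h-index and average citations per year, can be computed from a list of per-paper citation counts:

```python
# Illustrative sketch of two citation metrics reported by tools like HPoP.
# The paper counts below are hypothetical example data, not real results.

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

def citations_per_year(total_citations, first_pub_year, current_year):
    """Average citations per year since the first publication."""
    years = max(current_year - first_pub_year, 1)
    return total_citations / years

# Hypothetical researcher: six papers, first published in 2005.
papers = [25, 8, 5, 3, 3, 0]
print(h_index(papers))                              # 3 papers with >= 3 citations
print(citations_per_year(sum(papers), 2005, 2011))  # 44 citations over 6 years
```

The h-index rewards a body of consistently cited work rather than a single highly cited paper, which is one reason a books-inclusive source like Google Scholar can change a social scientist’s numbers substantially.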
For social scientists, the traditional bibliometric methods appear to be less useful. Their work is generally less ‘self-cited’, so citation counts will be lower, possibly due to a lack of ‘systematic reviewing’; more work is published in books; and less attention is paid to purely academic journals. This idea is fully explained in another post on the LSE site.
The heavy American bias of these bibliometric tools also presents a problem for the Southern academic. A quick Scopus search comparing well-known researchers based in the UK with those based in Africa yields informative results: fewer citations and acknowledged papers for the Southern academics, a problem that is particularly pertinent for those working in development. Another issue is that of media openness. Google Scholar and Book Search are ways to get around the paywalls and exclusivity of academic journals; however, their inclusion of grey material is often labelled unreliable. And since most books are not fully available electronically, they only partially solve the problem of access from less developed regions.
By Laura Ffrench-Constant
For a more up-to-date take on impact, and very interesting data on Southern academics, take a look at this excellent article: ‘Uncovering the world’s “unseen” science’, by Caroline Wagner from SciDev.