Altmetrics (alternative metrics) are being hailed as a novel way to track the less tangible social impacts of research. Importantly too, altmetrics may provide a powerful tool to remedy the underrepresentation of Southern research in bibliometric data sets.
Sites including Altmetric and ImpactStory track impact by analysing the uptake of research within social media, in addition to using detailed citation counts and compiling the number of times an article is read or downloaded. In this way, criticisms levelled at traditional bibliometrics (citation counting and the h-index), such as an obsession with quantity over quality, may be overcome by a more nuanced picture of impact. Altmetrics give researchers and communications experts new insights into the uptake of research outputs, and into how and where research is being disseminated successfully. This even includes the “buzz” surrounding a research publication after it reaches the public via Twitter or blog posts.
Altmetrics may play a crucial role in tracking impact within the Global South, where a large and steadily growing body of work is vastly underrepresented in traditional bibliometric data sets. With only a small share of international journal production and a low percentage share of global research publishing, Southern research will inevitably fail to be picked up by traditional citation counts, and so will not qualify as having achieved impact in the traditional sense. An article by Willmers and Trotter argues that much research remains ‘invisible’ because it is not digitally curated or in journal format and is therefore never cited, even though it may have more potential impact than some journal articles.
Alternative methods of tracking impact could be central to monitoring increasingly open methods of scholarly communication, which tend to be more prevalent in the Global South for financial reasons. The Scholarly Communication in Africa Programme (SCAP) is one such initiative looking at how best to improve research communication in an age of Open Access by adopting alternative metrics to track impact.
A great article by Juan Pablo Alperin, ‘Ask not what Altmetrics can do for you, but what Altmetrics can do for Developing Countries’, outlines how the practical applications of altmetrics stretch beyond solely measuring non-scholarly output. Alperin sees altmetrics as a means of fostering academic communities and finally achieving an inclusive model of communicating research. Widespread use of altmetrics may even pressure the academic system into accepting different forms of scholarly output, helping those with scant resources whilst also giving greater emphasis to work with social relevance. This emphasis could result in a more engaged research process and, eventually, an improved incentive structure for funding.
This theme is echoed by Cameron Neylon, one of the co-authors of the Altmetrics Manifesto. Neylon tentatively suggests that altmetrics have a greater role to play in Africa, as African research is by its nature applied and more focused on public engagement.
However, altmetrics face some initial obstacles that may hinder widespread use. There is at present no regulatory agency to centralise information and, accordingly, figures can vary across different sites. Moreover, different indicators of impact can gain or lose popularity overnight in the fast-paced world of social media. As Alperin points out, there remains an inherent bias within altmetrics because social media is used to a greater extent in the North; care must be taken to incorporate simpler publishing technologies and databases in non-English languages to give greater representation to Southern research.
Despite these immediate shortfalls, alternative metrics remain a valuable tool that will become increasingly important and could have far reaching benefits to the Global South in advancing research visibility.
Image courtesy of [Chemistry World]
Seems like a neat idea, and it’s easy to point to the deficiencies of orthodox citation counting, but there is another perspective on this. Consider Jeffrey Beall’s comments on his Scholarly Open Access website:
“As someone who studies predatory open-access scholarly publishers, I can promise you that any system designed to measure impact at the article level will be gamed, rendering the metrics useless and invalid. For instance, there are already companies that sell Facebook likes…”
I think the critique of gaming is very pertinent and potentially damaging; however, all metrics, whether alternative or not, are subject to some level of scrutiny. In fields or disciplines where citation counts are typically low, altmetrics may provide greater scope to examine how far work travels. Gaming should certainly be kept in mind whenever metrics are used to bolster an argument. Altmetrics are not yet perfect, partly because the field is so new, which means these issues may well be resolved with time. I disagree with Beall’s final comment that altmetrics lack transparency: it is perhaps precisely because they are so widely accessible, and can be used by anyone for free, that they are being mistreated by some. Accessibility is surely a positive?