Altmetric.com – TOP 100 in 2017
Altmetric.com has published a list of the TOP 100 articles that generated the most attention in 2017 according to the quantitative indicators calculated based on mentions in the news, social networks, Wikipedia and other sources tracked by Altmetric.com. The overview of the TOP 100 articles in the previous years (beginning with 2013) can also be found on the Altmetric.com website.
It is immediately apparent that articles in the medical sciences prevail among the TOP 100, and that this edition of the list includes no articles in the humanities. The most represented journal is Nature, with as many as 16 articles. Interestingly, a little over half the articles (51) were published in subscription-based journals, as opposed to 31 Open Access and 18 free-to-read articles.
Soon after the TOP 100 articles had been announced, a comment appeared on Scholarly Kitchen drawing attention to questionable aspects of the methodology used by Altmetric.com. These include the arbitrary weighting of the different sources of information, the prevailing reliance on news outlets and Twitter, and the absence of data from important sources that Altmetric.com cannot track – in most cases due to the lack of access to data (e.g. ResearchGate, LinkedIn).
Based on data collected from the tracked sources, Altmetric.com generates an aggregate quantitative indicator – the Altmetric Attention Score (AAS). A higher score is assumed to indicate greater attention attracted online. Mentions in different sources are weighted differently (e.g. mentions in news outlets are weighted by 8, Wikipedia citations by 3, mentions on Twitter by 1, etc.).
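The weighting scheme can be illustrated with a minimal sketch. This is only a simplified model built on the three example weights quoted above; the actual Altmetric Attention Score is calculated by Altmetric.com's own algorithm, which also accounts for factors such as the reach and audience of individual mentions, so the function below should not be read as their real formula.

```python
# Illustrative weights taken from the examples above; other tracked
# sources (policy documents, blogs, etc.) have their own weights.
WEIGHTS = {"news": 8, "wikipedia": 3, "twitter": 1}

def attention_score(mentions):
    """Return a simple weighted sum of mentions.

    mentions: dict mapping a source name to its mention count.
    Sources without a known weight contribute nothing.
    """
    return sum(WEIGHTS.get(source, 0) * count
               for source, count in mentions.items())

# Example: 2 news stories, 1 Wikipedia citation, 5 tweets
print(attention_score({"news": 2, "wikipedia": 1, "twitter": 5}))  # → 24
```

Under this model, a single news mention outweighs eight tweets, which is precisely the kind of arbitrariness the Scholarly Kitchen comment questions.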
The limited geographical coverage is also an issue: Altmetric.com covers only news outlets in English. As a result, content created by or associated with academic communities beyond the Anglo-Saxon cultural circle has considerably poorer chances of reaching a high Altmetric Attention Score.
This also applies to Serbia, where the situation is even more complicated because a significant number of journal articles are not assigned DOIs and therefore cannot be tracked by Altmetric.com and similar services. Even for articles (from Serbia and other similar countries) that can be tracked, it is often difficult to find any altmetric data at all. When an AAS can be calculated, it is rarely greater than 10 and is most often based solely on Wikipedia citations. A look at this series of small numbers reveals something absurd: in the Altmetric.com world, articles uploaded (possibly by their authors) to Scribd are treated as mentions in news outlets and are thus "valued more" than articles cited in Wikipedia or in a policy document.
The information provided by Altmetric.com and similar services should be taken with a grain of salt, bearing in mind that both the data collection methodology and the score structure are open to debate. At the same time, there is an indisputable need for tools that cover areas beyond traditional bibliometric analysis, and it is reasonable to expect that they will keep changing and improving. It is up to the academic community to be aware of them, to get involved in their development whenever possible through comments and research, and to help define their place in the process of tracking the impact of research in the community.