AboutOpen | 2022; 9: 1–2 | ISSN 2465-2628 | DOI: 10.33393/ao.2022.2363 | EDITORIAL
Altmetrics, Beamplots, Plum X Metrics and friends: discovering the new waypoints in the Science Metrics roadmap
Received: December 20, 2021
Accepted: December 20, 2021
Published online: January 19, 2022
© 2022 The Authors. This article is published by AboutScience and licensed under Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0).
Commercial use is not permitted and is subject to Publisher’s permissions. Full information is available at www.aboutscience.eu
Since its beginnings, the measurement of the impact of scientific publishing (and, more recently, of its societal impact) has triggered extensive debate within the scientific community, with new models regularly proposed and tested. Publication impact matters enormously: it can directly influence a research project or the chance of obtaining research funds, as well as the professional evaluation of a scientist or his/her department, and ultimately his/her career.
Traditional publication metrics and citation indexes were launched long before the internet became a familiar part of our lives: the Impact Factor (IF) (1) was launched in 1975 by the Institute for Scientific Information, and the H-Index (2) was proposed in 2005. They were followed by several models and projects, each based on different algorithms and criteria. Over time, however, these initial metrics proved insufficient for a robust, high-quality evaluation of research.
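To make the logic of these early indicators concrete: the H-Index proposed by Hirsch (2) is the largest number h such that h of an author's papers have each received at least h citations. The minimal Python sketch below illustrates that definition on an invented citation record; the function and data are purely illustrative and do not reproduce the implementation of any citation database.

```python
# Minimal sketch of the h-index as defined by Hirsch (2).
# The citation counts below are invented for illustration only.

def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

if __name__ == "__main__":
    example_record = [25, 8, 5, 4, 3, 1, 0]  # hypothetical per-paper citation counts
    print(h_index(example_record))  # 4: four papers with at least 4 citations each
```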
More recently, in 2012, the DORA Declaration (3) was launched with the aim of counteracting the widespread and contested use of the IF. It has been signed by editors, publishers and research organizations that support the adoption of responsible and ethical evaluation criteria (3). To strengthen the DORA Declaration, the Leiden Manifesto for research metrics (4) followed in 2015. It sets out 10 principles for the sound evaluation of scientific publications, among which the recommendation not to use the IF as the sole evaluation parameter.
A further driver of innovation in metrics has been the open science movement, with its central aim of ensuring free and open access to all resources, such as data and publications, in line with the principles outlined by Eu-citizen.science (5), the European online platform for sharing knowledge, tools, training and resources for citizen science.
This evolving scenario has contributed to new bibliometric initiatives, ranging from Altmetrics (6), which, unlike traditional indicators, generates alternative impact measures based on social media activity and coverage in relevant news outlets, to the broader concept of responsible metrics, designed ‘to ensure that indicators and underlying data infrastructure develop in ways that support the diverse qualities and impacts of research’ (7).
A good example of the application of this innovative approach is the Higher Education Funding Council for England (HEFCE) (8), which, as part of a broader review of the use of metrics in research evaluation, has developed a project on the use of alternative metrics in future iterations of the UK Research Excellence Framework (9). In the Netherlands, a novel approach to research evaluation is also being tested: it takes societal impact into account, understood as the ability to bring scientific outputs to society, and the consequent need to measure that impact differently. Another strong position has been taken by Utrecht University: starting from 2022, the institution will evaluate researchers according to the DORA Declaration principles, adopting new standards that include researchers’ engagement in teamwork and their efforts to support open science (10).
In view of all these new inputs, strongly supported by the scientific community, bibliometric indexes are evolving rapidly: even the long-established IF algorithm is changing to include early access articles. This recent decision by Clarivate Analytics stems from the awareness that most journals now publish new content online first, well in advance of a complete issue (whether in print or online). How this change will affect the IF will become apparent with its 2022 release.
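For reference, the long-established two-year IF of a journal in year Y is conventionally computed as the ratio below, as described by Garfield (1); how early access items are counted in the numerator and denominator is precisely what the Clarivate change affects.

$$
\mathrm{IF}_{Y} \;=\; \frac{\text{citations received in year } Y \text{ to items the journal published in years } Y-1 \text{ and } Y-2}{\text{number of citable items the journal published in years } Y-1 \text{ and } Y-2}
$$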
A thorough analysis of the bibliometric scenario should also consider the growing role of citation profiles from ORCID, ResearcherID and Scopus ID. Through dedicated evaluation platforms such as Elsevier’s SciVal, which provides access to the research performance of institutions and their associated researchers (11), and Clarivate’s Web of Science InCites, which allows users to analyse institutional productivity and monitor collaboration activity (12), these profiles are increasingly used for the evaluation of institutions and are becoming invaluable for their benchmarking.
We believe that all the topics mentioned above need to be analysed in further detail, with the objective of widening their understanding and applicability, especially in a scenario where even the European Commission, through its Horizon Europe programme, is demanding high-quality open access (13) and the application of evaluation criteria that measure the impact of science on society.
There is no doubt that terms like Altmetric, Clarivate’s Web of Science Author Impact Beamplots (14), Plum Analytics’ Plum X Metrics (15) and dozens of others will become familiar to researchers and readers alike. By creating a dedicated Science Metrics section in AboutOpen, we will try to closely monitor their developing use. We welcome contributions from the whole research community in this increasingly important field.
Disclosures
Conflict of interest: The author declares no conflict of interest.
Financial support: This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.
References
- 1. Garfield E. The history and meaning of the journal impact factor. JAMA. 2006;295:90-93. PubMed CrossRef
- 2. Hirsch JE. An index to quantify an individual’s scientific research output. Proc Natl Acad Sci USA. 2005;102(46):16569-16572. PubMed CrossRef
- 3. San Francisco Declaration on Research Assessment. Online (Accessed December 2021).
- 4. Hicks D, Wouters P, Waltman L, et al. Bibliometrics: The Leiden Manifesto for research metrics. Nature. 2015;520:429-431. PubMed CrossRef
- 5. Eu-citizen.science. Online (Accessed December 2021).
- 6. Priem J, Taraborelli D, Groth P, Neylon C. Altmetrics: a manifesto. 2010. Online (Accessed December 2021).
- 7. Responsible metrics. Online (Accessed December 2021).
- 8. UK Research and Innovation. Online (Accessed December 2021).
- 9. Research Excellence Framework. Online (Accessed December 2021).
- 10. Woolston C. Impact factor abandoned by Dutch university in hiring and promotion decisions. Nature. 2021;595:462. PubMed CrossRef
- 11. Scival (Elsevier). Online (Accessed December 2021).
- 12. Clarivate Web of Science, InCites Benchmarking & Analytics. Online (Accessed December 2021).
- 13. Horizon Europe, open science. Early knowledge and data sharing, and open collaboration. Online (Accessed December 2021).
- 14. Szomszor M. The Web of Science Author Impact Beamplots: A new tool for responsible research evaluation. March 2021. Online (Accessed December 2021).
- 15. Plum X Metrics. Online (Accessed December 2021).