Use of citation metrics has become widespread but is fraught with difficulties. Some challenges relate to what citations and related metrics fundamentally mean and how they can be interpreted or misinterpreted as a measure of impact or excellence. Many other problems are of a technical nature and reflect a lack of standardization and accuracy on various fronts. Several different citation databases exist, many metrics are available, users mine them in different ways, self-reported data in curriculum vitae documents are often inaccurate and not professionally calculated, handling of self-citations is erratic, and comparisons between scientific fields with different citation densities are tenuous. To our knowledge, there is no large-scale database that systematically ranks all the most-cited scientists in each and every scientific field to a sufficient ranking depth. For example, Google Scholar allows scientists to create profiles and share them publicly, but not all researchers have created a profile. Clarivate Analytics provides every year a list of the most-cited scientists of the last decade, but the scheme uses a coarse classification of science into only 21 fields, and even the latest, expanded listing includes only about 6,000 scientists, i.e., fewer than 0.1% of the total number of people coauthoring scholarly papers. Moreover, self-citations are not excluded in these existing rankings.

Provenance: peer reviewed; not commissioned. Elsevier runs Scopus, which is the source of these data, and also runs Mendeley Data, where the database is now stored.
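The point about differing citation densities can be made concrete with a minimal sketch. This is not the paper's method; the function, the field means, and all numbers are hypothetical, illustrating only why raw citation counts are typically divided by a field's average before cross-field comparison.

```python
# Hypothetical illustration of field normalization (not the authors' method):
# the same raw citation count means very different things in fields with
# different citation densities, so counts are scaled by the field's mean.

def field_normalized_score(citations: int, field_mean: float) -> float:
    """Raw citations divided by the average citation count of the field."""
    return citations / field_mean

# Made-up field means: 50 citations is exceptional in a low-density field
# but only average in a high-density one.
low_density_score = field_normalized_score(50, field_mean=10.0)   # 5.0
high_density_score = field_normalized_score(50, field_mean=50.0)  # 1.0

print(low_density_score, high_density_score)
```

Under this toy normalization, identical raw counts yield a fivefold difference in relative standing, which is why unnormalized cross-field rankings are described above as tenuous.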
Competing interests: The authors have declared that no competing interests exist. JPAI is a member of the editorial board of PLoS Biology. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Citation: Ioannidis JPA, Baas J, Klavans R, Boyack KW (2019) A standardized citation metrics author database annotated for scientific field. PLoS Biol 17(8).

Copyright: © 2019 Ioannidis et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: The Meta-Research Innovation Center at Stanford (METRICS) has been funded by the Laura and John Arnold Foundation (funding to JPAI). The work of JPAI is also funded by an unrestricted gift from Sue and Bob O'Donnell.