Bibliometrics: The Leiden Manifesto for research metrics

@article{Hicks2015BibliometricsTL,
  title={Bibliometrics: The Leiden Manifesto for research metrics},
  author={Diana Hicks and Paul Wouters and Ludo Waltman and Sarah de Rijcke and Ismael Rafols},
  journal={Nature},
  year={2015},
  volume={520},
  pages={429-431}
}
Use these ten principles to guide research evaluation, urge Diana Hicks, Paul Wouters and colleagues.

23rd Nordic Workshop on Bibliometrics and Research Policy 2018 Book of abstracts

Initiated by Professors Olle Persson and Peter Ingwersen, the annual Nordic workshops on bibliometrics have been arranged by bibliometric researchers in the Nordic countries since 1996. The general scope of the ...

A Review of the Citation Indicators of the Ingeniería e Investigación Journal

TLDR
This paper shows readers of the Ingeniería e Investigación journal how the journal's indicators have evolved in three of the citation databases.

Rethinking impact factors: better ways to judge a journal

We need a broader, more-transparent suite of metrics to improve science publishing, say Paul Wouters, colleagues and co-signatories.

Tendencies on Traditional Metrics.

TLDR
This paper reviews the evolution of traditional citation-based metrics for journals, articles and researchers.

Are all paper citations equal?

TLDR
A couple of years ago my colleague Jan Anne Annema told me that he checked why people cited the authors' paper on experiences with the use of Cost–Benefit Analysis (CBA) in the Neth...

How to counter undeserving authorship

The average number of authors listed on contributions to scientific journals has increased considerably over time. While this may be accounted for by the increased complexity of much research ...

The Journal Impact Factor Should Not Be Discarded

TLDR
This opinion piece argues that the JIF should not be demonized and can still be employed for research evaluation purposes, provided the context and academic environment are carefully considered.

Metrics and Rankings: Myths and Fallacies

TLDR
This paper provides an introduction to the field of Bibliometrics, briefly describing its beginning and its evolution, and categorizing metrics according to their entity scope: metrics for journals, conferences and authors.

ResearchGate como fuente de evaluación científica: desvelando sus aplicaciones bibliométricas (ResearchGate as a source of scientific evaluation: unveiling its bibliometric applications)

TLDR
The objective of this work is to discuss the main advantages and disadvantages of ResearchGate's indicators, paying special attention to the RG Score, ResearchGate's flagship indicator, and finding that it does not measure the prestige of researchers but rather their level of participation in the platform.

References

Showing 10 of 22 references.

Which h-index? — A comparison of WoS, Scopus and Google Scholar

This paper compares the h-indices of a list of highly-cited Israeli researchers based on citation counts retrieved from the Web of Science, Scopus and Google Scholar, respectively. In several cases ...
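As a hypothetical illustration of why such comparisons matter (the sketch and its citation numbers are ours, not the paper's data), the following Python snippet computes an h-index from per-paper citation counts and applies it to the same five papers under three invented sets of counts standing in for WoS, Scopus and Google Scholar:

# Hypothetical sketch: the h-index is the largest h such that h of an author's
# papers each have at least h citations. All citation numbers below are invented.
def h_index(citations):
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# The same five papers, with citation counts as three different sources might report them.
counts = {
    "WoS":            [45, 30, 12, 6, 2],
    "Scopus":         [50, 33, 15, 7, 3],
    "Google Scholar": [80, 41, 20, 11, 5],
}
for source, citations in counts.items():
    print(source, "h-index =", h_index(citations))   # 4, 4 and 5 respectively

In this toy example the more generous citation counts push the same publication list over the h-threshold at a higher rank, which is the kind of discrepancy between databases the paper examines.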

Beyond bibliometrics: harnessing multidimensional indicators of scholarly impact

Bibliometrics has moved well beyond the mere tracking of bibliographic citations. The web enables new ways to measure scholarly productivity and impact, making available tools and data that can ...

Reception of Spanish sociology by domestic and foreign audiences differs and has consequences for evaluation

In this article, we compare the reception of Spanish sociology by domestic and international audiences using citation counts as an indicator of audience interest. We compare papers highly cited in a ...

An index to quantify an individual’s scientific research output that takes into account the effect of multiple coauthorship

I propose the index $\hbar$ ("hbar"), defined as the number of papers of an individual that have a citation count larger than or equal to the $\hbar$ of all coauthors of each paper, as a useful index of an individual's scientific research output that takes into account the effect of multiple coauthorship.
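Read as a formula (our formalization of the sentence above, not quoted from the paper; the symbols $P_a$, $c(p)$ and $A(p)$ are ours), for an author $a$ with paper set $P_a$, citation count $c(p)$ and author set $A(p)$ for each paper $p$:

$$\hbar_a = \left|\{\, p \in P_a : c(p) \ge \hbar_b \ \text{for all } b \in A(p) \,\}\right|$$

Because each author's $\hbar$ appears on both sides through the coauthor sets, the definition is self-consistent and has to be solved jointly over the coauthor network rather than computed paper by paper.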

Why the impact factor of journals should not be used for evaluating research

TLDR
Alternative methods for evaluating research are being sought, such as citation rates and journal impact factors, which seem to be quantitative and objective indicators directly related to published science.

The outflow of academic papers from China: why is it happening and can it be stemmed?

Helped by deepening reform and openness, as well as increased overall national strength, science and technology in China have developed rapidly in recent years. The number of published scientific ...

The history and meaning of the journal impact factor.

TLDR
The journal impact factor was created to help select additional source journals and is based on the number of citations in the current year to items published in the previous 2 years, which allows for the inclusion of many small but influential journals.
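Written out (our notation; the denominator of citable items is part of the standard two-year construction the sentence above describes), the impact factor of a journal in year $y$ is

$$\mathrm{JIF}_y = \frac{C_y(y-1) + C_y(y-2)}{N_{y-1} + N_{y-2}},$$

where $C_y(y-k)$ is the number of citations received in year $y$ by items the journal published in year $y-k$, and $N_{y-k}$ is the number of citable items published in year $y-k$. For example, a journal that published 40 citable items in 2013 and 60 in 2014, and received 400 citations to those items during 2015, would have a 2015 impact factor of 400 / 100 = 4.0.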

The Leiden ranking 2011/2012: Data collection, indicators, and interpretation

TLDR
The Leiden Ranking is compared with other global university rankings, in particular the Academic Ranking of World Universities (commonly known as the Shanghai Ranking) and the Times Higher Education World University Rankings, and the comparison focuses on the methodological choices underlying the different rankings.