Toward alternative metrics of journal impact: A comparison of download and citation data

@article{Bollen2005TowardAM,
  title={Toward alternative metrics of journal impact: A comparison of download and citation data},
  author={Johan Bollen and Herbert Van de Sompel and Joan A. Smith and R. Luce},
  journal={Inf. Process. Manag.},
  year={2005},
  volume={41},
  pages={1419--1440}
}
Visualization of the citation impact environments of scientific journals: An online mapping exercise
TLDR
Aggregated journal–journal citation networks based on the Journal Citation Reports 2004 of the Science Citation Index and the Social Science Citation Index are made accessible from the perspective of any of these journals.
Going beyond Citations: SERUM — a new Tool Provided by a Network of Libraries
TLDR
The goal is to provide an analytical tool called Standardized Electronic Resource Usage Metrics (SERUM) which is comparable to the Journal Citation Reports (JCR), but which makes use of download data instead of citation data.
Usage Impact Factor: the effects of sample characteristics on usage-based impact metrics
TLDR
It is observed that as the number of graduate students and faculty increases in a particular discipline, Usage Impact Factor rankings will converge more strongly with the ISI Impact Factor.
Relationship between downloads and citations at journal and paper levels, and the influence of language
TLDR
The results showed that downloads have limited utility as predictors of citations, since correlations are weakest in the early years after publication, precisely when prediction would be most useful.
Do Download Reports Reliably Measure Journal Usage? Trusting the Fox to Count Your Hens?
TLDR
It is found that controlling for number of articles, publisher, and year of download, the ratio of downloads to citations differs substantially among academic disciplines, suggesting that currently available download statistics are not sufficiently reliable to allow libraries to make subscription decisions based on price and reported downloads.
Relationship between electronic journal downloads and citations in library consortia
TLDR
The relationship between electronic journal downloads and citations is examined, along with whether online electronic resource usage can be adopted as an alternative to citations for evaluating scholarly discourse, in order to establish the relationship between downloads, impact factor (IF), and price.
Journal impact and proximity: An assessment using bibliographic features
TLDR
A multiple linear regression model was built to predict the journal impact factor (JIF) based on all the collected bibliographic features, which differed considerably from the clusters based on editorial board membership.
Eigenfactor: ranking and mapping scientific knowledge
TLDR
The Eigenfactor algorithm is proposed, which takes into account not only how many citations a journal receives but also where those citations come from, much as Google ranks web pages; instead of ranking websites the authors rank journals, and instead of hyperlinks they use citations.
...
...

References

SHOWING 1-10 OF 86 REFERENCES
Mathematical relations between impact factors and average number of citations
  • L. Egghe · Mathematics · Inf. Process. Manag. · 1988
Improving the accuracy of Institute for Scientific Information's journal impact factors
TLDR
Evidence is presented that for a considerable number of journals the values of the impact factors published in ISI's Journal Citation Reports (JCR) are inaccurate, particularly for several journals having a high impact factor.
ISI's Impact Factor as Misnomer: A Proposed New Measure to Assess Journal Impact
The purpose of this communication is to discuss a widely used measure of journal impact that is defined by the Institute for Scientific Information (ISI) and available for thousands of journals in the
Sense and nonsense about the impact factor.
  • T. Opthof · Education · Cardiovascular Research · 1997
Sense and nonsense of science citation analyses: comments on the monopoly position of ISI and citation inaccuracies. Risks of possible misuse and biased citation and impact data.
Journal editors and publishers, authors of scientific papers, research directors, university and research council administrators, and even government officials increasingly make use of so-called
Journal Evaluation: Technical and Practical Issues
This essay provides an overview of journal evaluation indicators. It highlights the strengths and weaknesses of different indicators, together with their range of applicability. The definition of a
Open Citation Linking: The Way Forward
TLDR
The paper describes the broad scope of the Open Citation project, showing how it has progressed from early demonstrators of reference linking to produce Citebase, a Web-based citation and impact-ranked search service, and how the project has supported the development of the EPrints.org software for building OAI-compliant archives.
Results from a web impact factor crawler
TLDR
The principal findings were that with certain restrictions, WIFs can be calculated reliably, but do not correlate with accepted research rankings owing to the variety of material hosted on university servers.
Assessing the quality of scholarly journals in Linguistics: An alternative to citation-based journal impact factors
TLDR
Methods were developed to allow quality assessment of academic research in linguistics across all sub-disciplines, and the potential for applying bibliometric methods in output assessments is discussed.
The bibliometric properties of article readership information: Research Articles
TLDR
The age biases of both reads and cites are shown, and two new bibliometric measures with substantially less age bias than citations are developed: SumProd, a weighted sum of total citations and the readership rate, intended to show the total productivity of an individual; and Read10, the readership rate for articles published in the last 10 years, intended to show an individual's current productivity.
...
...