Information retrieval system evaluation: effort, sensitivity, and reliability

@inproceedings{Sanderson2005InformationRS,
  title={Information retrieval system evaluation: effort, sensitivity, and reliability},
  author={Mark Sanderson and Justin Zobel},
  booktitle={SIGIR},
  year={2005}
}
The effectiveness of information retrieval systems is measured by comparing performance on a common set of queries and documents. Significance tests are often used to evaluate the reliability of such comparisons. Previous work has examined such tests, but produced results with limited application. Other work established an alternative benchmark for significance, but the resulting test was too stringent. In this paper, we revisit the question of how such tests should be used. We find that the t…
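The abstract centers on per-query significance testing of system comparisons. As a minimal sketch of the paired t-test commonly used in this setting (the per-query scores and the helper name are illustrative, not taken from the paper), two systems' scores on the same queries can be compared like this:

```python
import math

def paired_t(scores_a, scores_b):
    """Paired t-statistic over per-query effectiveness scores
    (e.g. average precision) for two systems run on the same queries."""
    assert len(scores_a) == len(scores_b) and len(scores_a) > 1
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance (n - 1 in the denominator).
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

# Hypothetical per-query AP scores for two systems on five queries.
sys_a = [0.30, 0.42, 0.25, 0.50, 0.38]
sys_b = [0.28, 0.35, 0.20, 0.45, 0.33]
print(paired_t(sys_a, sys_b))  # t = 6.0; compare against a t-distribution with n-1 = 4 d.o.f.
```

The resulting statistic is checked against the t-distribution with n−1 degrees of freedom to decide whether the observed difference between the systems is significant.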