Crowdsourcing for relevance evaluation

@article{Alonso2008CrowdsourcingFR,
  title={Crowdsourcing for relevance evaluation},
  author={Omar Alonso and Daniel E. Rose and Benjamin J. Stewart},
  journal={SIGIR Forum},
  year={2008},
  volume={42},
  pages={9--15}
}
Relevance evaluation is an essential part of the development and maintenance of information retrieval systems. Yet traditional evaluation approaches have several limitations; in particular, conducting new editorial evaluations of a search system can be very expensive. We describe a new approach to evaluation called TERC, based on the crowdsourcing paradigm, in which many online users, drawn from a large community, each performs a small evaluation task. 
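
The paradigm described in the abstract, in which a large evaluation job is broken into small tasks that many online users complete independently, can be illustrated with a minimal sketch: each (query, document) pair is judged redundantly by several workers, and the labels are merged by majority vote. The task structure, function name, and majority-vote aggregation below are assumptions chosen for illustration; they do not reproduce the paper's TERC design.

```python
# Minimal, hypothetical sketch of crowdsourced relevance evaluation:
# a judgment job is split into small tasks, each judged by several
# workers, and redundant labels are aggregated by majority vote.
from collections import Counter
from typing import Dict, List, Tuple

# A task is a (query, document_id) pair judged "relevant" / "not relevant".
Task = Tuple[str, str]

def aggregate_judgments(judgments: Dict[Task, List[str]]) -> Dict[Task, str]:
    """Collapse redundant worker labels for each task by majority vote."""
    return {task: Counter(labels).most_common(1)[0][0]
            for task, labels in judgments.items()}

if __name__ == "__main__":
    # Three hypothetical workers judged each small task independently.
    judgments = {
        ("crowdsourcing evaluation", "doc-17"): ["relevant", "relevant", "not relevant"],
        ("crowdsourcing evaluation", "doc-42"): ["not relevant", "not relevant", "not relevant"],
    }
    print(aggregate_judgments(judgments))
```

Redundant judging with a simple vote is only one possible aggregation rule; a real deployment would also need quality controls such as qualification tests or gold-standard questions, which the sketch omits.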