Crowdsourcing for relevance evaluation

@article{alonso-crowdsourcing-relevance,
  title={Crowdsourcing for relevance evaluation},
  author={Omar Alonso and Daniel E. Rose and Benjamin J. Stewart},
  journal={SIGIR Forum},
}
Relevance evaluation is an essential part of the development and maintenance of information retrieval systems. Yet traditional evaluation approaches have several limitations; in particular, conducting new editorial evaluations of a search system can be very expensive. We describe a new approach to evaluation called TERC, based on the crowdsourcing paradigm, in which many online users, drawn from a large community, each perform a small evaluation task.
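The core idea above — splitting a relevance evaluation into many small tasks and combining the answers — can be sketched in a few lines. This is only an illustrative sketch, not the paper's TERC method: the `aggregate_judgments` helper and the majority-vote aggregation rule are assumptions chosen for simplicity, though majority voting is a common baseline for combining crowd labels.

```python
from collections import Counter

def aggregate_judgments(judgments):
    """Combine per-document relevance labels from many workers by majority vote.

    judgments: dict mapping doc_id -> list of labels collected from workers.
    Returns a dict mapping doc_id -> consensus label.
    (Hypothetical helper for illustration; not from the paper.)
    """
    consensus = {}
    for doc_id, labels in judgments.items():
        # most_common(1) yields the single most frequent label and its count
        label, _count = Counter(labels).most_common(1)[0]
        consensus[doc_id] = label
    return consensus

# Hypothetical data: three workers each judge two documents for one query
judgments = {
    "doc1": ["relevant", "relevant", "not_relevant"],
    "doc2": ["not_relevant", "not_relevant", "relevant"],
}
print(aggregate_judgments(judgments))
# → {'doc1': 'relevant', 'doc2': 'not_relevant'}
```

In practice, crowdsourced evaluations typically add quality controls (e.g., gold-standard questions or worker-agreement weighting) on top of simple voting, since individual workers vary widely in reliability.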