Multileaved Comparisons for Fast Online Evaluation

Anne Schuth, Floor Sietsma, Shimon Whiteson, Damien Lefortier, Maarten de Rijke
Evaluation methods for information retrieval systems come in three types: offline evaluation, which uses static data sets annotated for relevance by human judges; user studies, usually conducted in a lab-based setting; and online evaluation, which uses implicit signals such as clicks from actual users. In online evaluation, preferences between rankers are typically inferred from these implicit signals via interleaved comparison methods, which combine a pair of rankings and display the result to the user. We propose…
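The abstract's core mechanism, combining multiple rankings into a single displayed list and crediting clicks back to the contributing rankers, can be illustrated with a small sketch. This is a generic team-draft-style multileave, not the paper's exact algorithm; the function names and the simple one-credit-per-click scoring are illustrative assumptions.

```python
import random

def team_draft_multileave(rankings, length, rng=random):
    """Sketch of team-draft-style multileaving: in each round, rankers
    pick in random order, each contributing its highest-ranked document
    not yet in the combined list. Returns the combined list plus, per
    slot, the index of the ranker that contributed it."""
    combined, teams, seen = [], [], set()
    while len(combined) < length:
        progressed = False
        order = list(range(len(rankings)))
        rng.shuffle(order)  # randomize pick order each round for fairness
        for r in order:
            for doc in rankings[r]:
                if doc not in seen:
                    combined.append(doc)
                    teams.append(r)
                    seen.add(doc)
                    progressed = True
                    break
            if len(combined) == length:
                break
        if not progressed:  # all rankings exhausted; stop early
            break
    return combined, teams

def credit_clicks(teams, clicked_positions):
    """Illustrative scoring: one credit per click, assigned to the
    ranker that owns the clicked slot."""
    credits = {}
    for pos in clicked_positions:
        credits[teams[pos]] = credits.get(teams[pos], 0) + 1
    return credits
```

For example, multileaving `[["a","b","c"], ["b","d","a"]]` to depth 4 yields a single list drawing from both rankings, and clicks on it are credited to whichever ranker contributed the clicked slot; rankers earning more credits are inferred to be preferred.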
This paper has 47 citations.




