Overview of the TREC 2010 Relevance Feedback Track (Notebook)

@inproceedings{Buckley2010OverviewOT,
  title={Overview of the TREC 2010 Relevance Feedback Track (Notebook)},
  author={Chris Buckley and Matthew Lease and Mark D. Smucker},
  year={2010}
}
This year, the Relevance Feedback track continued its examination of relevance feedback with a single-document relevance feedback task. Seven groups participated in the track. At this time, relevance judging is ongoing and no results are available. This notebook version of the track overview describes the track, presents its current status, and includes participant summaries.
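The notebook gives no implementation details for the task, but a minimal sketch may help orient readers: a common way to use a single judged-relevant document is Rocchio-style query expansion, where the document's term vector is folded into the original query. The function name, the alpha/beta weights, and the toy term vectors below are illustrative assumptions, not anything specified by the track or its participants.

```python
from collections import Counter

def rocchio_single_doc(query_terms, relevant_doc_terms, alpha=1.0, beta=0.75):
    """Expand a query from one judged-relevant document.

    Rocchio feedback with only the relevant component (no non-relevant
    documents, matching a single-document feedback setting). Term vectors
    are plain term-frequency mappings; alpha and beta are conventional
    textbook weights, not values taken from the track.
    """
    # Start from the original query, scaled by alpha.
    expanded = Counter({term: alpha * w for term, w in query_terms.items()})
    # Add the relevant document's terms, length-normalized and scaled by beta.
    doc_length = max(sum(relevant_doc_terms.values()), 1)
    for term, freq in relevant_doc_terms.items():
        expanded[term] += beta * freq / doc_length
    return dict(expanded)

# Hypothetical usage: one relevant document expands a two-term query.
query = {"relevance": 1.0, "feedback": 1.0}
doc = Counter("relevance feedback improves retrieval of relevant documents".split())
print(rocchio_single_doc(query, doc))
```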

