
Adapting Binary Information Retrieval Evaluation Metrics for Segment-based Retrieval Tasks

@article{Aly2013AdaptingBI,
  title={Adapting Binary Information Retrieval Evaluation Metrics for Segment-based Retrieval Tasks},
  author={R. Aly and Maria Eskevich and R. Ordelman and G. Jones},
  journal={ArXiv},
  year={2013},
  volume={abs/1312.1913}
}
This report describes metrics for evaluating the effectiveness of segment-based retrieval, based on existing binary information retrieval metrics. These metrics are described in the context of a video segment hyperlinking task. The evaluation approach re-uses existing evaluation measures from the standard Cranfield evaluation paradigm. Our adaptation approach can in principle be used with any kind of effectiveness measure that uses binary relevance, and for other segment-based…
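As a rough illustration of the idea in the abstract, the sketch below maps segment-based retrieval results onto binary relevance by temporal overlap with ground-truth relevant segments, and then reuses a standard binary measure (average precision). The segment representation, overlap rule, normalization, and example data are assumptions for illustration only, not the paper's exact definitions.

```python
# Minimal sketch (assumed definitions, not the authors' exact metrics):
# a retrieved segment is treated as binary-relevant if it temporally
# overlaps a ground-truth relevant segment, so that a standard binary
# measure such as average precision can be reused unchanged.

def overlaps(seg, gold_segments, min_overlap=0.0):
    """Return True if segment (start, end) overlaps any ground-truth
    relevant segment by more than `min_overlap` seconds."""
    s0, s1 = seg
    return any(min(s1, g1) - max(s0, g0) > min_overlap for g0, g1 in gold_segments)

def average_precision(ranked_segments, gold_segments):
    """Binary average precision over a ranked list of segments, using the
    overlap rule above as the relevance judgement. For simplicity this
    normalizes by the number of relevant segments retrieved."""
    hits, score = 0, 0.0
    for rank, seg in enumerate(ranked_segments, start=1):
        if overlaps(seg, gold_segments):
            hits += 1
            score += hits / rank
    return score / max(hits, 1)

# Hypothetical query: the relevant video content lies in the 60-90 s
# interval; the system returns three segments in ranked order.
gold = [(60.0, 90.0)]
ranked = [(55.0, 70.0), (100.0, 120.0), (85.0, 95.0)]
print(average_precision(ranked, gold))  # ~0.833 under these assumptions
```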
    24 Citations


    • CUNI at TRECVID 2015: Video Hyperlinking Task
    • Context in Video Search: Is Close-by Good Enough When Using Linking?
    • DCLab at MediaEval2014 Search and Hyperlinking Task
    • DCU ADAPT @ TRECVid 2015: Video Hyperlinking Task
    • Audio Information for Hyperlinking of TV Content
    • Effective video hyperlinking by means of enriched feature sets and monomodal query combinations

    References

    • New Metrics for Meaningful Evaluation of Informally Structured Speech Retrieval
    • Do user preferences and evaluation measures line up?
    • I Come Not To Bury Cranfield, but to Praise It
    • trec_eval software, available at http://trec.nist.gov/trec_eval, 2006