ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT

@inproceedings{Khattab2020ColBERTEA,
  title={ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT},
  author={Omar Khattab and Matei Zaharia},
  booktitle={Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval},
  year={2020}
}
  • O. Khattab, M. Zaharia
  • Published 2020
  • Computer Science
  • Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval
  • Recent progress in Natural Language Understanding (NLU) is driving fast-paced advances in Information Retrieval (IR), largely owed to fine-tuning deep language models (LMs) for document ranking. While remarkably effective, the ranking models based on these LMs increase computational cost by orders of magnitude over prior approaches, particularly as they must feed each query-document pair through a massive neural network to compute a single relevance score. To tackle this, we present ColBERT, a…
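As a rough illustration of the "late interaction" the title and abstract refer to (a sketch, not the authors' implementation): ColBERT encodes the query and the passage into per-token embeddings independently, then scores relevance by summing, over query tokens, the maximum cosine similarity against the passage's token embeddings. The NumPy snippet below uses random arrays as stand-ins for BERT token embeddings; the function name and dimensions are placeholders.

```python
import numpy as np

def late_interaction_score(query_embs: np.ndarray, doc_embs: np.ndarray) -> float:
    """Sum-of-max-similarity scoring over contextualized token embeddings.

    query_embs: (num_query_tokens, dim) L2-normalized query token embeddings.
    doc_embs:   (num_doc_tokens, dim)   L2-normalized passage token embeddings.
    """
    # Cosine similarity of every query token with every passage token.
    sim = query_embs @ doc_embs.T              # (num_query_tokens, num_doc_tokens)
    # For each query token, keep its best-matching passage token ...
    max_per_query_token = sim.max(axis=1)      # (num_query_tokens,)
    # ... and sum over query tokens to obtain the relevance score.
    return float(max_per_query_token.sum())

# Toy usage with random, normalized vectors standing in for encoder outputs.
rng = np.random.default_rng(0)
q = rng.normal(size=(8, 128));   q /= np.linalg.norm(q, axis=1, keepdims=True)
d = rng.normal(size=(120, 128)); d /= np.linalg.norm(d, axis=1, keepdims=True)
print(late_interaction_score(q, d))
```

Because the passage embeddings do not depend on the query, they can be computed and indexed offline, which is the efficiency argument the abstract makes against feeding every query-document pair through the full network.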
    17 Citations
    • CoRT: Complementary Rankings from Transformers
    • Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation
    • Pretrained Transformers for Text Ranking: BERT and Beyond
    • Distilling Dense Representations for Ranking using Tightly-Coupled Teachers
    • Learning To Retrieve: How to Train a Dense Retrieval Model Effectively and Efficiently
    • BERT-QE: Contextualized Query Expansion for Document Re-ranking
    • Relevance-guided Supervision for OpenQA with ColBERT
    • SparTerm: Learning Term-based Sparse Representation for Fast Text Retrieval
    • PARADE: Passage Representation Aggregation for Document Reranking

    References

    • Context-Aware Sentence/Passage Term Importance Estimation For First Stage Retrieval
    • Multi-Stage Document Ranking with BERT
    • Passage Re-ranking with BERT
    • Learning to Match using Local and Distributed Representations of Text for Web Search
    • Anserini: Reproducible Ranking Baselines Using Lucene
    • Deep contextualized word representations
    • Okapi at TREC-3
    • An Updated Duet Model for Passage Re-ranking
    • Document Expansion by Query Prediction. arXiv preprint arXiv:1904.08375, 2019