Supervised Contrastive Learning Approach for Contextual Ranking
@article{Anand2022SupervisedCL,
  title   = {Supervised Contrastive Learning Approach for Contextual Ranking},
  author  = {Abhijit Anand and Jurek Leonhardt and Koustav Rudra and Avishek Anand},
  journal = {Proceedings of the 2022 ACM SIGIR International Conference on Theory of Information Retrieval},
  year    = {2022}
}
Contextual ranking models have delivered impressive performance improvements over classical models in the document ranking task. However, these highly over-parameterized models tend to be data-hungry and require large amounts of data even for fine-tuning. This paper proposes a simple yet effective method to improve ranking performance on smaller datasets using supervised contrastive learning for the document ranking problem. We perform data augmentation by creating training data using parts of…
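Although the abstract is truncated above, the objective it refers to, a batch-wise supervised contrastive loss applied during ranker fine-tuning, can be sketched as follows. This is a minimal illustration of the general SupCon-style loss, not the authors' released code; the tensor layout, the temperature value, and the assumption that augmented views of the same relevant query-document pair share a label are all illustrative choices.

```python
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Batch-wise supervised contrastive (SupCon-style) loss.

    embeddings: (N, d) representations of query-document pairs, e.g. [CLS]
                vectors of a cross-encoder; rows sharing a label are treated
                as positives (assumed here to be augmented views of the same
                relevant pair).
    labels:     (N,) integer group ids.
    """
    z = F.normalize(embeddings, dim=1)
    sim = z @ z.T / temperature                          # (N, N) similarities
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float("-inf"))      # drop self-similarity
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0)  # keep only positives
    per_anchor = -pos_log_prob.sum(dim=1) / pos_mask.sum(dim=1).clamp(min=1)
    # anchors with no positive in the batch are excluded from the mean
    return per_anchor[pos_mask.any(dim=1)].mean()
```

In practice such a term would typically be added to the usual cross-entropy ranking loss on the same batch, with the positives coming from the augmented document parts mentioned in the abstract.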
References
Learnt Sparsity for Effective and Interpretable Document Ranking
- Computer Science · ArXiv
- 2021
This paper introduces the select-and-rank paradigm for document ranking, which explicitly ensures interpretability when scoring longer documents by treating sentence selection as a latent variable trained jointly with the ranker from the final output.
Supervised Contrastive Learning for Pre-trained Language Model Fine-tuning
- Computer Science · ICLR
- 2021
This work proposes a supervised contrastive learning (SCL) objective for the fine-tuning stage of natural language understanding classification models and demonstrates that the new objective leads to models that are more robust to different levels of noise in the training data, and can generalize better to related tasks with limited labeled task data.
Contextualized Word Representations for Document Re-Ranking
- Computer Science
- 2019
This work investigates how two pretrained contextualized language models (ELMo and BERT) can be utilized for ad-hoc document ranking and proposes a joint approach that incorporates BERT's classification vector into existing neural models, showing that it outperforms state-of-the-art ad-hoc ranking baselines.
Neural Retrieval for Question Answering with Cross-Attention Supervised Data Augmentation
- Computer Science · ACL
- 2021
A supervised data-mining method that uses an accurate early-fusion model to improve the training of an efficient late-fusion retrieval model; the resulting retrieval model, trained with the additional data, significantly outperforms retrieval models trained directly on gold annotations.
Distant Supervision in BERT-based Adhoc Document Retrieval
- Computer Science · CIKM
- 2020
A weak-supervision-based transfer passage-labelling scheme that improves performance and gathers relevant passages from unlabelled documents, introducing passage-level weak supervision in contrast to standard document-level supervision.
Supervised Contrastive Learning
- Computer Science · NeurIPS
- 2020
Proposes a training methodology that consistently outperforms cross-entropy on supervised learning tasks across different architectures and data augmentations by modifying the batch contrastive loss, which has recently been shown to be very effective at learning powerful representations in the self-supervised setting.
xMoCo: Cross Momentum Contrastive Learning for Open-Domain Question Answering
- Computer Science · ACL
- 2021
This paper proposes a new contrastive learning method called Cross Momentum Contrastive learning (xMoCo), for learning a dual-encoder model for question-passage matching that efficiently maintains a large pool of negative samples like the original MoCo and enables using separate encoders for questions and passages.
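The momentum mechanism summarized here can be sketched generically; the snippet below is a MoCo-style exponential-moving-average update, not the xMoCo authors' code (xMoCo additionally crosses the updates so that the question encoder is trained against passage keys and vice versa).

```python
import torch

@torch.no_grad()
def momentum_update(fast_encoder: torch.nn.Module,
                    slow_encoder: torch.nn.Module,
                    m: float = 0.999) -> None:
    """EMA update of a momentum ("slow") encoder from its gradient-trained
    ("fast") twin; the slow encoder produces the keys kept in the large
    negative-sample queue."""
    for p_fast, p_slow in zip(fast_encoder.parameters(),
                              slow_encoder.parameters()):
        p_slow.data.mul_(m).add_(p_fast.data, alpha=1.0 - m)
```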
Pre-training Tasks for Embedding-based Large-scale Retrieval
- Computer Science · ICLR
- 2020
It is shown that the key ingredient of learning a strong embedding-based Transformer model is the set of pre-training tasks, and that with adequately designed paragraph-level pre-training tasks, Transformer models can remarkably improve over the widely used BM25 as well as embedding models without Transformers.
A Survey on Contrastive Self-supervised Learning
- Computer Science · Technologies
- 2020
This paper provides an extensive review of self-supervised methods that follow the contrastive approach, explaining commonly used pretext tasks in a contrastive learning setup, followed by different architectures that have been proposed so far.
More Robust Dense Retrieval with Contrastive Dual Learning
- Computer Science · ICTIR
- 2021
This paper analyzes the embedding space distributions and proposes an effective training paradigm, Contrastive Dual Learning for Approximate Nearest Neighbor (DANCE) to learn fine-grained query representations for dense retrieval.