Corpus ID: 219530980

DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations

@article{Giorgi2020DeCLUTRDC,
  title={DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations},
  author={John Michael Giorgi and Osvald Nitski and Gary D Bader and Bo Wang},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.03659}
}
We present DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations, a self-supervised method for learning universal sentence embeddings that transfer to a wide variety of natural language processing (NLP) tasks. Our objective leverages recent advances in deep metric learning (DML) and has the advantage of being conceptually simple and easy to implement, requiring no specialized architectures or labelled training data. We demonstrate that our objective can be used to pretrain…
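As a rough illustration of the kind of objective the abstract describes, the sketch below implements a generic InfoNCE-style contrastive loss with in-batch negatives, assuming anchor and positive spans are sampled from the same document and embedded by some sentence encoder (e.g., mean-pooled transformer outputs). The function name, temperature value, and batch setup are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(anchor_embeddings, positive_embeddings, temperature=0.05):
    """InfoNCE-style loss: each anchor should be most similar to its own
    positive span, with the other positives in the batch acting as negatives."""
    # Normalize so dot products become cosine similarities.
    anchors = F.normalize(anchor_embeddings, dim=-1)      # (batch, dim)
    positives = F.normalize(positive_embeddings, dim=-1)  # (batch, dim)

    # Pairwise similarity matrix between anchors and positives.
    logits = anchors @ positives.t() / temperature        # (batch, batch)

    # The matching positive for anchor i sits on the diagonal (index i).
    targets = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, targets)

# Usage sketch: embed two spans drawn from the same document with any
# encoder, then minimise the loss (random tensors stand in for embeddings).
batch, dim = 16, 768
anchor = torch.randn(batch, dim)
positive = torch.randn(batch, dim)
loss = contrastive_loss(anchor, positive)
```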
21 Citations

Data-Efficient Language-Supervised Zero-Shot Learning with Self-Distillation
CLEAR: Contrastive Learning for Sentence Representation
Bi-Granularity Contrastive Learning for Post-Training in Few-Shot Scene
Data Efficient Language-Supervised Zero-Shot Recognition with Optimal Transport Distillation (Rui Cheng, Joseph E. Gonzalez, Peihua Zhang, Hongqing Cheng, 2021)
TSDAE: Using Transformer-based Sequential Denoising Auto-Encoder for Unsupervised Sentence Embedding Learning
Capturing scientific knowledge in computable form
ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer
Contrastive Conditional Transport for Representation Learning (Huangjie Zheng, Xu Chen, +7 authors Mingyuan Zhou, ArXiv 2021)
Contrastive Pre-training for Imbalanced Corporate Credit Ratings
Disentangled Contrastive Learning for Learning Robust Textual Representations (Xiang Chen, Xin Xie, +5 authors Huajun Chen, ArXiv 2021)