A Survey on Contrastive Self-supervised Learning

  • Authors: Ashish Jaiswal, A. R. Babu, Mohammad Zaki Zadeh, D. Banerjee, F. Makedon
  • Published: 2020 (arXiv preprint)
  • Field: Computer Science
  • Abstract: Self-supervised learning has gained popularity because it avoids the cost of annotating large-scale datasets. It adopts self-defined pseudo-labels as supervision and uses the learned representations for several downstream tasks. Specifically, contrastive learning has recently become a dominant component in self-supervised learning methods for computer vision, natural language processing (NLP), and other domains. It aims at embedding augmented versions of the same…
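The abstract's closing sentence describes the core mechanism of contrastive learning: pull the embeddings of two augmented views of the same sample together while pushing embeddings of other samples apart. A minimal sketch of such an objective (an InfoNCE/NT-Xent-style loss, as popularized by SimCLR; the function name and NumPy implementation below are illustrative, not taken from the survey):

```python
import numpy as np

def info_nce_loss(z_a, z_b, temperature=0.5):
    """Contrastive (InfoNCE-style) loss over a batch of embeddings.

    z_a, z_b: (N, D) arrays holding embeddings of two augmented views
    of the same N samples. Row i of z_a and row i of z_b form a
    positive pair; every other row in z_b acts as a negative.
    """
    # L2-normalize so dot products become cosine similarities.
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = z_a @ z_b.T / temperature  # (N, N) similarity matrix

    # Each row is a softmax classification problem whose correct
    # "class" is its own index (the positive pair on the diagonal).
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Well-aligned positive pairs drive the loss toward zero, while mismatched pairs are penalized, which is exactly the "pull together / push apart" behavior the abstract describes.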
    4 Citations (2 shown)
    • FROST: Faster and more Robust One-shot Semi-supervised Training (highly influenced by this survey)
    • Automated system to measure Tandem Gait to assess executive functions in children

    References (as listed, with citation counts)
    • Hard Negative Mixing for Contrastive Learning (15 citations)
    • What makes for good views for contrastive learning (91 citations)
    • Supervised Contrastive Learning (87 citations)
    • Selfie: Self-supervised Pretraining for Image Embedding (35 citations)
    • Scaling and Benchmarking Self-Supervised Visual Representation Learning (112 citations)
    • Demystifying Contrastive Self-Supervised Learning: Invariances, Augmentations and Dataset Biases (22 citations)
    • A Simple Framework for Contrastive Learning of Visual Representations (798 citations; highly influential)