Corpus ID: 219721239

Big Self-Supervised Models are Strong Semi-Supervised Learners

@article{Chen2020BigSM,
  title={Big Self-Supervised Models are Strong Semi-Supervised Learners},
  author={Ting Chen and Simon Kornblith and Kevin Swersky and Mohammad Norouzi and Geoffrey E. Hinton},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.10029}
}
One paradigm for learning from few labeled examples while making best use of a large amount of unlabeled data is unsupervised pretraining followed by supervised fine-tuning. Although this paradigm uses unlabeled data in a task-agnostic way, in contrast to most previous approaches to semi-supervised learning for computer vision, we show that it is surprisingly effective for semi-supervised learning on ImageNet. A key ingredient of our approach is the use of a big (deep and wide) network during pretraining and fine-tuning.
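
The pretrain-then-fine-tune paradigm described in the abstract can be made concrete with a short sketch. The snippet below is not the authors' code: it assumes an encoder that has already been pretrained on unlabeled images in a task-agnostic way (e.g., with a contrastive objective) and shows only the supervised fine-tuning stage on a small labeled subset. The tiny `Encoder` backbone, the feature dimension, and the optimizer settings are illustrative placeholders, not values from the paper.

```python
# Minimal sketch of the paradigm: task-agnostic pretraining of a big encoder
# on unlabeled images, followed by supervised fine-tuning on few labels.
# Only the fine-tuning stage is shown; all sizes/hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Stand-in for a big (deep and wide) backbone such as a ResNet."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x):
        return self.net(x)

def fine_tune(pretrained_encoder, labeled_loader, num_classes,
              feat_dim=128, epochs=10, lr=1e-3):
    """Supervised fine-tuning: attach a classification head to the
    pretrained encoder and train on the small labeled set."""
    head = nn.Linear(feat_dim, num_classes)
    model = nn.Sequential(pretrained_encoder, head)
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for images, labels in labeled_loader:
            loss = F.cross_entropy(model(images), labels)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```

In the paper's setting the encoder would be a much larger ResNet pretrained on all of ImageNet without labels, and the labeled loader would hold only a small fraction of the labeled examples.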


Citations of this paper

Streaming Self-Training via Domain-Agnostic Unlabeled Images
FROST: Faster and more Robust One-shot Semi-supervised Training
SEED: Self-supervised Distillation For Visual Representation
On the Marginal Benefit of Active Learning: Does Self-Supervision Eat Its Cake?
Are Fewer Labels Possible for Few-shot Learning?
Self-supervised Pretraining of Visual Features in the Wild
Revisiting Unsupervised Meta-Learning: Amplifying or Compensating for the Characteristics of Few-Shot Tasks
Supervised Contrastive Learning for Pre-trained Language Model Fine-tuning
