Corpus ID: 195873898

Unsupervised Data Augmentation for Consistency Training

@article{Xie2020UnsupervisedDA,
  title={Unsupervised Data Augmentation for Consistency Training},
  author={Qizhe Xie and Zihang Dai and Eduard Hovy and Minh-Thang Luong and Quoc V. Le},
  journal={arXiv: Learning},
  year={2020}
}
  • Qizhe Xie, Zihang Dai, Eduard Hovy, Minh-Thang Luong, Quoc V. Le
  • Published 2020
  • Computer Science, Mathematics
  • arXiv: Learning
  • Semi-supervised learning lately has shown much promise in improving deep learning models when labeled data is scarce. Common among recent approaches is the use of consistency training on a large amount of unlabeled data to constrain model predictions to be invariant to input noise. In this work, we present a new perspective on how to effectively noise unlabeled examples and argue that the quality of noising, specifically those produced by advanced data augmentation methods, plays a crucial role…
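The abstract describes consistency training: the model's predictions on an unlabeled example and on an augmented ("noised") version of it are pushed to agree, alongside the usual supervised loss on the labeled data. Below is a minimal sketch of such an objective in PyTorch. The function name `uda_style_loss`, the `augment` callable, and the `lambda_u` weight are illustrative assumptions, not the paper's released code, and the sketch omits refinements the full method uses (e.g. prediction sharpening, confidence masking, training-signal annealing).

```python
import torch
import torch.nn.functional as F

def uda_style_loss(model, x_labeled, y_labeled, x_unlabeled, augment, lambda_u=1.0):
    """Supervised cross-entropy plus a consistency term on unlabeled data.

    `augment` stands in for an advanced data augmentation (e.g. RandAugment
    for images or back-translation for text); it is an assumption here.
    """
    # Standard supervised loss on the small labeled batch.
    sup_loss = F.cross_entropy(model(x_labeled), y_labeled)

    # Predictions on the clean unlabeled batch serve as the target
    # distribution; no gradients flow through them.
    with torch.no_grad():
        p_clean = F.softmax(model(x_unlabeled), dim=-1)

    # Predictions on the augmented (noised) unlabeled batch.
    logp_aug = F.log_softmax(model(augment(x_unlabeled)), dim=-1)

    # KL(p_clean || p_aug): penalize disagreement between the two views.
    consistency = F.kl_div(logp_aug, p_clean, reduction="batchmean")

    return sup_loss + lambda_u * consistency

# Toy usage with a linear classifier and Gaussian-noise augmentation,
# both stand-ins chosen purely for illustration.
model = torch.nn.Linear(32, 10)
x_l, y_l = torch.randn(8, 32), torch.randint(0, 10, (8,))
x_u = torch.randn(64, 32)
loss = uda_style_loss(model, x_l, y_l, x_u,
                      augment=lambda x: x + 0.1 * torch.randn_like(x))
loss.backward()
```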
    263 Citations
    • On The Consistency Training for Open-Set Semi-Supervised Learning (Highly Influenced)
    • Hybrid Consistency Training with Prototype Adaptation for Few-Shot Learning
    • Pseudo-Representation Labeling Semi-Supervised Learning
    • Does Data Augmentation Benefit from Split BatchNorms
    • Milking CowMask for Semi-Supervised Image Classification (3 citations, Highly Influenced)
    • Distilling Effective Supervision From Severe Label Noise (16 citations)

    References

    Showing 1-10 of 84 references
    • Realistic Evaluation of Deep Semi-Supervised Learning Algorithms (313 citations)
    • Temporal Ensembling for Semi-Supervised Learning (591 citations, Highly Influential)
    • RandAugment: Practical data augmentation with no separate search (96 citations)
    • Regularization With Stochastic Transformations and Perturbations for Deep Semi-Supervised Learning (295 citations)
    • Unifying semi-supervised and robust learning by mixup (11 citations)
    • Smooth Neighbors on Teacher Graphs for Semi-Supervised Learning (118 citations)
    • Semi-Supervised Sequence Modeling with Cross-View Training (154 citations)
    • Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results (783 citations)