Corpus ID: 220302492

A Survey on Self-supervised Pre-training for Sequential Transfer Learning in Neural Networks

@article{Mao2020ASO,
  title={A Survey on Self-supervised Pre-training for Sequential Transfer Learning in Neural Networks},
  author={H. H. Mao},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.00800}
}
  • H. H. Mao
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
Deep neural networks are typically trained under a supervised learning framework in which a model learns a single task from labeled data. Instead of relying solely on labeled data, practitioners can harness unlabeled or related data, which is often more accessible and ubiquitous, to improve model performance. Self-supervised pre-training for transfer learning is becoming an increasingly popular technique for improving state-of-the-art results using unlabeled data. It involves first pre-training a…
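As a rough illustration of the two-stage recipe the abstract refers to (pre-train on unlabeled data, then transfer to a labeled downstream task), here is a minimal sketch, not the paper's specific method: it assumes a toy denoising pretext task and a small classification target task, and all module names, shapes, and hyperparameters are hypothetical.

# Minimal illustrative sketch: self-supervised pre-training of an encoder via
# input reconstruction (denoising), followed by sequential transfer in which the
# same encoder is fine-tuned with a new head on a small labeled task.
# All shapes and hyperparameters below are stand-ins, not values from the paper.
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))

# Stage 1: self-supervised pre-training on unlabeled data
decoder = nn.Linear(16, 32)                       # pretext head, discarded after pre-training
x_unlabeled = torch.randn(1024, 32)               # stand-in for an unlabeled corpus
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
for _ in range(100):
    noisy = x_unlabeled + 0.1 * torch.randn_like(x_unlabeled)    # pretext: denoise the input
    loss = nn.functional.mse_loss(decoder(encoder(noisy)), x_unlabeled)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: sequential transfer, fine-tuning the encoder plus a new task head on labeled data
classifier = nn.Linear(16, 3)                     # task head for a 3-class problem
x_labeled = torch.randn(64, 32)                   # small labeled dataset
y_labeled = torch.randint(0, 3, (64,))
opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()), lr=1e-4)
for _ in range(50):
    loss = nn.functional.cross_entropy(classifier(encoder(x_labeled)), y_labeled)
    opt.zero_grad(); loss.backward(); opt.step()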
2 Citations
Text-conditioned Transformer for automatic pronunciation error detection
