Pseudo-Label : The Simple and Efficient Semi-Supervised Learning Method for Deep Neural Networks
@inproceedings{Lee2013PseudoLabelT,
  title     = {Pseudo-Label : The Simple and Efficient Semi-Supervised Learning Method for Deep Neural Networks},
  author    = {Dong-Hyun Lee},
  booktitle = {ICML 2013 Workshop on Challenges in Representation Learning},
  year      = {2013}
}
We propose a simple and efficient method of semi-supervised learning for deep neural networks. The proposed network is trained in a supervised fashion on labeled and unlabeled data simultaneously. For unlabeled data, Pseudo-Labels, obtained by simply picking the class with the maximum predicted probability, are used as if they were true labels. This is in effect equivalent to Entropy Regularization: it favors a low-density separation between classes, a commonly assumed prior for semi-supervised learning.
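The recipe in the abstract, taking the arg-max class of the network's prediction on unlabeled data and weighting the resulting loss term by a slowly ramped coefficient, can be sketched as follows. This is a minimal NumPy sketch, not the paper's implementation; the function names are illustrative, and the schedule constants (T1 = 100, T2 = 600, alpha_f = 3) follow the MNIST setting reported in the paper.

```python
import numpy as np

def pseudo_labels(probs):
    """Pick the class with maximum predicted probability for each sample."""
    return np.argmax(probs, axis=1)

def alpha_schedule(t, t1=100, t2=600, alpha_f=3.0):
    """Linearly ramp the unlabeled-loss weight from 0 to alpha_f
    between epochs t1 and t2 (deterministic annealing)."""
    if t < t1:
        return 0.0
    if t < t2:
        return (t - t1) / (t2 - t1) * alpha_f
    return alpha_f

def combined_loss(labeled_probs, labels, unlabeled_probs, alpha):
    """Supervised cross-entropy plus alpha-weighted cross-entropy
    against the pseudo-labels of the unlabeled batch."""
    eps = 1e-12  # numerical guard for log(0)
    sup = -np.mean(np.log(labeled_probs[np.arange(len(labels)), labels] + eps))
    pl = pseudo_labels(unlabeled_probs)
    unsup = -np.mean(np.log(unlabeled_probs[np.arange(len(pl)), pl] + eps))
    return sup + alpha * unsup
```

In practice the pseudo-labels are recomputed after every weight update, and the ramp-up keeps the (initially noisy) unlabeled term from dominating early training.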
1,434 Citations
Pseudo-Labeling Using Gaussian Process for Semi-Supervised Deep Learning
- Computer Science2018 IEEE International Conference on Big Data and Smart Computing (BigComp)
- 2018
This work proposes a simple and novel method of utilizing unlabeled data for semi-supervised learning to improve the performance of deep learning models, generalizes the proposed pseudo-labeling method to any classifier with good performance, and offers advice on pseudo-labeling method selection.
Semi-supervised Deep Learning Using Improved Unsupervised Discriminant Projection
- Computer ScienceICONIP
- 2019
The unsupervised discriminant projection algorithm from dimensionality reduction is modified and applied as a regularization term in a new semi-supervised deep learning algorithm, which exploits both the local and non-local distribution of abundant unlabeled samples to improve classification performance.
Pseudo-label Selection for Deep Semi-supervised Learning
- Computer Science2020 IEEE International Conference on Progress in Informatics and Computing (PIC)
- 2020
A confidence-based pseudo-labeling method is proposed that uses a novel uncertainty model to choose high-quality pseudo-labels; results show that the proposed method improves the performance of deep semi-supervised learning.
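The selection idea summarized above, keeping only pseudo-labels the model is confident about, can be illustrated with a plain confidence threshold. This is a generic sketch of the thresholding idea, not the paper's uncertainty model; `select_confident` and its threshold value are illustrative assumptions.

```python
import numpy as np

def select_confident(probs, threshold=0.95):
    """Return indices of unlabeled samples whose maximum predicted
    probability clears the threshold, together with their pseudo-labels.
    The fixed threshold is a stand-in for a learned uncertainty model."""
    confidence = probs.max(axis=1)
    keep = np.nonzero(confidence >= threshold)[0]
    return keep, probs[keep].argmax(axis=1)
```

Only the selected subset would then feed the pseudo-label loss term; low-confidence samples are simply skipped for that epoch.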
Learning by Association — A Versatile Semi-Supervised Training Method for Neural Networks
- Computer Science2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2017
This work proposes a new framework for semi-supervised training of deep neural networks inspired by learning in humans and demonstrates the capabilities of learning by association on several data sets and shows that it can improve performance on classification tasks tremendously by making use of additionally available unlabeled data.
Semi-supervised learning with connectivity-driven convolutional neural networks
- Computer SciencePattern Recognit. Lett.
- 2019
Semi-supervised Deep Learning for Fully Convolutional Networks
- Computer ScienceMICCAI
- 2017
This work lifts the concept of auxiliary manifold embedding for semi-supervised learning to FCNs with the help of Random Feature Embedding and leverages the proposed framework for the purpose of domain adaptation.
Deep Growing Learning
- Computer Science2017 IEEE International Conference on Computer Vision (ICCV)
- 2017
This work proposes a bio-inspired SSL framework on deep neural network, namely Deep Growing Learning (DGL), which is formulated as an EM-like process, where the deep network alternately iterates between automatically growing convolutional layers and selecting reliable pseudo-labeled data for training.
Two Semi-supervised Approaches to Malware Detection with Neural Networks
- Computer ScienceITAT
- 2020
This paper compares two semi-supervised algorithms for deep neural networks on a large real-world malware dataset, including a rather straightforward method called pseudo-labeling, which treats high-confidence predictions on unlabeled samples as if they were the actual labels.
Semi-supervised Learning with Contrastive Predicative Coding
- Computer ScienceArXiv
- 2019
This paper explores the recently developed contrastive predictive coding technique to improve the discriminative power of deep learning models when a large portion of labels is absent, and proposes two models: CPC-SSL and a class-conditional variant.
Semi-supervised Learning Using Siamese Networks
- Computer ScienceAustralasian Conference on Artificial Intelligence
- 2019
A new training method for semi-supervised learning is proposed, based on learning a similarity function with a Siamese network to obtain a suitable embedding, together with an empirical study of the resulting iterative self-training algorithm.
References
Showing 1-10 of 17 references
Semi-supervised learning of compact document representations with deep networks
- Computer ScienceICML '08
- 2008
An algorithm to learn text document representations based on semi-supervised autoencoders that are stacked to form a deep network; it can be trained efficiently on partially labeled corpora, producing very compact representations of documents while retaining as much class information and joint word statistics as possible.
Extracting and composing robust features with denoising autoencoders
- Computer ScienceICML '08
- 2008
This work introduces and motivates a new training principle for unsupervised learning of a representation, based on the idea of making the learned representations robust to partial corruption of the input pattern.
Semi-Supervised Classification by Low Density Separation
- Computer ScienceAISTATS
- 2005
Semi-supervised algorithms are proposed, including deriving graph-based distances that emphasize low-density regions between clusters followed by training a standard SVM, and optimizing the Transductive SVM objective function by gradient descent.
Deep learning via semi-supervised embedding
- Computer ScienceICML '08
- 2008
We show how nonlinear embedding algorithms popular for use with shallow semi-supervised learning techniques such as kernel methods can be applied to deep multilayer architectures, either as a…
Classification using discriminative restricted Boltzmann machines
- Computer ScienceICML '08
- 2008
This paper presents an evaluation of different learning algorithms for RBMs that aim at introducing a discriminative component to RBM training and improving their performance as classifiers, and demonstrates how discriminative RBMs can also be successfully employed in a semi-supervised setting.
Representation Learning: A Review and New Perspectives
- Computer ScienceIEEE Transactions on Pattern Analysis and Machine Intelligence
- 2013
Recent work in the area of unsupervised feature learning and deep learning is reviewed, covering advances in probabilistic models, autoencoders, manifold learning, and deep networks.
Improving neural networks by preventing co-adaptation of feature detectors
- Computer ScienceArXiv
- 2012
When a large feedforward neural network is trained on a small training set, it typically performs poorly on held-out test data. This "overfitting" is greatly reduced by randomly omitting half of the…
Why Does Unsupervised Pre-training Help Deep Learning?
- Computer ScienceAISTATS
- 2010
The results suggest that unsupervised pre-training guides the learning towards basins of attraction of minima that support better generalization from the training data set; the evidence from these results supports a regularization explanation for the effect of pre-training.
Reducing the Dimensionality of Data with Neural Networks
- Computer ScienceScience
- 2006
This work describes an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool to reduce the dimensionality of data.
The Manifold Tangent Classifier
- Computer ScienceNIPS
- 2011
A representation learning algorithm can be stacked to yield a deep architecture and it is shown how it builds a topological atlas of charts, each chart being characterized by the principal singular vectors of the Jacobian of a representation mapping.