Corpus ID: 8102341

Regularization With Stochastic Transformations and Perturbations for Deep Semi-Supervised Learning

@inproceedings{Sajjadi2016RegularizationWS,
  title={Regularization With Stochastic Transformations and Perturbations for Deep Semi-Supervised Learning},
  author={Mehdi S. M. Sajjadi and Mehran Javanmardi and Tolga Tasdizen},
  booktitle={NIPS},
  year={2016}
}
Effective convolutional neural networks are trained on large sets of labeled data. However, creating large labeled datasets is a very costly and time-consuming task. Semi-supervised learning uses unlabeled data to train a model with higher accuracy when there is a limited set of labeled data available. In this paper, we consider the problem of semi-supervised learning with convolutional neural networks. Techniques such as randomized data augmentation, dropout and random max-pooling provide… 
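The regularizer suggested by the title is straightforward to prototype. Below is a minimal PyTorch sketch of a transformation/perturbation stability loss, assuming a hypothetical `model` whose stochastic layers (dropout, randomized pooling) remain active at loss time and a user-supplied `augment` pipeline; names, defaults, and the pairwise form are illustrative rather than the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def transform_stability_loss(model, x_unlabeled, augment, n_passes=4):
    """Unsupervised stability loss (sketch): pass each unlabeled batch
    through the network several times under random augmentation, with
    dropout and any randomized pooling left active, and penalize
    disagreement between all pairs of resulting prediction vectors."""
    preds = [F.softmax(model(augment(x_unlabeled)), dim=1)
             for _ in range(n_passes)]
    loss = 0.0
    for i in range(n_passes):
        for j in range(i + 1, n_passes):
            loss = loss + ((preds[i] - preds[j]) ** 2).sum(dim=1).mean()
    return loss / (n_passes * (n_passes - 1) / 2)
```

In training, a term like this would be added to the usual supervised cross-entropy on the labeled examples.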

Citations

Label Propagation for Deep Semi-Supervised Learning

This work employs a transductive label propagation method based on the manifold assumption to make predictions on the entire dataset, uses these predictions to generate pseudo-labels for the unlabeled data, and trains a deep neural network on them.
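As a rough illustration of how graph-based propagation can produce pseudo-labels, here is a toy NumPy version of the classic diffusion update of Zhou et al. (2004); the affinity matrix `W`, the choice of normalization, and all hyperparameters are stand-ins rather than this paper's exact procedure.

```python
import numpy as np

def propagate_labels(W, y_labeled, labeled_idx, n_classes, alpha=0.99, n_iter=50):
    """Toy label propagation: iterate Y <- alpha * S @ Y + (1 - alpha) * Y0
    on a normalized affinity graph, then read off pseudo-labels."""
    n = W.shape[0]
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d))           # symmetric normalization D^-1/2 W D^-1/2
    Y0 = np.zeros((n, n_classes))
    Y0[labeled_idx, y_labeled] = 1.0          # one-hot seeds on the labeled points
    Y = Y0.copy()
    for _ in range(n_iter):
        Y = alpha * S @ Y + (1 - alpha) * Y0  # diffuse labels, anchored to the seeds
    return Y.argmax(axis=1)                   # pseudo-labels for every point
```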

Deep Semi-Supervised Learning

This paper proposes a deep semi-supervised learning (DSSL) self-training method that utilizes the strengths of both supervised and unsupervised learning within a single model and measures the efficacy of the proposed method on semi-supervised visual object classification tasks.

Temporal Ensembling for Semi-Supervised Learning

Self-ensembling is introduced, in which a consensus prediction is formed from the network's outputs across different training epochs; this ensemble prediction can be expected to be a better predictor for the unknown labels than the output of the network at the most recent training epoch, and can thus be used as a target for training.
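The mechanism is compact enough to sketch. The simplified NumPy snippet below maintains the exponential moving average of per-sample predictions and applies the startup bias correction used in the paper; `Z` and `z_epoch` are arrays of shape `(n_samples, n_classes)`.

```python
import numpy as np

def temporal_ensembling_step(Z, z_epoch, epoch, alpha=0.6):
    """One epoch of temporal-ensembling targets: keep an exponential moving
    average Z of per-sample predictions across epochs and use the
    bias-corrected average as the target for the consistency loss."""
    Z = alpha * Z + (1 - alpha) * z_epoch     # accumulate ensemble prediction
    targets = Z / (1 - alpha ** (epoch + 1))  # startup bias correction (0-indexed epoch)
    return Z, targets
```

The consistency loss is then, for example, the mean squared error between the network's current predictions and `targets`.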

Learning by Association — A Versatile Semi-Supervised Training Method for Neural Networks

This work proposes a new framework for semi-supervised training of deep neural networks inspired by learning in humans and demonstrates the capabilities of learning by association on several data sets and shows that it can improve performance on classification tasks tremendously by making use of additionally available unlabeled data.
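A hedged PyTorch sketch of the walker and visit losses at the heart of this approach follows; the exact weighting and normalization in the paper may differ.

```python
import torch
import torch.nn.functional as F

def association_loss(emb_labeled, labels, emb_unlabeled, visit_weight=0.5):
    """Walker + visit losses (sketch): a round trip labeled -> unlabeled ->
    labeled should land on a point with the same class it started from."""
    M = emb_labeled @ emb_unlabeled.t()             # pairwise similarities
    p_ab = F.softmax(M, dim=1)                      # labeled -> unlabeled transition
    p_ba = F.softmax(M.t(), dim=1)                  # unlabeled -> labeled transition
    p_aba = p_ab @ p_ba                             # round-trip probabilities

    same = (labels[:, None] == labels[None, :]).float()
    target = same / same.sum(dim=1, keepdim=True)   # uniform over same-class endpoints
    walker = -(target * torch.log(p_aba + 1e-8)).sum(dim=1).mean()

    visit = -torch.log(p_ab.mean(dim=0) + 1e-8).mean()  # encourage visiting all unlabeled points
    return walker + visit_weight * visit
```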

Unsupervised Data Augmentation for Consistency Training

A new perspective on how to effectively noise unlabeled examples is presented and it is argued that the quality of noising, specifically those produced by advanced data augmentation methods, plays a crucial role in semi-supervised learning.
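A minimal PyTorch sketch of this style of consistency training follows; `strong_augment` stands in for an advanced augmentation policy (the paper uses methods such as RandAugment), and the stop-gradient on the clean prediction is one common choice rather than the only one.

```python
import torch
import torch.nn.functional as F

def uda_consistency_loss(model, x_unlabeled, strong_augment):
    """Consistency loss in the spirit of UDA (sketch): predictions on a
    strongly augmented view should match the fixed predictions on the
    original unlabeled example, measured with KL divergence."""
    with torch.no_grad():
        p_orig = F.softmax(model(x_unlabeled), dim=1)   # target, no gradient
    log_p_aug = F.log_softmax(model(strong_augment(x_unlabeled)), dim=1)
    return F.kl_div(log_p_aug, p_orig, reduction="batchmean")
```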

Tri-net for Semi-Supervised Deep Learning

This paper proposes tri-net, a deep neural network which is able to use massive unlabeled data to help learning with limited labeled data, and considers model initialization, diversity augmentation, and pseudo-label editing simultaneously.
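The agreement-based labeling rule can be sketched briefly; in this hypothetical NumPy version, `p1`, `p2`, `p3` are the three heads' predicted class probabilities, while tri-net's actual pseudo-label editing is more involved.

```python
import numpy as np

def trinet_pseudo_labels(p1, p2, p3, threshold=0.9):
    """Tri-net style labeling rule (sketch): when two of the three heads
    agree on an unlabeled example with high confidence, their shared
    prediction becomes a pseudo-label for training the third head."""
    y1, y2 = p1.argmax(1), p2.argmax(1)
    conf12 = np.minimum(p1.max(1), p2.max(1))
    mask = (y1 == y2) & (conf12 > threshold)  # heads 1 and 2 agree confidently
    return y1[mask], mask                     # pseudo-labels (and indices) for head 3
```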

Semi-Supervised Learning with Self-Supervised Networks

This work presents a conceptually simple yet effective semi-supervised algorithm based on self-supervised learning to combine semantic feature representations from unlabeled data, and demonstrates results competitive with, and in some cases exceeding, prior state-of-the-art results.

Unsupervised Data Augmentation

UDA has a small twist in that it makes use of harder and more realistic noise generated by state-of-the-art data augmentation methods, which leads to substantial improvements on six language tasks and three vision tasks even when the labeled set is extremely small.

Semi-supervised Learning Using Siamese Networks

A new training method for semi-supervised learning is presented, based on learning a similarity function with a Siamese network to obtain a suitable embedding, and an empirical study of this iterative self-training algorithm is performed.
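For concreteness, here is the classic contrastive loss commonly used to train such Siamese embeddings, in PyTorch; whether this paper uses exactly this form is not stated in the summary, so treat it as a representative sketch.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(z1, z2, same_class, margin=1.0):
    """Contrastive loss for a Siamese embedding (sketch): pull same-class
    pairs together, push different-class pairs at least `margin` apart.
    `same_class` is a float tensor of 0/1 pair labels."""
    d = F.pairwise_distance(z1, z2)
    pos = same_class * d.pow(2)
    neg = (1 - same_class) * F.relu(margin - d).pow(2)
    return (pos + neg).mean()
```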

Pseudo-Labeling Using Gaussian Process for Semi-Supervised Deep Learning

This work proposes a simple and novel method of utilizing unlabeled data for semi-supervised learning to improve the performance of the deep learning model, generalizes the proposed pseudo-labeling method to any classifier with good performance, and gives some advice for pseudo-labeling method selection.
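A generic round of confidence-thresholded self-training, which this line of work builds on, can be sketched as follows; `clf` is any estimator exposing `fit`/`predict_proba` (for example, scikit-learn's `GaussianProcessClassifier`), and the threshold is illustrative.

```python
import numpy as np

def self_train_step(clf, X_labeled, y_labeled, X_unlabeled, threshold=0.95):
    """One round of confidence-thresholded pseudo-labeling (sketch): fit on
    the labeled set, pseudo-label unlabeled points the classifier is sure
    about, and fold them into the training set for the next round."""
    clf.fit(X_labeled, y_labeled)
    proba = clf.predict_proba(X_unlabeled)
    confident = proba.max(axis=1) > threshold
    X_new = np.vstack([X_labeled, X_unlabeled[confident]])
    y_new = np.concatenate([y_labeled, proba[confident].argmax(axis=1)])
    return X_new, y_new, ~confident           # remaining unlabeled mask
```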
...

References

Showing 1-10 of 42 references

Mutual exclusivity loss for semi-supervised deep learning

An unsupervised regularization term is proposed that explicitly forces the classifier's predictions for multiple classes to be mutually exclusive and effectively guides the decision boundary to lie in the low-density space between the manifolds corresponding to different classes of data.
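A plausible PyTorch rendering of such a mutual-exclusivity term is below; the product form is one standard way to write it, and the paper's exact normalization may differ.

```python
import torch

def mutual_exclusivity_loss(logits):
    """Mutual-exclusivity regularizer (sketch): with sigmoid outputs f_k,
    minimize -sum_k f_k * prod_{j != k} (1 - f_j), which is smallest when
    exactly one class output is 1 and all others are 0."""
    f = torch.sigmoid(logits)                  # (batch, n_classes)
    loss = 0.0
    for k in range(f.shape[1]):
        others = torch.cat([f[:, :k], f[:, k + 1:]], dim=1)
        loss = loss - (f[:, k] * (1 - others).prod(dim=1)).mean()
    return loss
```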

Discriminative Unsupervised Feature Learning with Convolutional Neural Networks

This paper presents an approach for training a convolutional neural network using only unlabeled data and trains the network to discriminate between a set of surrogate classes, finding that this simple feature learning algorithm is surprisingly successful when applied to visual object recognition.
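The surrogate-class construction is simple to sketch; in this hypothetical snippet, `patches` is a list of seed image tensors and `augment` is any random transformation pipeline.

```python
import torch

def make_surrogate_dataset(patches, augment, n_copies=8):
    """Surrogate-class construction (sketch): each unlabeled patch becomes
    its own class, represented by several randomly augmented copies; the
    network is then trained to classify which seed patch a crop came from."""
    xs, ys = [], []
    for cls, patch in enumerate(patches):      # one class per seed patch
        for _ in range(n_copies):
            xs.append(augment(patch))          # random transform of the seed
            ys.append(cls)
    return torch.stack(xs), torch.tensor(ys)
```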

Learning Classification with Unlabeled Data

This paper shows that minimizing the disagreement between the outputs of networks processing patterns from these different modalities is a sensible approximation to minimizing the number of misclassifications in each modality, and leads to similar results.

Semi-supervised Learning with Ladder Networks

This work builds on top of the Ladder network proposed by Valpola which is extended by combining the model with supervision and shows that the resulting model reaches state-of-the-art performance in semi-supervised MNIST and CIFAR-10 classification in addition to permutation-invariant MNIST classification with all labels.

Learning Multiple Layers of Features from Tiny Images

It is shown how to train a multi-layer generative model that learns to extract meaningful features which resemble those found in the human visual cortex, using a novel parallelization algorithm to distribute the work among multiple machines connected on a network.

Unsupervised Visual Representation Learning by Context Prediction

It is demonstrated that the feature representation learned using this within-image context indeed captures visual similarity across images and allows us to perform unsupervised visual discovery of objects like cats, people, and even birds from the Pascal VOC 2011 detection dataset.

Semi-Supervised Learning

This first comprehensive overview of semi-supervised learning presents state-of-the-art algorithms, a taxonomy of the field, selected applications, benchmark experiments, and perspectives on ongoing and future research.

Deep learning via semi-supervised embedding

We show how nonlinear embedding algorithms popular for use with shallow semi-supervised learning techniques such as kernel methods can be applied to deep multilayer architectures, either as a regularizer at the output layer or on each layer of the architecture.
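A minimal PyTorch sketch of such an embedding penalty on hidden activations follows; the margin form is modeled on the paper's neighbor/non-neighbor loss, with names and defaults chosen for illustration.

```python
import torch
import torch.nn.functional as F

def embedding_regularizer(h_i, h_j, is_neighbor, margin=1.0):
    """Semi-supervised embedding penalty (sketch): applied to hidden
    activations of any layer, it pulls embeddings of neighboring inputs
    together and pushes non-neighbors apart by a margin.
    `is_neighbor` is a float tensor of 0/1 pair labels."""
    d2 = (h_i - h_j).pow(2).sum(dim=1)         # squared embedding distance
    pull = is_neighbor * d2
    push = (1 - is_neighbor) * F.relu(margin - d2)
    return (pull + push).mean()
```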

Spatially-sparse convolutional neural networks

A CNN for processing spatially-sparse inputs is presented, motivated by the problem of online handwriting recognition; applying a deep convolutional network using sparsity results in a substantial reduction in test error on the CIFAR small-picture datasets.

ImageNet classification with deep convolutional neural networks

A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.