Representation Learning: A Review and New Perspectives

@article{Bengio2013RepresentationLA,
  title={Representation Learning: A Review and New Perspectives},
  author={Yoshua Bengio and Aaron C. Courville and Pascal Vincent},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2013},
  volume={35},
  pages={1798--1828}
}
The success of machine learning algorithms generally depends on data representation, and we hypothesize that this is because different representations can entangle and hide more or less the different explanatory factors of variation behind the data. Although specific domain knowledge can be used to help design representations, learning with generic priors can also be used, and the quest for AI is motivating the design of more powerful representation-learning algorithms implementing such priors…

Citations

A Modular Theory of Feature Learning
An Overview on Data Representation Learning: From Traditional Feature Learning to Recent Deep Learning
The Role of the Information Bottleneck in Representation Learning
Sparse, hierarchical and shared-factors priors for representation learning
Autonomous Learning of Representations
Relation-Guided Representation Learning
Representational learning for sonar ATR
  • J. Isaacs
  • Computer Science, Engineering
  • Defense + Security Symposium
  • 2014

References

Showing 1–10 of 286 references
Unsupervised and Transfer Learning Challenge: a Deep Learning Approach
Deep Learning of Representations for Unsupervised and Transfer Learning
  • Yoshua Bengio
  • Computer Science
  • ICML Unsupervised and Transfer Learning
  • 2012
Sparse Feature Learning for Deep Belief Networks
The Manifold Tangent Classifier
Extracting and composing robust features with denoising autoencoders
Large-Scale Learning of Embeddings with Reconstruction Sampling
Why Does Unsupervised Pre-training Help Deep Learning?
On deep generative models with applications to recognition