ImageNet classification with deep convolutional neural networks
- A. Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton
- Computer Science, Communications of the ACM
- 3 December 2012
A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images of the ImageNet LSVRC-2010 contest into 1000 different classes; training employed a recently developed regularization method called "dropout" that proved very effective.
Dropout: a simple way to prevent neural networks from overfitting
- Nitish Srivastava, Geoffrey E. Hinton, A. Krizhevsky, Ilya Sutskever, R. Salakhutdinov
- Computer Science, Journal of Machine Learning Research
- 2014
It is shown that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
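To make the mechanism concrete, here is a minimal numpy sketch of dropout in the now-common "inverted" form, which rescales the surviving units at training time (the paper instead scales the weights down at test time); the function name and defaults are illustrative, not from the paper.

```python
import numpy as np

def dropout(activations, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p_drop during
    training and rescale the survivors so the expected activation is
    unchanged, so no rescaling is needed at test time."""
    if not training or p_drop == 0.0:
        return activations
    rng = rng or np.random.default_rng()
    keep = 1.0 - p_drop
    mask = rng.random(activations.shape) < keep
    return activations * mask / keep

h = np.random.default_rng(0).normal(size=(4, 8))  # a batch of hidden activations
print(dropout(h, p_drop=0.5))
```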
A Simple Framework for Contrastive Learning of Visual Representations
- Ting Chen, Simon Kornblith, Mohammad Norouzi, Geoffrey E. Hinton
- Computer Science, International Conference on Machine Learning
- 13 February 2020
It is shown that the composition of data augmentations plays a critical role in defining effective predictive tasks, that introducing a learnable nonlinear transformation between the representation and the contrastive loss substantially improves the quality of the learned representations, and that contrastive learning benefits from larger batch sizes and more training steps than supervised learning.
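A rough numpy sketch of the normalized temperature-scaled cross-entropy (NT-Xent) loss at the core of the framework, assuming the 2N embeddings arrive with the two augmented views of example k in rows 2k and 2k+1; the encoder and the learnable projection head are omitted, and the function name is illustrative.

```python
import numpy as np

def nt_xent_loss(z, temperature=0.5):
    """Contrastive loss over 2N embeddings, where rows 2k and 2k+1
    are the two augmented views of example k."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize
    sim = z @ z.T / temperature                        # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                     # a view is not its own negative
    n = len(z)
    positives = np.arange(n) ^ 1                       # partner index: 0<->1, 2<->3, ...
    m = sim.max(axis=1, keepdims=True)                 # stabilized row log-softmax
    log_prob = sim - (m + np.log(np.exp(sim - m).sum(axis=1, keepdims=True)))
    return -log_prob[np.arange(n), positives].mean()
```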
Distilling the Knowledge in a Neural Network
- Geoffrey E. Hinton, Oriol Vinyals, J. Dean
- Computer Science, arXiv.org
- 9 March 2015
This work shows that it can significantly improve the acoustic model of a heavily used commercial system by distilling the knowledge in an ensemble of models into a single model and introduces a new type of ensemble composed of one or more full models and many specialist models which learn to distinguish fine-grained classes that the full models confuse.
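A hedged sketch of the distillation objective: cross-entropy against the teacher's temperature-softened distribution, blended with ordinary cross-entropy on the true labels. The T² rescaling of the soft term follows the paper's suggestion; function names and the blending weight alpha are illustrative.

```python
import numpy as np

def softmax(logits, T=1.0):
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # stabilize before exponentiating
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft targets from the temperature-softened teacher, plus hard targets
    from the true labels. The T**2 factor keeps the soft-target gradients on
    the same scale as the hard-target gradients."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    soft = -(p_teacher * log_p_student).sum(axis=-1).mean() * T**2
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels]).mean()
    return alpha * soft + (1 - alpha) * hard
```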
A Fast Learning Algorithm for Deep Belief Nets
- Geoffrey E. Hinton, Simon Osindero, Y. Teh
- Computer Science, Neural Computation
- 1 July 2006
A fast, greedy algorithm is derived that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory.
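The inner loop of that greedy procedure trains each pair of adjacent layers as a restricted Boltzmann machine; below is a rough numpy sketch of one contrastive-divergence (CD-1) weight update for a binary RBM, under the usual sigmoid parameterization. Stacking such updates layer by layer, feeding each layer's hidden activities upward as data for the next, gives the pretraining phase of the fast algorithm.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b_vis, b_hid, lr=0.1, rng=None):
    """One CD-1 step for a binary RBM with visible batch v0 of shape (B, nv)
    and weights W of shape (nv, nh)."""
    rng = rng or np.random.default_rng()
    ph0 = sigmoid(v0 @ W + b_hid)                     # hidden probs given data
    h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden states
    pv1 = sigmoid(h0 @ W.T + b_vis)                   # one-step reconstruction
    ph1 = sigmoid(pv1 @ W + b_hid)
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)    # positive minus negative statistics
    b_vis += lr * (v0 - pv1).mean(axis=0)
    b_hid += lr * (ph0 - ph1).mean(axis=0)
    return W, b_vis, b_hid
```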
Rectified Linear Units Improve Restricted Boltzmann Machines
- Vinod Nair, Geoffrey E. Hinton
- Computer Science, International Conference on Machine Learning
- 21 June 2010
Replacing the binary stochastic hidden units of restricted Boltzmann machines with noisy rectified linear units learns features that are better for object recognition on the NORB dataset and face verification on the Labeled Faces in the Wild dataset.
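For contrast, a small sketch of the two hidden-unit types involved, assuming the paper's noisy rectified linear unit max(0, x + N(0, sigmoid(x))) as the replacement for a binary stochastic unit; the function names are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_binary(pre_activation, rng):
    # Classic binary stochastic hidden unit: fire with probability sigmoid(x).
    return (rng.random(pre_activation.shape) < sigmoid(pre_activation)).astype(float)

def sample_nrelu(pre_activation, rng):
    # Noisy rectified linear unit: max(0, x + N(0, sigmoid(x))), which
    # approximates an infinite set of tied binary units with shifted biases.
    noise = rng.normal(0.0, np.sqrt(sigmoid(pre_activation)))
    return np.maximum(0.0, pre_activation + noise)
```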
Reducing the Dimensionality of Data with Neural Networks
- Geoffrey E. Hinton, R. Salakhutdinov
- Computer Science, Science
- 28 July 2006
This work describes an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool to reduce the dimensionality of data.
Visualizing Data using t-SNE
- L. van der Maaten, Geoffrey E. Hinton
- Computer Science, Journal of Machine Learning Research
- 2008
A new technique called t-SNE visualizes high-dimensional data by giving each datapoint a location in a two- or three-dimensional map; it is a variation of Stochastic Neighbor Embedding that is much easier to optimize and produces significantly better visualizations by reducing the tendency to crowd points together in the center of the map.
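A short usage sketch, assuming scikit-learn's implementation of the technique; the data here are random placeholders.

```python
import numpy as np
from sklearn.manifold import TSNE  # assumes scikit-learn is installed

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))     # 500 points in a 50-D feature space

# Map to 2-D; perplexity balances local vs. global neighborhood structure.
embedding = TSNE(n_components=2, perplexity=30.0, init="pca").fit_transform(X)
print(embedding.shape)             # (500, 2)
```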
Learning internal representations by error propagation
- D. Rumelhart, Geoffrey E. Hinton, Ronald J. Williams
- Biology
- 3 January 1986
Learning representations by back-propagating errors
- D. Rumelhart, Geoffrey E. Hinton, Ronald J. Williams
- Computer Science, Nature
- 1 October 1986
Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which leads the hidden units to represent important features of the task domain.
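A minimal numpy sketch of that procedure for a one-hidden-layer network trained on squared error, with logistic hidden units and biases omitted for brevity; the data are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))                  # input vectors
Y = rng.normal(size=(64, 2))                  # desired output vectors
W1 = rng.normal(size=(3, 8)) * 0.1            # small initial weights
W2 = rng.normal(size=(8, 2)) * 0.1

for step in range(1000):
    # Forward pass: logistic hidden layer, linear output layer.
    H = 1.0 / (1.0 + np.exp(-(X @ W1)))
    out = H @ W2
    err = out - Y                             # actual minus desired output
    # Backward pass: propagate the error derivative through each layer
    # and adjust the weights downhill on the squared-error measure.
    dW2 = H.T @ err / len(X)
    dH = err @ W2.T * H * (1 - H)             # chain rule through the logistic
    dW1 = X.T @ dH / len(X)
    W2 -= 0.1 * dW2
    W1 -= 0.1 * dW1

print(float((err ** 2).mean()))               # error measure after training
```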
...