
- Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton
- NIPS
- 2012

We trained a large, deep convolutional neural network to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes. On the test data, we…
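The core building block of such a network is a convolutional feature map followed by a non-linearity. A minimal NumPy sketch of that one operation (the kernel, image, and sizes here are invented for illustration and have nothing to do with the paper's actual architecture):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D convolution (really cross-correlation, as in most CNN libraries)."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

# One convolutional feature map followed by a ReLU non-linearity.
img = np.arange(25, dtype=float).reshape(5, 5)
edge_kernel = np.array([[1.0, -1.0], [1.0, -1.0]])  # crude vertical-edge detector
features = relu(conv2d(img, edge_kernel))
print(features.shape)  # (4, 4)
```

A real network stacks many such maps, pools between them, and learns the kernels by gradient descent rather than fixing them by hand.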

- Geoffrey E. Hinton, Simon Osindero, Yee Whye Teh
- Neural Computation
- 2006

We show how to use complementary priors to eliminate the explaining-away effects that make inference difficult in densely connected belief nets that have many hidden layers. Using complementary…
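The deep belief nets in this paper are built from restricted Boltzmann machines, whose basic inference step is alternating Gibbs sampling between visible and hidden binary units. A minimal sketch of one such step (the layer sizes, weights, and names are invented, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_bernoulli(p):
    return (rng.random(p.shape) < p).astype(float)

# A tiny binary RBM: 6 visible units, 4 hidden units (sizes are arbitrary).
W = rng.normal(scale=0.1, size=(6, 4))  # visible-to-hidden weights
b_v = np.zeros(6)                       # visible biases
b_h = np.zeros(4)                       # hidden biases

v0 = sample_bernoulli(np.full(6, 0.5))  # a random visible configuration

# One step of alternating Gibbs sampling: v -> h -> v'.
p_h = sigmoid(v0 @ W + b_h)
h0 = sample_bernoulli(p_h)
p_v = sigmoid(h0 @ W.T + b_v)
v1 = sample_bernoulli(p_v)
print(v1.shape)  # (6,)
```

Stacking trained RBMs layer by layer, with each layer's hidden samples serving as the next layer's data, is the greedy procedure the paper builds on.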

- Nitish Srivastava, Geoffrey E. Hinton, Alex Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov
- Journal of Machine Learning Research
- 2014

Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious problem in such networks. Large networks are also slow to use, making…
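The mechanism is simple to sketch: during training, each unit is dropped with some probability. The snippet below uses the "inverted" formulation, which rescales surviving activations at training time rather than rescaling weights at test time as the paper does; the function name and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p_drop=0.5, training=True):
    """Inverted dropout: zero each unit with probability p_drop, rescale survivors."""
    if not training:
        return activations  # at test time the full network is used unchanged
    mask = (rng.random(activations.shape) >= p_drop)
    return activations * mask / (1.0 - p_drop)

h = np.ones((2, 8))                        # a batch of hidden activations
h_train = dropout(h, 0.5)                  # roughly half the units are zeroed
h_test = dropout(h, 0.5, training=False)   # identity at test time
print(h_test)
```

Because survivors are scaled up by 1/(1 - p_drop), the expected activation is the same at training and test time.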

- Geoffrey E. Hinton, Ruslan Salakhutdinov
- Science
- 2006

High-dimensional data can be converted to low-dimensional codes by training a multilayer neural network with a small central layer to reconstruct high-dimensional input vectors. Gradient descent can…
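The idea of squeezing data through a narrow central layer and training by gradient descent on reconstruction error can be sketched with a toy linear autoencoder (the paper's networks are deep and non-linear, and are pretrained layer by layer; everything below — sizes, learning rate, step count — is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear autoencoder: 8-D inputs forced through a 2-D central layer.
X = rng.normal(size=(100, 8))
W_enc = rng.normal(scale=0.1, size=(8, 2))
W_dec = rng.normal(scale=0.1, size=(2, 8))

def recon_error(W_enc, W_dec):
    return ((X @ W_enc @ W_dec - X) ** 2).mean()

err0 = recon_error(W_enc, W_dec)  # error before training

lr = 0.01
for _ in range(500):
    codes = X @ W_enc             # low-dimensional codes
    err = codes @ W_dec - X       # reconstruction residual
    # Gradient descent on the squared reconstruction error.
    W_dec -= lr * codes.T @ err / len(X)
    W_enc -= lr * X.T @ (err @ W_dec.T) / len(X)

err1 = recon_error(W_enc, W_dec)
print(err1 < err0)
```

In the linear case this converges toward the best rank-2 reconstruction (essentially PCA); the paper shows that deep non-linear codes, given good initial weights, do much better.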

- David E. Rumelhart, Geoffrey E. Hinton, Ronald J. Williams
- Nature
- 1986

We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure…
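The repeated weight-adjustment loop can be sketched on a tiny network: run a forward pass, propagate error derivatives backward through each layer, and move every weight downhill. All choices below (XOR task, layer sizes, learning rate, iteration count) are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# A tiny 2-4-1 sigmoid network trained on XOR.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

h = sigmoid(X @ W1 + b1)
out = sigmoid(h @ W2 + b2)
mse0 = ((out - y) ** 2).mean()  # error before training

lr = 0.5
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate error derivatives layer by layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

mse = ((out - y) ** 2).mean()
print(mse < mse0)
```

The key point of the paper is that the hidden units' derivatives (`d_h` above) are obtained from the layer above them, so internal "feature detectors" can be trained without being told what to represent.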

- Geoffrey E. Hinton
- Neural Computation
- 2002

It is possible to combine multiple latent-variable models of the same data by multiplying their probability distributions together and then renormalizing. This way of combining individual expert…
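The "multiply and renormalize" operation itself is easy to see on a discrete example (the two expert distributions below are made up):

```python
import numpy as np

# Two "expert" distributions over the same five discrete outcomes.
p1 = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
p2 = np.array([0.05, 0.05, 0.1, 0.4, 0.4])

# Product of experts: multiply pointwise, then renormalize.
product = p1 * p2
product /= product.sum()
print(np.round(product, 3))
```

Note how the product is sharper than either expert: any outcome that one expert considers unlikely is effectively vetoed, unlike in a mixture, where experts only add probability mass.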

- Geoffrey E. Hinton, Nitish Srivastava, Alex Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov
- ArXiv
- 2012

When a large feedforward neural network is trained on a small training set, it typically performs poorly on held-out test data. This “overfitting” is greatly reduced by randomly omitting half of the…

- Laurens van der Maaten, Geoffrey E. Hinton
- Journal of Machine Learning Research
- 2008

We present a new technique called “t-SNE” that visualizes high-dimensional data by giving each datapoint a location in a two or three-dimensional map. The technique is a variation of Stochastic…
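A distinctive piece of t-SNE is how similarities are defined in the low-dimensional map: a Student-t kernel with one degree of freedom, normalized over all pairs, which gives the map heavy tails. A minimal sketch of just that computation (the map points here are random stand-ins, not the output of the actual optimization):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in low-dimensional map points (t-SNE would place these by gradient descent).
Y = rng.normal(size=(5, 2))

# Pairwise similarities in the map: a Student-t kernel with one degree
# of freedom, normalized over all distinct pairs.
diff = Y[:, None, :] - Y[None, :, :]
sq_dist = (diff ** 2).sum(-1)
q = 1.0 / (1.0 + sq_dist)
np.fill_diagonal(q, 0.0)   # a point is not its own neighbour
Q = q / q.sum()
print(round(Q.sum(), 6))   # 1.0
```

The heavy tails let moderately dissimilar points sit far apart in the map, which is what relieves the crowding problem of earlier SNE variants.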

- Vinod Nair, Geoffrey E. Hinton
- ICML
- 2010

Restricted Boltzmann machines were developed using binary stochastic hidden units. These can be generalized by replacing each binary unit by an infinite number of copies that all have the same…
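The expected total activity of that infinite stack of tied binary units is closely approximated by the softplus function, which in turn is approximated by the rectified linear unit — a numerical sketch of that chain of approximations (the grid of inputs is arbitrary):

```python
import numpy as np

def softplus(x):
    # Smooth approximation to the infinite sum of tied binary units.
    return np.log1p(np.exp(x))

def relu(x):
    # The cheaper rectified-linear approximation to softplus.
    return np.maximum(x, 0.0)

x = np.linspace(-5, 5, 11)
gap = np.max(np.abs(relu(x) - softplus(x)))
print(gap)
```

The two curves differ most near zero (by log 2, about 0.69) and agree closely elsewhere, which is why the rectified linear unit works as a drop-in replacement.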

- Geoffrey E. Hinton
- Neural Networks: Tricks of the Trade
- 2012