
Dropout (neural networks)

Known as: Dropout, Dropout training

Dropout is a regularization technique for reducing overfitting in neural networks by preventing complex co-adaptations on training data. It is a very…
(description from Wikipedia)
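The description above can be made concrete with a minimal sketch of (inverted) dropout on a batch of activations. The function name and the numpy setup here are illustrative, not taken from any of the papers listed below:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training
    and scale the survivors by 1/(1-p), so the expected activation is
    unchanged. At inference time the input passes through untouched."""
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p        # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# Example: a batch of activations; surviving entries are scaled to 1/(1-p)
x = np.ones((4, 8))
y = dropout(x, p=0.5, rng=np.random.default_rng(0))
```

Because the random mask differs on every forward pass, no unit can rely on any particular other unit being present, which is the "preventing co-adaptation" effect the description refers to.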

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited · 2016
Recurrent neural networks (RNNs) stand at the forefront of many recent developments in deep learning. Yet a major difficulty with…
(figures 1–4; table 1)
Highly Cited · 2016
Deep learning tools have gained tremendous attention in applied machine learning. However such tools for regression and…
(figures 1–4; table 1)
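Some of the entries in this overview approach dropout from a Bayesian angle. One widely used technique in that vein is Monte Carlo dropout: keep the dropout masks stochastic at prediction time and average several forward passes, using the spread across passes as a rough uncertainty estimate. A minimal sketch, with a hypothetical toy model (nothing here is claimed to match any specific paper's procedure):

```python
import numpy as np

def mc_dropout_predict(forward, x, n_samples=100, rng=None):
    """Monte Carlo dropout: run a dropout-using model `forward(x, rng)`
    many times with fresh masks; return the mean prediction and the
    per-output standard deviation as a rough uncertainty signal."""
    rng = np.random.default_rng() if rng is None else rng
    preds = np.stack([forward(x, rng) for _ in range(n_samples)])
    return preds.mean(axis=0), preds.std(axis=0)

# Toy model (illustrative): one linear layer with inverted dropout on its input.
W = np.array([[1.0], [2.0], [3.0]])

def toy_forward(x, rng, p=0.5):
    mask = rng.random(x.shape) >= p
    return (x * mask / (1.0 - p)) @ W

mean, std = mc_dropout_predict(toy_forward, np.ones((1, 3)),
                               n_samples=200, rng=np.random.default_rng(0))
```

The mean converges toward the deterministic output as `n_samples` grows, while `std` reflects how sensitive the prediction is to which units were dropped.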
Highly Cited · 2015
We introduce a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the…
(figures 1–3; tables 1–2)
Highly Cited · 2015
Deep Neural Networks (DNN) have achieved state-of-the-art results in a wide range of tasks, with the best results obtained with…
(figures 1–3; tables 1–2)
Highly Cited · 2014
Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious…
(figures 1–3; tables 1–2)
Highly Cited · 2014
We present a simple regularization technique for Recurrent Neural Networks (RNNs) with Long Short-Term Memory (LSTM) units…
(figures 1–3; tables 1–2)
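A common recipe for dropout in stacked LSTMs, in the spirit of the entry above, is to apply dropout only to the non-recurrent (layer-to-layer) connections, so the recurrent state is not corrupted at every timestep. A sketch with a plain tanh cell standing in for LSTM to stay short; all names here are illustrative, not the paper's code:

```python
import numpy as np

def step_stacked_rnn(x_t, hs, Ws, Us, p=0.5, rng=None, training=True):
    """One timestep of a stack of simple tanh RNN layers. Dropout is
    applied only to the inputs flowing *between* layers (and to x_t),
    never to the recurrent h -> h path."""
    rng = np.random.default_rng() if rng is None else rng
    inp = x_t
    new_hs = []
    for h, W, U in zip(hs, Ws, Us):
        if training and p > 0.0:
            mask = rng.random(inp.shape) >= p
            inp = inp * mask / (1.0 - p)   # dropout on the vertical path only
        h_new = np.tanh(inp @ W + h @ U)   # recurrent path left intact
        new_hs.append(h_new)
        inp = h_new                        # output feeds the layer above
    return new_hs
```

Because the recurrent connection is untouched, information carried in the hidden state survives across long sequences even with aggressive dropout between layers.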
Highly Cited · 2013
Recently, pre-trained deep neural networks (DNNs) have outperformed traditional acoustic models based on Gaussian mixture models…
(tables 1–2)
Highly Cited · 2013
We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within…
(figures 1–2; tables 1–3)
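The DropConnect idea above can be sketched in a few lines: where dropout zeroes activations, DropConnect zeroes individual weights of a fully-connected layer. This is an illustrative training-time sketch only, with the same 1/(1-p) rescaling convention as inverted dropout; it does not reproduce the paper's inference-time approximation:

```python
import numpy as np

def dropconnect_linear(x, W, b, p=0.5, training=True, rng=None):
    """DropConnect for a fully-connected layer: during training, zero each
    *weight* independently with probability p and rescale the survivors so
    the expected pre-activation is unchanged. At inference, use W as-is."""
    if training and p > 0.0:
        rng = np.random.default_rng() if rng is None else rng
        mask = rng.random(W.shape) >= p
        W = W * mask / (1.0 - p)
    return x @ W + b
```

Masking weights rather than activations gives a larger space of thinned sub-networks, since each connection, not each unit, is dropped independently.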
Highly Cited · 2012
We trained a large, deep convolutional neural network to classify the 1.2 million high-resolution images in the ImageNet LSVRC…
(figures 1–5)
Highly Cited · 2012
When a large feedforward neural network is trained on a small training set, it typically performs poorly on held-out test data…
(figures 1–5)