Dropout (neural networks)

Known as: Dropout, Dropout training 
Dropout is a regularization technique for reducing overfitting in neural networks by preventing complex co-adaptations on training data. It is a very…
Source: Wikipedia
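As an illustration of the mechanism described above, here is a minimal NumPy sketch of the common "inverted dropout" formulation (the function name, the keep_prob parameter, and the scaling convention are choices made for this example, not taken from the source):

    import numpy as np

    def dropout(activations, keep_prob=0.5, train=True, rng=np.random.default_rng()):
        """Zero each unit with probability 1 - keep_prob during training.

        Scaling by 1 / keep_prob ("inverted dropout") keeps the expected
        activation unchanged, so nothing needs to be rescaled at test time.
        """
        if not train:
            return activations
        mask = rng.random(activations.shape) < keep_prob
        return activations * mask / keep_prob

    # Example: drop roughly half of a hidden layer's units during training.
    h = np.random.randn(32, 128)          # batch of 32, 128 hidden units
    h_train = dropout(h, keep_prob=0.5)   # stochastic at training time
    h_test = dropout(h, train=False)      # identity at test time

Because each unit can be removed at random, units cannot rely on the presence of particular other units, which is the "co-adaptation" the technique is meant to prevent.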

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited · 2017
We explore a recently proposed Variational Dropout technique that provided an elegant Bayesian interpretation to Gaussian Dropout…

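For context on the Gaussian Dropout mentioned in this abstract: instead of a binary keep/drop mask, each activation is multiplied by Gaussian noise with mean 1, which also leaves the expected activation unchanged. The sketch below assumes the common parameterization with noise variance alpha = p / (1 - p) for drop probability p; roughly speaking, Variational Dropout gives this noise a Bayesian treatment so that the noise level can be learned rather than fixed.

    import numpy as np

    def gaussian_dropout(activations, drop_prob=0.5, train=True,
                         rng=np.random.default_rng()):
        # Multiplicative Gaussian noise with mean 1; the variance is chosen so
        # the noise strength is comparable to binary dropout at rate drop_prob.
        if not train:
            return activations
        alpha = drop_prob / (1.0 - drop_prob)
        noise = rng.normal(loc=1.0, scale=np.sqrt(alpha), size=activations.shape)
        return activations * noise
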
Highly Cited · 2016
Recurrent neural networks (RNNs) stand at the forefront of many recent developments in deep learning. Yet a major difficulty with…

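The abstract is truncated before it names the difficulty or the proposed fix, so the sketch below is general context rather than this paper's method: one widely used way to apply dropout inside a recurrent network is to sample a single mask per sequence and reuse it at every timestep instead of resampling it each step. The rnn_step function is hypothetical.

    import numpy as np

    def rnn_forward_tied_dropout(rnn_step, x_seq, h0, keep_prob=0.8,
                                 rng=np.random.default_rng()):
        """x_seq: (timesteps, batch, features); h0: (batch, hidden)."""
        # Sample the dropout mask once and reuse it at every timestep.
        mask = (rng.random(h0.shape) < keep_prob) / keep_prob
        h, outputs = h0, []
        for x_t in x_seq:
            h = rnn_step(x_t, h) * mask
            outputs.append(h)
        return np.stack(outputs), h
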
Highly Cited · 2016
Deep learning tools have gained tremendous attention in applied machine learning. However such tools for regression and…

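The abstract breaks off mid-sentence; as general context only (not necessarily what this paper proposes), a common way to extract uncertainty estimates from a dropout-trained model is so-called Monte Carlo dropout: keep dropout active at prediction time and average several stochastic forward passes, using their spread as an uncertainty signal. A sketch with a hypothetical model function:

    import numpy as np

    def mc_dropout_predict(stochastic_forward, x, n_samples=50):
        """stochastic_forward(x) must apply dropout even at prediction time."""
        samples = np.stack([stochastic_forward(x) for _ in range(n_samples)])
        return samples.mean(axis=0), samples.std(axis=0)  # prediction, uncertainty
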
Highly Cited · 2015
Deep Neural Networks (DNN) have achieved state-of-the-art results in a wide range of tasks, with the best results obtained with…

Highly Cited · 2014
Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious…

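As general context for how dropout-trained networks are used at prediction time (not a claim about this particular paper's exact presentation): an alternative to the train-time rescaling shown earlier is to train with plain binary masks and then multiply the weights by the retention probability at test time, so that expected pre-activations match. A sketch with shapes and names chosen for this example:

    import numpy as np

    def dense_with_dropout(x, W, b, keep_prob=0.5, train=True,
                           rng=np.random.default_rng()):
        # x: (batch, n_in), W: (n_in, n_out), b: (n_out,)
        if train:
            mask = rng.random(x.shape) < keep_prob   # drop this layer's inputs
            return (x * mask) @ W + b                # no rescaling here
        return x @ (keep_prob * W) + b               # scale weights at test time
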
Highly Cited · 2014
We present a simple regularization technique for Recurrent Neural Networks (RNNs) with Long Short-Term Memory (LSTM) units…

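The abstract is cut off before it describes the technique itself, so the following is an illustrative assumption about the general approach rather than a reproduction of the paper: a common way to use dropout with stacked LSTMs is to apply it only to the non-recurrent connections (layer inputs and outputs), leaving the hidden-to-hidden recurrent connections untouched. The lstm step functions are hypothetical layer transition functions.

    import numpy as np

    def _drop(x, keep_prob, rng):
        return x * (rng.random(x.shape) < keep_prob) / keep_prob

    def stacked_lstm_step(lstm_steps, x_t, states, keep_prob=0.8,
                          rng=np.random.default_rng()):
        """One timestep through a stack of LSTM layers."""
        inp = _drop(x_t, keep_prob, rng)          # non-recurrent input: dropout
        new_states = []
        for step, state in zip(lstm_steps, states):
            h, state = step(inp, state)           # recurrent path: no dropout
            new_states.append(state)
            inp = _drop(h, keep_prob, rng)        # output to next layer: dropout
        return inp, new_states
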
Highly Cited · 2013
We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within…

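DropConnect differs from Dropout in what gets masked: Dropout zeroes activations, whereas DropConnect zeroes individual weights of a fully connected layer. A minimal sketch follows; the 1/keep_prob rescaling and the plain test-time pass are simplifications chosen for this example, not details taken from the paper.

    import numpy as np

    def dropconnect_dense(x, W, b, keep_prob=0.5, train=True,
                          rng=np.random.default_rng()):
        # x: (batch, n_in), W: (n_in, n_out), b: (n_out,)
        if train:
            mask = rng.random(W.shape) < keep_prob   # one mask entry per weight
            return x @ (W * mask) / keep_prob + b
        return x @ W + b                             # full weight matrix at test time
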
Highly Cited · 2013
Recently, pre-trained deep neural networks (DNNs) have outperformed traditional acoustic models based on Gaussian mixture models…

Highly Cited · 2012
We trained a large, deep convolutional neural network to classify the 1.2 million high-resolution images in the ImageNet LSVRC…

Highly Cited · 2012
When a large feedforward neural network is trained on a small training set, it typically performs poorly on held-out test data…