Dropout (neural networks)
Truncated excerpts from papers on this topic:

- Recurrent neural networks (RNNs) stand at the forefront of many recent developments in deep learning. Yet a major difficulty with…
- Deep learning tools have gained tremendous attention in applied machine learning. However such tools for regression and…
- We introduce a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the…
- Deep Neural Networks (DNN) have achieved state-of-the-art results in a wide range of tasks, with the best results obtained with…
- Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious…
- We present a simple regularization technique for Recurrent Neural Networks (RNNs) with Long Short-Term Memory (LSTM) units…
- Recently, pre-trained deep neural networks (DNNs) have outperformed traditional acoustic models based on Gaussian mixture models…
- We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within…
- We trained a large, deep convolutional neural network to classify the 1.2 million high-resolution images in the ImageNet LSVRC…
- When a large feedforward neural network is trained on a small training set, it typically performs poorly on held-out test data…
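The papers above treat dropout as a regularizer that randomly zeroes units during training. A minimal NumPy sketch of the common "inverted dropout" variant (a generic illustration, not the exact code of any paper listed; the function name and signature are my own) looks like this:

```python
import numpy as np

def dropout(x, p=0.5, rng=None, training=True):
    """Inverted dropout: zero each unit with probability p and scale the
    survivors by 1/(1-p), so the expected activation matches test time.
    At test time (training=False) the input passes through unchanged."""
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p        # keep each unit with prob. 1 - p
    return x * mask / (1.0 - p)

activations = np.ones((4, 8))
dropped = dropout(activations, p=0.5, rng=np.random.default_rng(0))
```

With p = 0.5, every entry of `dropped` is either 0 (the unit was dropped) or 2 (kept and rescaled by 1/(1-p)), so the expectation of each activation stays 1.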