Recurrent Neural Network Regularization

Abstract

We present a simple regularization technique for Recurrent Neural Networks (RNNs) with Long Short-Term Memory (LSTM) units. Dropout, the most successful technique for regularizing neural networks, does not work well with RNNs and LSTMs. In this paper, we show how to correctly apply dropout to LSTMs, and show that it substantially reduces overfitting on a variety of tasks. These tasks include language modeling, speech recognition, and machine translation.
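
As a rough illustration of the idea the abstract describes, the sketch below applies dropout only to the non-recurrent connections of a stacked LSTM (input-to-layer, layer-to-layer, and layer-to-output), while the recurrent hidden-to-hidden path is left untouched from one time step to the next. This is a minimal sketch in PyTorch, not the paper's original implementation; the class name, layer sizes, and dropout rate are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DropoutLSTM(nn.Module):
    """Two-layer LSTM language model where dropout is applied only to the
    non-recurrent (vertical) connections, never to the recurrent
    hidden-to-hidden path. Names and sizes here are illustrative."""

    def __init__(self, vocab_size, hidden_size=200, p_drop=0.5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.cell1 = nn.LSTMCell(hidden_size, hidden_size)
        self.cell2 = nn.LSTMCell(hidden_size, hidden_size)
        self.drop = nn.Dropout(p_drop)   # used on vertical connections only
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens):
        # tokens: (seq_len, batch) tensor of word indices
        batch = tokens.size(1)
        h1 = c1 = tokens.new_zeros(batch, self.cell1.hidden_size, dtype=torch.float)
        h2 = c2 = tokens.new_zeros(batch, self.cell2.hidden_size, dtype=torch.float)
        logits = []
        for x_t in self.embed(tokens):                        # step over time
            h1, c1 = self.cell1(self.drop(x_t), (h1, c1))     # dropout: input -> layer 1
            h2, c2 = self.cell2(self.drop(h1), (h2, c2))      # dropout: layer 1 -> layer 2
            logits.append(self.out(self.drop(h2)))            # dropout: layer 2 -> softmax
            # (h1, c1) and (h2, c2) carry over to the next time step
            # without dropout, so the recurrent connections stay intact.
        return torch.stack(logits)                            # (seq_len, batch, vocab_size)

# Example usage with an assumed vocabulary size and batch shape:
model = DropoutLSTM(vocab_size=10000)
tokens = torch.randint(0, 10000, (35, 20))   # (seq_len, batch)
logits = model(tokens)                       # (35, 20, 10000)
```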
