LSTM recurrent networks learn simple context-free and context-sensitive languages

@article{Gers2001LSTMRN,
  title={LSTM recurrent networks learn simple context-free and context-sensitive languages},
  author={Felix A. Gers and J{\"u}rgen Schmidhuber},
  journal={IEEE Transactions on Neural Networks},
  year={2001},
  volume={12},
  number={6},
  pages={1333--1340}
}
Previous work on learning regular languages from exemplary training sequences showed that long short-term memory (LSTM) outperforms traditional recurrent neural networks (RNNs). We demonstrate LSTM's superior performance on context-free language (CFL) benchmarks for RNNs, and show that it works even better than previous hardwired or highly specialized architectures. To the best of our knowledge, LSTM variants are also the first RNNs to learn a simple context-sensitive language (CSL), namely a^n b^n c^n.
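
To make the task concrete, the sketch below trains a small LSTM for next-symbol prediction on a^n b^n c^n strings delimited by begin/end markers. It is a minimal illustration in PyTorch under assumed hyperparameters, not the paper's setup: the paper uses its own LSTM variant with forget gates, its own training ranges, and targets that are sets of legal next symbols (e.g., after an 'a' either 'a' or 'b' may follow) rather than a single class.

import torch
import torch.nn as nn

# Vocabulary with begin (S) and end (T) markers (names are illustrative).
SYMBOLS = ["S", "a", "b", "c", "T"]
IDX = {s: i for i, s in enumerate(SYMBOLS)}

def make_string(n):
    """Return input/target index tensors for the string S a^n b^n c^n T."""
    seq = ["S"] + ["a"] * n + ["b"] * n + ["c"] * n + ["T"]
    ids = torch.tensor([IDX[s] for s in seq])
    return ids[:-1], ids[1:]  # inputs and next-symbol targets

class NextSymbolLSTM(nn.Module):
    def __init__(self, vocab=len(SYMBOLS), hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, x):                # x: (batch, length)
        h, _ = self.lstm(self.embed(x))  # h: (batch, length, hidden)
        return self.out(h)               # logits over the next symbol

model = NextSymbolLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):
    n = torch.randint(1, 11, (1,)).item()      # train on short strings, n = 1..10
    x, y = make_string(n)
    logits = model(x.unsqueeze(0)).squeeze(0)  # (length, vocab)
    loss = loss_fn(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()

Generalization can then be probed by checking whether the trained network still predicts the deterministic positions (the b's, the c's, and the end marker) for values of n larger than any seen during training, which is the kind of test the paper's results are about.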
