LSTM recurrent networks learn simple context-free and context-sensitive languages

@article{gers2001lstm,
  title={LSTM recurrent networks learn simple context-free and context-sensitive languages},
  author={Felix A. Gers and J{\"u}rgen Schmidhuber},
  journal={IEEE Transactions on Neural Networks},
  volume={12},
  number={6},
  year={2001}
}
Previous work on learning regular languages from exemplary training sequences showed that long short-term memory (LSTM) outperforms traditional recurrent neural networks (RNNs). We demonstrate LSTM's superior performance on context-free language benchmarks for RNNs, and show that it works even better than previous hardwired or highly specialized architectures. To the best of our knowledge, LSTM variants are also the first RNNs to learn a simple context-sensitive language, namely a^n b^n c^n.
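For readers unfamiliar with the benchmark, the language a^n b^n c^n consists of strings with equal-length runs of a's, b's, and c's (e.g. "abc", "aabbcc"); it is context-sensitive because matching three counts cannot be done by a context-free grammar. A minimal sketch (not from the paper; function names are illustrative) for generating such exemplary training strings and checking membership:

```python
def anbncn(n):
    """Build the string a^n b^n c^n, e.g. anbncn(2) == 'aabbcc'."""
    return "a" * n + "b" * n + "c" * n

def in_language(s):
    """Membership test for a^n b^n c^n with n >= 1:
    the string must split into three equal runs of a's, b's, c's."""
    n, remainder = divmod(len(s), 3)
    return remainder == 0 and n > 0 and s == anbncn(n)
```

Sequences like these, presented symbol by symbol, are the kind of training data on which the network must learn to predict the next legal symbol, which requires keeping count of n across the whole string.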

