Memory-Augmented Recurrent Neural Networks Can Learn Generalized Dyck Languages
TLDR
This work provides the first demonstration of neural networks recognizing the generalized Dyck languages, which capture the core of what it means for a language to have hierarchical structure.
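For reference, membership in a generalized Dyck language is decided by the classic stack algorithm that a memory-augmented network must implicitly approximate. A minimal Python sketch (the three bracket pairs are an illustrative default, not the paper's exact alphabet):

```python
def is_dyck(s, pairs=None):
    """Stack-based recognizer for a generalized Dyck language.

    `pairs` maps each opening bracket to its closing bracket; the
    three pairs below (Dyck-3) are an illustrative default.
    """
    if pairs is None:
        pairs = {"(": ")", "[": "]", "{": "}"}
    closers = set(pairs.values())
    stack = []
    for ch in s:
        if ch in pairs:            # opener: remember the closer we expect
            stack.append(pairs[ch])
        elif ch in closers:        # closer: must match the most recent opener
            if not stack or stack.pop() != ch:
                return False
        else:
            return False           # symbol outside the bracket alphabet
    return not stack               # accept iff every opener was closed

assert is_dyck("([]{()})")
assert not is_dyck("([)]")
```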
LSTM Networks Can Perform Dynamic Counting
TLDR
This work introduces the shuffle languages as a tool for analyzing the computational power of neural networks, and shows that a single-layer LSTM with a single hidden unit suffices in practice to recognize the Dyck-1 language.
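The Dyck-1 result has a simple intuition: with only one bracket pair, matching reduces to maintaining a single counter, which is exactly the arithmetic one LSTM memory cell can track. A minimal counter-based recognizer (assuming the two-symbol alphabet '(' and ')'):

```python
def is_dyck1(s):
    """Recognize Dyck-1 with a single counter: +1 on '(', -1 on ')'.

    Assumes the two-symbol alphabet {'(', ')'}. A string is accepted
    iff the counter never goes negative and ends at zero.
    """
    count = 0
    for ch in s:
        count += 1 if ch == "(" else -1
        if count < 0:    # a ')' arrived with no matching '('
            return False
    return count == 0

assert is_dyck1("(()())")
assert not is_dyck1("())(")
```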
On Evaluating the Generalization of LSTM Models in Formal Languages
TLDR
This paper empirically evaluates the ability of Long Short-Term Memory networks, a popular extension of simple RNNs, to inductively learn simple formal languages.
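Generalization in this setting is usually tested with a length split: train on short strings of a language such as a^n b^n and evaluate on strictly longer ones. A small sketch of that protocol (the language choice, cutoffs, and sample counts are illustrative, not the paper's exact configuration):

```python
import random

def anbn(n):
    """One string of the context-free language {a^n b^n : n >= 1}."""
    return "a" * n + "b" * n

def length_split(train_max=10, test_min=11, test_max=50, samples=1000):
    """Length-split evaluation: train on short strings, test on strictly
    longer ones, so accuracy measures extrapolation, not memorization."""
    train = [anbn(random.randint(1, train_max)) for _ in range(samples)]
    test = [anbn(n) for n in range(test_min, test_max + 1)]
    return train, test

train, test = length_split()
print(f"{len(train)} training strings, max length {2 * 10}")
print(f"{len(test)} test strings, lengths {2 * 11} to {2 * 50}")
```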