• Publications
Memory-Augmented Recurrent Neural Networks Can Learn Generalized Dyck Languages
We introduce three memory-augmented Recurrent Neural Networks (MARNNs) and explore their capabilities on a series of simple language modeling tasks whose solutions require stack-based mechanisms. We …
  • Open Access
LSTM Networks Can Perform Dynamic Counting
In this paper, we systematically assess the ability of standard recurrent networks to perform dynamic counting and to encode hierarchical representations. All the neural models in our experiments are …
  • Open Access
On Evaluating the Generalization of LSTM Models in Formal Languages
Recurrent Neural Networks (RNNs) are theoretically Turing-complete and have established themselves as a dominant model for language processing. Yet, there still remains uncertainty regarding their …
  • Open Access