Memory-Augmented Recurrent Neural Networks Can Learn Generalized Dyck Languages
We introduce three memory-augmented Recurrent Neural Networks (MARNNs) and explore their capabilities on a series of simple language modeling tasks whose solutions require stack-based mechanisms. We…
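To make "stack-based mechanisms" concrete: membership in a (generalized) Dyck language, i.e. well-nested brackets over several pair types, can be decided with a single stack. The sketch below is illustrative, not taken from the paper; the two bracket pairs are an assumed example alphabet.

```python
# Illustrative stack-based recognizer for a Dyck-style language over
# two bracket pairs, () and []. The alphabet choice is an assumption.
PAIRS = {")": "(", "]": "["}

def is_dyck(s: str) -> bool:
    """Return True iff s is a well-nested string over (), []."""
    stack = []
    for ch in s:
        if ch in "([":
            stack.append(ch)               # push the opener
        elif ch in PAIRS:
            if not stack or stack.pop() != PAIRS[ch]:
                return False               # unmatched or mismatched closer
        else:
            return False                   # symbol outside the alphabet
    return not stack                       # every opener was closed

print(is_dyck("([()])"))  # True
print(is_dyck("([)]"))    # False: crossing brackets are not well-nested
```

An RNN that models such strings must implicitly track this unbounded stack, which is why these tasks probe memory augmentation.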
LSTM Networks Can Perform Dynamic Counting
In this paper, we systematically assess the ability of standard recurrent networks to perform dynamic counting and to encode hierarchical representations. All the neural models in our experiments are…
On Evaluating the Generalization of LSTM Models in Formal Languages
Recurrent Neural Networks (RNNs) are theoretically Turing-complete and have established themselves as a dominant model for language processing. Yet uncertainty still remains regarding their…