Memory-Augmented Recurrent Neural Networks Can Learn Generalized Dyck Languages
This work provides the first demonstration of neural networks recognizing the generalized Dyck languages, which capture the core of what it means for a language to have hierarchical structure.
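Generalized Dyck languages consist of well-nested strings over several distinct bracket pairs; unlike the single-pair case, recognizing them requires remembering *which* brackets are open, i.e. stack-like memory. A minimal sketch of such a recognizer (a hypothetical illustration with an assumed two-pair alphabet, not the paper's model or code):

```python
def is_generalized_dyck(s: str, pairs: dict = None) -> bool:
    """Recognize a generalized Dyck language with an explicit stack.

    Each opener pushes its expected closer; each closer must match
    the top of the stack. A string is in the language iff the stack
    empties exactly at the end.
    """
    if pairs is None:
        pairs = {"(": ")", "[": "]"}  # assumed bracket alphabet
    closers = set(pairs.values())
    stack = []
    for ch in s:
        if ch in pairs:
            stack.append(pairs[ch])  # remember which closer we owe
        elif ch in closers:
            if not stack or stack.pop() != ch:
                return False  # unmatched or wrongly nested closer
        else:
            return False  # symbol outside the bracket alphabet
    return not stack  # every opener must be closed

print(is_generalized_dyck("([()])"))  # → True
print(is_generalized_dyck("([)]"))    # → False (crossed nesting)
```

The crossed-nesting example `([)]` is exactly what a plain counter cannot reject, which is why external memory is the interesting ingredient here.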
LSTM Networks Can Perform Dynamic Counting
- Mirac Suzgun, Sebastian Gehrmann, Yonatan Belinkov, S. Shieber
- Computer Science · Proceedings of the Workshop on Deep Learning and…
- 9 June 2019
This work is the first to use the shuffle languages to analyze the computational power of neural networks, and shows that a single-layer LSTM with only one hidden unit is practically sufficient for recognizing the Dyck-1 language.
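Dyck-1 is the language of balanced strings over a single bracket pair, so membership reduces to maintaining one running depth counter that must never go negative and must end at zero; this counting view is why one hidden unit can suffice in practice. A minimal sketch of the counter mechanism (a hypothetical illustration, not the paper's trained model):

```python
def is_dyck1(s: str) -> bool:
    """Recognize Dyck-1 with a single integer counter.

    '(' increments depth, ')' decrements it; the string is in the
    language iff depth never dips below zero and returns to zero.
    """
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:
                return False  # closer with no matching opener
        else:
            return False  # alphabet is just {'(', ')'}
    return depth == 0

print(is_dyck1("(()())"))  # → True
print(is_dyck1("())("))    # → False (depth goes negative)
```

A single counter, rather than a full stack, is all that dynamic counting demands here, which makes Dyck-1 a natural probe for whether an LSTM cell can learn to count.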
On Evaluating the Generalization of LSTM Models in Formal Languages
This paper empirically evaluates the ability of Long Short-Term Memory (LSTM) networks, a popular extension of simple RNNs, to inductively learn simple formal languages.
Formal Language Theory as a Framework for Understanding the Limitations of Recurrent Neural Networks
- Mirac Suzgun
- Computer Science
- 17 June 2020