Can recurrent neural networks learn natural language grammars?
@article{Lawrence1996CanRN,
  title   = {Can recurrent neural networks learn natural language grammars?},
  author  = {S. Lawrence and C. L. Giles and Sandiway Fong},
  journal = {Proceedings of International Conference on Neural Networks (ICNN'96)},
  year    = {1996},
  volume  = {4},
  pages   = {1853-1858 vol.4}
}
Recurrent neural networks are complex parametric dynamic systems that can exhibit a wide range of different behavior. We consider the task of grammatical inference with recurrent neural networks. Specifically, we consider the task of classifying natural language sentences as grammatical or ungrammatical: can a recurrent neural network be made to exhibit the same kind of discriminatory power which is provided by the principles and parameters linguistic framework, or government and binding theory…
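The task described in the abstract is binary sequence classification: a recurrent network reads a sentence one symbol at a time and judges it grammatical or ungrammatical. The sketch below is only an illustration of that general setup, not the specific architectures or training procedure used in the paper; the vocabulary of word-category IDs, the layer sizes, and all parameter names are assumptions, and training (e.g. backpropagation through time) is omitted.

```python
# Minimal sketch of an Elman-style recurrent network used as a
# grammaticality classifier. Illustrative only; sizes and names are assumed.
import numpy as np

rng = np.random.default_rng(0)

VOCAB_SIZE = 20   # word-category / part-of-speech symbols (assumed)
EMBED_DIM  = 8
HIDDEN_DIM = 16

# Randomly initialised parameters; training is omitted for brevity.
E    = rng.normal(0, 0.1, (VOCAB_SIZE, EMBED_DIM))   # symbol embeddings
W_xh = rng.normal(0, 0.1, (EMBED_DIM, HIDDEN_DIM))   # input -> hidden
W_hh = rng.normal(0, 0.1, (HIDDEN_DIM, HIDDEN_DIM))  # hidden -> hidden (recurrence)
b_h  = np.zeros(HIDDEN_DIM)
w_o  = rng.normal(0, 0.1, HIDDEN_DIM)                # hidden -> output score
b_o  = 0.0

def grammaticality_score(token_ids):
    """Run the recurrent net over a sentence and return P(grammatical)."""
    h = np.zeros(HIDDEN_DIM)             # initial hidden state
    for t in token_ids:                  # process the sentence left to right
        h = np.tanh(E[t] @ W_xh + h @ W_hh + b_h)
    logit = h @ w_o + b_o                # classify from the final state
    return 1.0 / (1.0 + np.exp(-logit))  # sigmoid -> probability

# Example: a sentence encoded as category IDs (purely illustrative).
sentence = [3, 7, 1, 12, 5]
print(f"P(grammatical) = {grammaticality_score(sentence):.3f}")
```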
20 Citations (3 shown)
- Noisy Time Series Prediction using Symbolic Representation and Recurrent Neural Network Grammatical Inference. Computer Science, 1998. 26 citations. PDF available.
- Architectures of neural networks applied for LVCSR language modeling. Computer Science. Neurocomputing, 2014. 6 citations.
- Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies. Computer Science. Transactions of the Association for Computational Linguistics, 2016. 451 citations. PDF available.