Distributed Representations, Simple Recurrent Networks, And Grammatical Structure
  • Jeffrey L. Elman
  • Machine Learning
  • Published 1 September 1991
  • Linguistics, Machine Learning
In this paper three problems for a connectionist account of language are considered:
  1. What is the nature of linguistic representations?
  2. How can complex structural relationships such as constituent structure be represented?
  3. How can the apparently open-ended nature of language be accommodated by a fixed-resource system?
Using a prediction task, a simple recurrent network (SRN) is trained on multiclausal sentences which contain multiply-embedded relative clauses. Principal component analysis…
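The SRN architecture behind the prediction task can be sketched in a few lines: at each time step the hidden layer receives the current word plus a copy of its own previous state, and the output is a distribution over possible next words. This is a minimal illustrative sketch with NumPy; the toy vocabulary, hidden size, and random weights are assumptions for demonstration, not the setup of the 1991 paper.

```python
# Minimal sketch of an Elman-style simple recurrent network (SRN) for
# next-word prediction. Vocabulary, hidden size, and weights are
# illustrative placeholders, not the paper's actual configuration.
import numpy as np

rng = np.random.default_rng(0)

vocab = ["boy", "girl", "who", "chases", "sees", "."]
V, H = len(vocab), 8  # vocabulary size, number of hidden units
idx = {w: i for i, w in enumerate(vocab)}

# Weights: input->hidden, hidden->hidden (the recurrent "context" loop),
# and hidden->output.
W_xh = rng.normal(0, 0.1, (H, V))
W_hh = rng.normal(0, 0.1, (H, H))
W_hy = rng.normal(0, 0.1, (V, H))

def one_hot(w):
    v = np.zeros(V)
    v[idx[w]] = 1.0
    return v

def step(h, w):
    """One SRN time step: update hidden state, return softmax over next word."""
    h = np.tanh(W_xh @ one_hot(w) + W_hh @ h)
    z = W_hy @ h
    p = np.exp(z - z.max())
    return h, p / p.sum()

# Run a sentence with an embedded relative clause through the (untrained)
# network; each step yields a probability distribution over the next word.
h = np.zeros(H)
for w in ["boy", "who", "chases", "girl", "sees", "."]:
    h, p = step(h, w)

print(p.shape)  # a distribution over the 6-word vocabulary
```

Training (omitted here) would adjust the weights by backpropagation so that each step's output distribution matches the word that actually follows; the hidden-state trajectories are what the paper analyzes with principal component analysis.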
Parsing Embedded Clauses with Distributed Neural Networks
A distributed neural network model called SPEC for processing sentences with recursive relative clauses is described; it exhibits plausible memory degradation as the depth of center embeddings increases, its memory is primed by earlier constituents, and its performance is aided by semantic constraints between the constituents.
A neural network model for acquisition of semantic structures
  • S.W.K. Chan, J. Franklin
  • Computer Science
    Proceedings of ICSIPNN '94. International Conference on Speech, Image Processing and Neural Networks
  • 1994
A neural network model in which simple recurrent network and recursive auto-association memory are combined to acquire the semantic structures from sentence constituents is described, which imposes no prior limit on sentence structures.
On the implicit acquisition of a context-free grammar by a simple recurrent neural network
Holistic processing of hierarchical structures in connectionist networks
The ability to distinguish and perform a number of different structure-sensitive operations is one step towards a connectionist architecture that is capable of modelling complex high-level cognitive tasks such as natural language processing and logical inference.
Distinct patterns of syntactic agreement errors in recurrent networks and humans
It is concluded that at least in some respects the syntactic representations acquired by RNNs are fundamentally different from those used by humans.
Constituency and recursion in language
Simulation results demonstrated that the SRNs exhibited the same kind of qualitative processing difficulties as humans on these two types of complex recursive constructions, suggesting that context-sensitivity may be a more pervasive feature of language processing than typically assumed by symbolic approaches.
Integrating Linguistic Primitives in Learning Context-Dependent Representation
The paper presents an explicit connectionist-inspired, language learning model in which the process of settling on a particular interpretation for a sentence emerges from the interaction of a set of…
Connectionist natural language parsing
Cognitive neurorobotics and self in the shared world, a focused review of ongoing research
Through brain-inspired modeling studies, cognitive neurorobotics aims to resolve dynamics essential to different emergent phenomena at the level of embodied agency in an object environment shared…


Meaning and the Structure of Language
The non-linguist who has conscientiously tried to keep abreast of developments in linguistic theory may well be ready to give up. Linguistics, especially transformational grammar, has matured…
On the proper treatment of connectionism
Abstract: A set of hypotheses is formulated for a connectionist approach to cognitive modeling. These hypotheses are shown to be incompatible with the hypotheses underlying traditional cognitive…
The Language of Thought
In a compelling defense of the speculative approach to the philosophy of mind, Jerry Fodor argues that, while our best current theories of cognitive psychology view many higher processes as…
Women, fire, and dangerous things : what categories reveal about the mind
"Its publication should be a major event for cognitive linguistics and should pose a major challenge for cognitive science. In addition, it should have repercussions in a variety of disciplines,…
A study of the ability to decode grammatically novel sentences
Putting together connectionism – again
On Linguistic Competence
Competence is one of the central concepts in the theorizing of the transformational school. The term has been introduced by Chomsky, who uses it mainly to characterize those issues that are the…
PDP models and general issues in cognitive science