Finding Structure in Time
@article{Elman1990FindingSI, title={Finding Structure in Time}, author={Jeffrey L. Elman}, journal={Cogn. Sci.}, year={1990}, volume={14}, pages={179-211} }
Time underlies many interesting human behaviors. … In this approach, hidden unit patterns are fed back to themselves; the internal representations which develop thus reflect task demands in the context of prior internal states. A set of simulations is reported which range from relatively simple problems (a temporal version of XOR) to discovering syntactic/semantic features for words. The networks are able to learn interesting internal representations which incorporate task demands with memory…
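The key method described above is simple enough to sketch directly. Below is a minimal NumPy sketch (not Elman's original code) of a simple recurrent network trained on the temporal XOR task: the hidden layer is copied into context units at each step and fed back as extra input, and, as in the 1990 simulations, error is backpropagated only one step rather than through the copy. The hidden size, learning rate, and the `xor_stream` helper are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def xor_stream(n_triples):
    """Bit stream where every third bit is the XOR of the previous two,
    so only those positions are predictable."""
    bits = []
    for _ in range(n_triples):
        a, b = rng.integers(0, 2, size=2)
        bits.extend([a, b, a ^ b])
    return np.asarray(bits, dtype=float)

H, lr = 4, 0.1                             # hidden size, learning rate (assumed)
W_in = rng.normal(0, 0.5, (H, 1 + H))      # [input bit, context] -> hidden
W_out = rng.normal(0, 0.5, (1, H))         # hidden -> predicted next bit
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

stream = xor_stream(3000)
context = np.zeros(H)
for t in range(len(stream) - 1):
    x = np.concatenate(([stream[t]], context))   # current bit + prior state
    h = sigmoid(W_in @ x)
    y = sigmoid(W_out @ h)
    # One-step backprop: gradients are not passed back through the
    # copied context units (the SRN was trained this way, not with BPTT).
    d_out = (y - stream[t + 1]) * y * (1 - y)
    d_hid = (W_out.T @ d_out) * h * (1 - h)
    W_out -= lr * np.outer(d_out, h)
    W_in -= lr * np.outer(d_hid, x)
    context = h                                  # feed the hidden state back

# Mean absolute error by position within the triple: only targets at
# positions t+1 ≡ 2 (mod 3) are predictable, so error should be lowest there.
errs, counts, context = np.zeros(3), np.zeros(3), np.zeros(H)
for t in range(len(stream) - 1):
    x = np.concatenate(([stream[t]], context))
    context = sigmoid(W_in @ x)
    pred = sigmoid(W_out @ context)[0]
    errs[(t + 1) % 3] += abs(pred - stream[t + 1])
    counts[(t + 1) % 3] += 1
print(errs / counts)
```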
9,855 Citations
Incremental sequence learning
- Psychology, Biology
- 1996
An alternative model based on Maskara & Noetzel's (1991) Auto-Associative Recurrent Network is suggested as a way to overcome the SRN model’s failure to account for human performance in several experimental situations meant to test the model's specific predictions.
Doing Without Schema Hierarchies: A Recurrent Connectionist Approach to Routine Sequential Action and Its Pathologies
- Biology
- 2000
This work considers an alternative framework in which the representation of temporal context depends on learned, recurrent connections within a network that maps from environmental inputs to actions, and indicates that recurrent connectionist models offer a useful framework for understanding routine sequential action.
Latent Attractors: A General Paradigm for Context-Dependent Neural Computation
- Psychology, Computer Science · Trends in Neural Computation
- 2007
This chapter describes an approach called latent attractors that allows self-organizing neural systems to simultaneously incorporate both Type I and Type II context dependency, and argues that the latent attractor approach is a general and flexible method for incorporating multi-scale temporal dependence into neural systems, and possibly other self-organized systems.
Finding event structure in time: What recurrent neural networks can tell us about event structure in mind
- Computer Science · Cognition
- 2021
What connectionist models learn: Learning and representation in connectionist networks
- Biology, Psychology · Behavioral and Brain Sciences
- 1990
It is shown that connectionist models can be used to explore systematically the complex interaction between learning and representation, as it is demonstrated through the analysis of several large networks.
Representation of Temporal Patterns in Recurrent Networks
- Computer Science
- 1993
This paper investigates the way temporal patterns are represented by recurrent networks by establishing the relationship between the temporal characteristics of the training set and the representations developed by the network.
Recurrent Neural Networks for Temporal Sequences Recognition
- Computer Science
- 1993
This report presents various tasks based on temporal pattern processing and the different neural network architectures simulated to tackle them.
Toward a connectionist model of recursion in human linguistic performance
- Computer Science · Cogn. Sci.
- 1999
This work suggests a novel explanation of people’s limited recursive performance, without assuming the existence of a mentally represented competence grammar allowing unbounded recursion.
The representation of structure in sequence prediction tasks
- Psychology
- 1994
Simulation studies are presented in which connectionist networks are trained to predict the last event of each sequence under the same conditions as human subjects. The results suggest that the representations developed by connectionist models are intermediate between abstract and exemplar-based representations, and that these two extremes are points on a continuum.
References
Showing 1-10 of 111 references
Learning Subsequential Structure in Simple Recurrent Networks
- Computer Science · NIPS
- 1988
A network architecture introduced by Elman (1988) for predicting successive elements of a sequence is illustrated with cluster analyses of the patterns of activation over a set of hidden units, performed at different points during training.
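To make the cluster analyses mentioned here concrete, the sketch below applies SciPy's agglomerative hierarchical clustering to hidden-unit activation vectors. The random `hidden_states` matrix is a stand-in for activations collected from a trained SRN; in the paper's analyses, each row would be the network's hidden pattern for a particular input item.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)

# Stand-in for hidden-unit vectors, one row per input item; in practice
# these would be activations recorded from the trained network.
hidden_states = rng.normal(size=(12, 4))
labels = [f"item{i}" for i in range(12)]

# Average-linkage hierarchical clustering of the hidden representations;
# the resulting tree shows which items the network represents similarly.
Z = linkage(hidden_states, method="average", metric="euclidean")
for name, c in zip(labels, fcluster(Z, t=3, criterion="maxclust")):
    print(name, "-> cluster", c)
```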
On the proper treatment of connectionism
- Psychology · Behavioral and Brain Sciences
- 1988
A set of hypotheses is formulated for a connectionist approach to cognitive modeling. These hypotheses are shown to be incompatible with the hypotheses underlying traditional cognitive…
Distributed Representations
- Computer Science · The Philosophy of Artificial Intelligence
- 1990
This report describes a type of representation that is less familiar and harder to think about than local representations, and that makes use of the processing abilities of networks of simple, neuron-like computing elements.
Parallel Networks that Learn to Pronounce
- Computer Science
- 1987
Hierarchical clustering techniques applied to NETtalk suggest that these different networks have similar internal representations of letter-to-sound correspondences within groups of processing units, which suggests that invariant internal representations may be found in assemblies of neurons intermediate in size between highly localized and completely distributed representations.
Parallel Networks that Learn to Pronounce English Text
- Computer Science · Complex Syst.
- 1987
Hierarchical clustering techniques applied to NETtalk reveal that these different networks have similar internal representations of letter-to-sound correspondences within groups of processing units, which suggests that invariant internal representations may be found in assemblies of neurons intermediate in size between highly localized and completely distributed representations.
A Dynamical Approach to Temporal Pattern Processing
- Computer Science · NIPS
- 1987
This work proposes an architecture in which time serves as its own representation, and temporal context is encoded in the state of the nodes, and contrasts this with the approach of replicating portions of the architecture to represent time.
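The contrast drawn here can be shown in a few lines: spatializing time as a fixed input window (the "replicated architecture" approach the paper argues against) versus letting temporal context live in an evolving node state. The leaky integrator below is an illustrative stand-in for the paper's dynamics, not its actual equations.

```python
import numpy as np

x = np.arange(10.0)  # a toy input stream

# Replicated architecture: time is spatialized as a fixed window of
# the last k inputs, so each pattern is a static snapshot.
k = 3
windows = np.stack([x[i:i + k] for i in range(len(x) - k + 1)])

# Time as its own representation: no window; a decaying state
# integrates the stream one element at a time.
state, decay, states = 0.0, 0.5, []
for x_t in x:
    state = decay * state + (1 - decay) * x_t
    states.append(state)
print(windows.shape, states[-1])
```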
Neural computation by concentrating information in time.
- Computer Science · Proceedings of the National Academy of Sciences of the United States of America
- 1987
An analog model neural network that can solve a general problem of recognizing patterns in a time-dependent signal is presented and can be understood from consideration of an energy function that is being minimized as the circuit computes.
Learning the hidden structure of speech.
- Computer Science · The Journal of the Acoustical Society of America
- 1988
The results of these studies demonstrate that backpropagation learning can be used with complex, natural data to identify a feature structure that can serve as the basis for both analysis and nontrivial pattern recognition.
Spoken word recognition processes and the gating paradigm
- Psychology · Perception & Psychophysics
- 1980
Words varying in length (one, two, and three syllables) and in frequency (high and low) were presented to subjects in isolation, in a short context, and in a long context to study more closely the narrowing-in process employed by listeners in the isolation and recognition of words.