Finding Structure in Time

@article{Elman1990FindingSI,
  title={Finding Structure in Time},
  author={Jeffrey L. Elman},
  journal={Cogn. Sci.},
  year={1990},
  volume={14},
  pages={179-211}
}
  • J. Elman
  • Published 1 March 1990
  • Computer Science
  • Cogn. Sci.
Time underlies many interesting human behaviors. [...] In this approach, hidden unit patterns are fed back to themselves; the internal representations which develop thus reflect task demands in the context of prior internal states. A set of simulations is reported which range from relatively simple problems (temporal version of XOR) to discovering syntactic/semantic features for words. The networks are able to learn interesting internal representations which incorporate task demands with memory…
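The architecture described in the abstract is the simple recurrent network (SRN): at each time step the hidden-unit activations are copied into a bank of context units, which are fed back as extra input on the next step, so the hidden layer sees the current input together with a compressed trace of its own history. As a minimal sketch (not the paper's code; the hidden size, learning rate, and one-step truncated backpropagation are assumptions chosen for brevity), the following trains such a network on the temporal XOR task mentioned above, where only every third bit of the stream is predictable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Temporal XOR stream: bits arrive in triples (a, b, a XOR b), so only
# every third bit is predictable from the two that precede it.
def temporal_xor_stream(n_triples):
    a = rng.integers(0, 2, n_triples)
    b = rng.integers(0, 2, n_triples)
    return np.stack([a, b, a ^ b], axis=1).reshape(-1).astype(float)

H, lr = 4, 0.3                                  # hidden size, learning rate (arbitrary)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
W_xh = rng.normal(0.0, 0.5, (2 + H, H))         # [input bit, bias, context] -> hidden
W_hy = rng.normal(0.0, 0.5, (H + 1, 1))         # [hidden, bias] -> output

stream = temporal_xor_stream(3000)
context = np.zeros(H)                           # context units start at zero
for t in range(len(stream) - 1):
    x = np.concatenate([[stream[t], 1.0], context])   # current bit, bias, prior state
    h = sigmoid(x @ W_xh)                              # hidden activations
    h_b = np.append(h, 1.0)
    y = sigmoid(h_b @ W_hy)                            # predicted next bit
    target = stream[t + 1]

    # One-step backpropagation; the context is treated as a constant input,
    # so no gradient flows back through the copy operation.
    dy = (y - target) * y * (1.0 - y)
    dh = (dy @ W_hy[:-1].T) * h * (1.0 - h)
    W_hy -= lr * np.outer(h_b, dy)
    W_xh -= lr * np.outer(x, dh)

    context = h                                        # copy hidden -> context units
```

After training, prediction error should drop on the predictable every-third bits while staying near chance on the random bits in between, which is the signature Elman reports for this task.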
Incremental sequence learning
An alternative model based on Maskara & Noetzel's (1991) Auto-Associative Recurrent Network is suggested as a way to overcome the SRN model’s failure to account for human performance in several experimental situations meant to test the model's specific predictions.
Doing Without Schema Hierarchies: A Recurrent Connectionist Approach to Routine Sequential Action and Its Pathologies
In everyday tasks, selecting actions in the proper sequence requires a continuously updated representation of temporal context. Many existing models address this problem by positing a hierarchy of…
Latent Attractors: A General Paradigm for Context-Dependent Neural Computation
This chapter describes an approach called latent attractors that allows self-organizing neural systems to simultaneously incorporate both Type I and Type II context dependency, and argues that the latent attractor approach is a general and flexible method for incorporating multi-scale temporal dependence into neural systems, and possibly other self-organized systems.
Representation of Temporal Patterns in Recurrent Networks
In order to determine the manner in which temporal patterns are represented in recurrent neural networks, networks trained on a variety of sequence recognition tasks are examined. Analysis of the…
Recurrent Neural Networks for Temporal Sequences Recognition
Time is central to many human tasks: talking, listening, reading, and writing are all time-related. Integrating a notion of time into neural networks is therefore important in order to deal…
Toward a connectionist model of recursion in human linguistic performance
This work suggests a novel explanation of people's limited recursive performance, without assuming the existence of a mentally represented competence grammar allowing unbounded recursion.
The representation of structure in sequence prediction tasks
Simulation studies are presented in which connectionist networks are trained to predict the last event of the sequences under the same conditions as human subjects. The results suggest that the kind of representations developed by connectionist models is intermediate between abstract representations and exemplar-based representations, and that these two extreme forms of representation are points on a continuum.
Currents in connectionism
Four significant advances on the feedforward architecture of connectionism increase the usefulness of connectionist networks for modeling human cognitive performance, providing tools for explaining the productivity and systematicity of some mental activities and developing representations that are sensitive to the content they are to represent.

References

Showing 1-10 of 110 references
Learning Subsequential Structure in Simple Recurrent Networks
A network architecture introduced by Elman (1988) for predicting successive elements of a sequence is examined; the pattern of activation over its hidden units is illustrated with cluster analyses performed at different points during training.
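The cluster analyses mentioned in this entry can be reproduced with standard tools: collect one hidden-unit activation vector per input token (for instance, averaged over every context in which the token occurs), then run agglomerative clustering over those vectors. A minimal sketch with placeholder labels and random vectors standing in for real activations (nothing here is data from the paper):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram

# Placeholder data: one mean hidden-activation vector per word, e.g.
# averaged over every context in which that word occurred during training.
labels = ["cat", "dog", "eat", "chase", "cookie"]
hidden_means = np.random.default_rng(1).normal(size=(len(labels), 8))

# Agglomerative (hierarchical) clustering of the activation vectors;
# words with similar internal representations merge early in the tree.
tree = linkage(hidden_means, method="ward")
dendrogram(tree, labels=labels)
plt.show()
```

Repeating the same analysis on activations saved at successive points in training gives a picture of how the hidden-unit representation differentiates as learning proceeds, which is the kind of analysis this reference describes.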
Distributed Representations
This report describes a different type of representation that is less familiar and harder to think about than local representations, which makes use of the processing abilities of networks of simple, neuron-like computing elements.
Parallel Networks that Learn to Pronounce
Hierarchical clustering techniques applied to NETtalk suggest that these different networks have similar internal representations of letter-to-sound correspondences within groups of processing units, which suggests that invariant internal representations may be found in assemblies of neurons intermediate in size between highly localized and completely distributed representations.
Parallel Networks that Learn to Pronounce English Text
Hierarchical clustering techniques applied to NETtalk reveal that these different networks have similar internal representations of letter-to-sound correspondences within groups of processing units, which suggests that invariant internal representations may be found in assemblies of neurons intermediate in size between highly localized and completely distributed representations.
A Dynamical Approach to Temporal Pattern Processing
This work proposes an architecture in which time serves as its own representation and temporal context is encoded in the state of the nodes, contrasting this with the approach of replicating portions of the architecture to represent time.
Neural computation by concentrating information in time.
  • D. Tank, J. Hopfield
  • Computer Science, Medicine
    Proceedings of the National Academy of Sciences of the United States of America
  • 1987
An analog model neural network that can solve a general problem of recognizing patterns in a time-dependent signal is presented and can be understood from consideration of an energy function that is being minimized as the circuit computes.
The temporal structure of spoken language understanding
The combined results provided evidence for an on-line interactive language processing theory, in which lexical, structural, and interpretative knowledge sources communicate and interact during processing in an optimally efficient and accurate manner.
Learning the hidden structure of speech.
  • J. Elman, D. Zipser
  • Computer Science, Medicine
    The Journal of the Acoustical Society of America
  • 1988
The results of these studies demonstrate that backpropagation learning can be used with complex, natural data to identify a feature structure that can serve as the basis for both analysis and nontrivial pattern recognition.
Spoken word recognition processes and the gating paradigm
  • F. Grosjean
  • Computer Science, Medicine
    Perception & Psychophysics
  • 1980
Words varying in length (one, two, and three syllables) and in frequency (high and low) were presented to subjects in isolation, in a short context, and in a long context to study more closely the narrowing-in process employed by listeners in the isolation and recognition of words.
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations
The fundamental principles, basic mechanisms, and formal analyses involved in the development of parallel distributed processing (PDP) systems are presented in individual chapters contributed by…