Encoding sensory and motor patterns as time-invariant trajectories in recurrent neural networks

@article{Goudar2017EncodingSA,
  title={Encoding sensory and motor patterns as time-invariant trajectories in recurrent neural networks},
  author={Vishwa Goudar and Dean V. Buonomano},
  journal={eLife},
  year={2017},
  volume={7}
}
Much of the information the brain processes and stores is temporal in nature—a spoken word or a handwritten signature, for example, is defined by how it unfolds in time. However, it remains unclear how neural circuits encode complex time-varying patterns. We show that by tuning the weights of a recurrent neural network (RNN), it can recognize and then transcribe spoken digits. The model elucidates how neural dynamics in cortical networks may resolve three fundamental challenges: first, encode… 
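As a minimal sketch of the general idea (a recurrent network turns a temporal input pattern into a reproducible trajectory that linear weights can transcribe), the following NumPy toy fits only a ridge-regression readout over a random rate network's trajectory; the paper tunes the recurrent weights themselves, and the network size, gain, and target signal here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, dt, tau, g = 300, 500, 1.0, 10.0, 1.5

# Random recurrent weights with gain g > 1, giving rich ongoing dynamics.
W = g * rng.standard_normal((N, N)) / np.sqrt(N)

# Integrate the rate equation tau dx/dt = -x + W tanh(x) and store the
# population trajectory r(t).
x = 0.1 * rng.standard_normal(N)
rates = np.empty((T, N))
for t in range(T):
    r = np.tanh(x)
    rates[t] = r
    x += (dt / tau) * (-x + W @ r)

# A time-varying target the trajectory should transcribe (a stand-in for a
# spoken-digit output pattern).
time = np.arange(T) * dt
target = np.sin(2 * np.pi * time / 100.0) * np.exp(-time / 400.0)

# Ridge-regularized least-squares readout from the trajectory.
lam = 1e-3
w_out = np.linalg.solve(rates.T @ rates + lam * np.eye(N), rates.T @ target)
output = rates @ w_out
err = np.sqrt(np.mean((output - target) ** 2))  # small: the trajectory encodes the pattern
```

Because the trajectory is reproducible, a fixed linear readout suffices to regenerate the whole time-varying output from the network's internal state.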
Learning Long Temporal Sequences in Spiking Networks by Multiplexing Neural Oscillations
TLDR
It is suggested that combining oscillatory neuronal inputs with different frequencies provides a key mechanism to generate precisely timed sequences of activity in recurrent circuits of the brain.
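The multiplexing idea can be illustrated with an elementary toy (an illustration of the principle only, not the spiking model): two oscillations with coprime periods jointly disambiguate times far beyond either oscillation's own period.

```python
# Periods 7 and 11 are coprime, so the pair of instantaneous phases uniquely
# tags every time step across the full 77-step cycle, even though each
# oscillation on its own repeats after only 7 or 11 steps.
p1, p2 = 7, 11
phases = [(t % p1, t % p2) for t in range(p1 * p2)]
assert len(set(phases)) == p1 * p2          # joint phase is a unique time code
assert len({a for a, _ in phases}) == p1    # either phase alone is ambiguous
```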
Encoding time in neural dynamic regimes with distinct computational tradeoffs
TLDR
The results predict that apparently similar neural dynamic patterns at the population level can exhibit fundamentally different computational properties in regards to their ability to generalize to novel stimuli and their robustness to noise—and that these differences are associated with differences in network connectivity and distinct contributions of excitatory and inhibitory neurons.
A neural circuit model for human sensorimotor timing
TLDR
A circuit-level model is developed that provides insight into how the brain coordinates movement times with expected and unexpected temporal events in the domain of sensorimotor timing and shows how recurrent interactions in a simple and modular neural circuit could create the dynamics needed to control temporal aspects of behavior.
Coding with transient trajectories in recurrent neural networks
TLDR
This work examines transient coding in a broad class of high-dimensional linear networks of recurrently connected units and builds minimal, low-rank networks that robustly implement trajectories mapping a specific input onto a specific orthogonal output state.
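A minimal NumPy sketch of such a low-rank transient code, assuming rank-one connectivity W = s * m nᵀ with orthonormal input pattern n and output pattern m (the gain s = 5 and network size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
N, dt, steps = 100, 0.01, 1000

# Rank-one connectivity: activity placed on the input pattern n is transiently
# mapped onto the orthogonal output pattern m, then decays back to rest.
m = rng.standard_normal(N); m /= np.linalg.norm(m)
n = rng.standard_normal(N); n -= (n @ m) * m; n /= np.linalg.norm(n)
W = 5.0 * np.outer(m, n)

x = n.copy()                     # start on the input pattern
overlap = np.empty(steps)
for t in range(steps):
    overlap[t] = x @ m           # projection onto the output pattern
    x += dt * (-x + W @ x)       # linear dynamics dx/dt = -x + W x

# overlap starts at 0, transiently peaks near 5/e ~ 1.84, then decays to 0.
```

The non-normal (feedforward-like) structure of W is what produces the transient: neither pattern is a stable mode, yet the input is faithfully routed to the orthogonal output on the way back to rest.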
A model of temporal scaling correctly predicts that motor timing improves with speed
TLDR
It is shown that a recurrent neural network can be trained to exhibit temporal scaling obeying Weber’s law; the model’s prediction of improved movement precision at faster speeds is validated, suggesting a Weber-speed effect.
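The speed-scaling mechanism can be caricatured in one dimension (a hedged sketch: a single ramp-to-threshold variable stands in for the network trajectory, and the noise that yields the Weber-law prediction is omitted):

```python
def interval(speed, thresh=0.95, dt=0.001):
    """Time for the ramp dx/dt = speed * (1 - x) to reach `thresh` from 0."""
    x, t = 0.0, 0.0
    while x < thresh:
        x += dt * speed * (1.0 - x)
        t += dt
    return t

t_fast = interval(1.0)
t_slow = interval(0.5)
# Halving the evolution speed doubles the produced interval: the same
# "trajectory" is simply traversed more slowly.
```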
Stimulus-Driven and Spontaneous Dynamics in Excitatory-Inhibitory Recurrent Neural Networks for Sequence Representation
TLDR
A general framework for understanding neural sequence representation in the excitatory-inhibitory RNN is provided and the stability of dynamic attractors while training the RNN to learn two sequences is examined.
Understanding the computation of time using neural network models
TLDR
It is found that neural networks perceive time through state evolution along stereotypical trajectories and produce time intervals by scaling evolution speed, and four factors that facilitate strong temporal signals in non-timing tasks, including the anticipation of coming events, are identified.
Thunderstruck: The ACDC model of flexible sequences and rhythms in recurrent neural circuits
TLDR
A biologically plausible recurrent neural network of cortical dynamics is augmented to include a basal ganglia-thalamic module which uses reinforcement learning to dynamically modulate action, and this “associative cluster-dependent chain” (ACDC) model modularly stores sequence and timing information in distinct loci of the network.
Time representation in neural network models trained to perform interval timing tasks
TLDR
Fundamental principles of the neuronal coding of time that support flexible temporal processing in the brain are disclosed, and these principles facilitate generalizable decoding of time and non-time information.

References

Showing 1-10 of 81 references
Robust timing and motor patterns by taming chaos in recurrent neural networks
TLDR
A firing rate model is developed that tells time on the order of seconds and generates complex spatiotemporal patterns in the presence of high levels of noise and provides a feature that is characteristic of biological systems: the ability to 'return' to the pattern being generated in the face of perturbations.
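The chaos being tamed can be demonstrated directly (a toy NumPy rate network with gain g > 1, not the paper's trained model): two simulations started from almost identical initial conditions diverge to macroscopically different states.

```python
import numpy as np

rng = np.random.default_rng(3)
N, tau, dt, g, steps = 200, 10.0, 1.0, 1.8, 2000
W = g * rng.standard_normal((N, N)) / np.sqrt(N)   # g > 1: chaotic regime

def run(x0):
    """Integrate tau dx/dt = -x + W tanh(x) from x0; return the final state."""
    x = x0.copy()
    for _ in range(steps):
        x += (dt / tau) * (-x + W @ np.tanh(x))
    return x

x0 = rng.standard_normal(N)
a = run(x0)
b = run(x0 + 1e-6 * rng.standard_normal(N))
sep = np.linalg.norm(a - b)
# The 1e-6-sized perturbation grows to a macroscopic separation: this is the
# sensitivity that training must tame for timing to be reproducible.
```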
Learning multiple variable-speed sequences in striatum via cortical tutoring
TLDR
A model of the striatum, an all-inhibitory circuit where sequential activity patterns are prominent, is constructed, addressing a key challenge: obtaining control over temporal rescaling of sequence speed, with the ability to generalize to new speeds.
Complexity without chaos: Plasticity within random recurrent networks generates robust timing and motor control
TLDR
This work demonstrates how random recurrent networks that initially exhibit chaotic dynamics can be tuned through a supervised learning rule to generate locally stable neural patterns of activity that are both complex and robust to noise.
Robust neuronal dynamics in premotor cortex during motor planning
TLDR
It is shown that preparatory activity is remarkably robust to large-scale unilateral silencing: detailed neural dynamics that drive specific future movements were quickly and selectively restored by the network.
The neural basis of temporal processing.
TLDR
It is suggested that, given the intricate link between temporal and spatial information in most sensory and motor tasks, timing and spatial processing are intrinsic properties of neural function, and specialized timing mechanisms such as delay lines, oscillators, or a spectrum of different time constants are not required.
Recurrent Network Models of Sequence Generation and Memory
Temporal scaling of neural responses to compressed and dilated natural speech.
TLDR
The data suggest that 1) the rate of neural information processing can be rescaled according to the rates of incoming information, both in early sensory regions and in higher-order cortices, and 2) the rescaling of neural dynamics is confined to a range of rates that matches the range of behavioral performance.
A Scalable Population Code for Time in the Striatum
Randomly Connected Networks Have Short Temporal Memory
TLDR
It is shown that when connectivity is high, as it is in the mammalian brain, randomly connected networks cannot exhibit temporal memory much longer than the time constants of their constituent neurons.
Understanding Emergent Dynamics: Using a Collective Activity Coordinate of a Neural Network to Recognize Time-Varying Patterns
TLDR
How the emergent computational dynamics of a biologically based neural network generates a robust natural solution to the problem of categorizing time-varying stimulus patterns such as spoken words or animal stereotypical behaviors is described.