Complexity without chaos: Plasticity within random recurrent networks generates robust timing and motor control

@inproceedings{Laje2012ComplexityWC,
  title={Complexity without chaos: Plasticity within random recurrent networks generates robust timing and motor control},
  author={Rodrigo Laje and Dean V. Buonomano},
  year={2012}
}
It is widely accepted that the complex dynamics characteristic of recurrent neural circuits contribute in a fundamental manner to brain function. Progress has been slow in understanding and exploiting the computational power of recurrent dynamics for two main reasons: nonlinear recurrent networks often exhibit chaotic behavior, and most known learning rules do not work in a robust fashion in recurrent networks. Here we address both of these problems by demonstrating how random recurrent networks… 
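The approach the abstract alludes to, training the recurrent weights themselves so that an "innate" trajectory of the naive network becomes robust to perturbations, can be illustrated compactly. Below is a minimal sketch, assuming standard rate-network equations (tau dx/dt = -x + W r, with r = tanh(x)) and illustrative parameters; the published method trains each unit's presynaptic weights with its own inverse-correlation matrix, whereas a single shared matrix P is used here for brevity.

```python
import numpy as np

# Minimal sketch of "innate training": record a trajectory of the naive
# chaotic network, then use recursive least squares (RLS) to adapt the
# recurrent weights so the same trajectory is reproduced under noise.
# All parameter values are illustrative, not the published ones.
rng = np.random.default_rng(0)
N, p, g, tau, dt, T = 200, 0.1, 1.5, 10.0, 1.0, 500   # g > 1: chaotic regime
W = g * rng.normal(0, 1/np.sqrt(p*N), (N, N)) * (rng.random((N, N)) < p)

def run(W, x0, T, noise=0.0):
    """Integrate tau*dx/dt = -x + W*tanh(x); return unit activity r(t)."""
    x, R = x0.copy(), np.zeros((T, N))
    for t in range(T):
        R[t] = np.tanh(x)
        x += dt/tau * (-x + W @ R[t]) + noise * rng.normal(size=N)
    return R

x0 = rng.normal(0, 0.5, N)
target = run(W, x0, T)            # the network's own "innate" trajectory

P = np.eye(N)                     # inverse correlation estimate (shared here)
x = x0.copy()
for t in range(T):                # one training pass; repeat in practice
    r = np.tanh(x)
    err = r - target[t]           # deviation from the innate trajectory
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)       # RLS gain vector
    P -= np.outer(k, Pr)
    W -= np.outer(err, k)         # rank-one update of the recurrent weights
    x += dt/tau * (-x + W @ r) + 0.001 * rng.normal(size=N)
```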

How single neuron properties shape chaotic dynamics and signal transmission in random neural networks

A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons, characterized by robust, narrow-band stochastic oscillations.

Encoding in Balanced Networks: Revisiting Spike Patterns and Chaos in Stimulus-Driven Systems

It is shown that strongly chaotic networks produce patterned spikes that reliably encode time-dependent stimuli: using a decoder sensitive to spike times on timescales of tens of milliseconds, one can easily distinguish responses to very similar inputs.
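As a concrete illustration of what a decoder sensitive to spike times on timescales of tens of milliseconds can look like, here is a hypothetical sketch; the kernel width, distance metric, and nearest-template rule are assumptions, not the paper's exact decoder.

```python
import numpy as np

def smooth(spikes, tau=20.0, dt=1.0):
    """Exponentially filter a (T, N) binary spike array with time constant tau (ms)."""
    T, N = spikes.shape
    s, out = np.zeros(N), np.zeros((T, N))
    for t in range(T):
        s = s * np.exp(-dt / tau) + spikes[t]   # leaky trace of recent spikes
        out[t] = s
    return out

def decode(trial, templates):
    """Classify a trial as the stimulus whose template response is nearest
    in the space of kernel-smoothed spike trains."""
    d = [np.linalg.norm(smooth(trial) - smooth(tmpl)) for tmpl in templates]
    return int(np.argmin(d))
```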

Driving reservoir models with oscillations: a solution to the extreme structural sensitivity of chaotic networks

This work introduces a new architecture where the reservoir is driven by a layer of oscillators that generate stable and repeatable trajectories that serve as a collection of inputs from which a network can robustly generate complex dynamics and implement rich computations.
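A minimal sketch of this architecture, with all parameters (oscillator frequencies, gains, network size) assumed for illustration: a bank of stable sinusoidal drivers provides repeatable input trajectories to an otherwise standard random reservoir, so downstream computations need not rely on the reservoir's structurally sensitive autonomous dynamics.

```python
import numpy as np

# Oscillator-driven reservoir (illustrative parameters): K sinusoids with
# fixed frequencies feed a random recurrent pool of N rate units.
rng = np.random.default_rng(1)
N, K, g, tau, dt, T = 300, 8, 1.2, 10.0, 1.0, 2000
W = g * rng.normal(0, 1/np.sqrt(N), (N, N))        # recurrent weights
W_in = rng.normal(0, 1.0, (N, K))                  # weights from oscillators
freqs = np.linspace(1.0, 8.0, K) / 1000.0          # 1-8 Hz in cycles per ms

t_grid = np.arange(T) * dt
osc = np.sin(2*np.pi * freqs[None, :] * t_grid[:, None])   # (T, K) drivers

x, R = np.zeros(N), np.zeros((T, N))
for t in range(T):
    R[t] = np.tanh(x)
    x += dt/tau * (-x + W @ R[t] + W_in @ osc[t])
# R now holds reservoir trajectories that repeat across trials because the
# drive does; a linear readout can be trained on R to produce target outputs.
```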

Learning recurrent dynamics in spiking networks

Modifying the recurrent connectivity with a recursive least squares algorithm provides sufficient flexibility for synaptic and spiking rate dynamics of spiking networks to produce a wide range of spatiotemporal activity.

An Investigation of the Dynamical Transitions in Harmonically Driven Random Networks of Firing-Rate Neurons

Changes in the dimensionality of a network’s dynamics are explained in terms of changes in the underlying structure of its vector field by analysing its stationary points, and the coexistence of attractors with various geometric forms in unstable networks is uncovered.

Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks

It is shown that a biologically plausible learning rule can train recurrent neural networks, guided solely by delayed, phasic rewards at the end of each trial, offering a plausible model of cortical dynamics during both learning and performance of flexible behavior.

Achieving stable dynamics in neural circuits

Mechanisms are found for achieving stability in multiple connected networks with biologically realistic dynamics, including synaptic plasticity and time-varying inputs, shedding light on how stable computations might be achieved despite biological complexity.

Fast and flexible sequence induction in spiking neural networks via rapid excitability changes

A simple yet previously unexplored combination of biological mechanisms that converge in the hippocampus and suffice for fast and flexible reconfiguration of sequential network dynamics is elucidated, suggesting their potential role in cognitive flexibility over rapid timescales.

Coding with transient trajectories in recurrent neural networks

This work examines transient coding in a broad class of high-dimensional linear networks of recurrently connected units and builds minimal, low-rank networks that robustly implement trajectories mapping a specific input onto a specific orthogonal output state.

Multiple-Timescale Neural Networks: Generation of History-Dependent Sequences and Inference Through Autonomous Bifurcations

This study develops a neural network with fast and slow dynamics, inspired by the hierarchy of timescales of neural activity in the cortex, and finds that a balance of timescales is critical to the coexistence period.
...

References


A learning rule for the emergence of stable dynamics and timing in recurrent networks.

D. Buonomano, Journal of Neurophysiology, 2005
A neural network model is used to examine the emergence of stable dynamical states within recurrent networks, establishing a learning rule by which cortical networks can potentially process temporal information in a self-organizing manner, in the absence of specialized timing mechanisms.

Embedding Multiple Trajectories in Simulated Recurrent Neural Networks in a Self-Organizing Manner

One of the first learning rules that can embed multiple trajectories, each of which recruits all neurons, within recurrent neural networks in a self-organizing manner is established.

Stimulus-dependent suppression of chaos in recurrent neural networks.

It is found that inputs not only drive network responses, but they also actively suppress ongoing activity, ultimately leading to a phase transition in which chaos is completely eliminated.

Generating Coherent Patterns of Activity from Chaotic Neural Networks

Chaos in Neuronal Networks with Balanced Excitatory and Inhibitory Activity

The hypothesis that the temporal variability in the firing of a neuron results from an approximate balance between its excitatory and inhibitory inputs was investigated theoretically.

Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

A new computational model for real-time computing on time-varying input is proposed that provides an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
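To make the framework concrete, here is a minimal echo-state-style sketch under assumed parameters: a fixed random recurrent pool is perturbed by a time-varying input, and only a linear readout is fit by ridge regression, in the spirit of combining high-dimensional dynamics with statistical learning theory (the task and all constants are illustrative).

```python
import numpy as np

# Fixed random reservoir + trained linear readout (all parameters assumed).
rng = np.random.default_rng(2)
N, T, dt, tau, g = 200, 1000, 1.0, 10.0, 0.9       # g < 1: non-chaotic pool
W = g * rng.normal(0, 1/np.sqrt(N), (N, N))
w_in = rng.normal(0, 1.0, N)

u = np.sin(2*np.pi * np.arange(T) * dt / 100.0)    # example input stream
y_target = np.roll(u, 50)    # task: 50-step delayed copy (wraparound ignored)

x, states = np.zeros(N), np.zeros((T, N))
for t in range(T):
    x += dt/tau * (-x + np.tanh(W @ x + w_in * u[t]))
    states[t] = x

lam = 1e-3                                          # ridge regularizer
w_out = np.linalg.solve(states.T @ states + lam*np.eye(N), states.T @ y_target)
mse = np.mean((states @ w_out - y_target) ** 2)     # readout training error
```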

Computational significance of transient dynamics in cortical networks

It is argued that there are many situations in which the transient neural behaviour, while hopping between different attractor states or moving along ‘attractor ruins’, carries most of the computational and/or behavioural significance, rather than the attractor states eventually reached.

Very long transients, irregular firing, and chaotic dynamics in networks of randomly connected inhibitory integrate-and-fire neurons.

It is reported that chaotic dynamics characterized by positive Lyapunov exponents can also be observed in this type of network, and it is shown that chaos occurs when the decay time of the synaptic currents is long compared to the synaptic delay, provided that the network is sufficiently large.

Dynamical Constraints on Using Precise Spike Timing to Compute in Recurrent Cortical Networks

It is concluded that under conditions of sustained, Poisson-like, weakly correlated, low to moderate levels of internal activity as found in the cortex, it is unlikely that recurrent cortical networks can robustly generate precise spike trajectories, that is, spatiotemporal patterns of spikes precise to the millisecond timescale.

Dynamics of networks of randomly connected excitatory and inhibitory spiking neurons

N. Brunel, Journal of Physiology-Paris, 2000
...