Transition to chaos in random neuronal networks

@article{Kadmon2015TransitionTC,
  title={Transition to chaos in random neuronal networks},
  author={Jonathan Kadmon and Haim Sompolinsky},
  journal={arXiv: Disordered Systems and Neural Networks},
  year={2015}
}
Firing patterns in the central nervous system often exhibit strong temporal irregularity and heterogeneity in their time-averaged response properties. Previous studies suggested that these properties are the outcome of intrinsic chaotic dynamics. Indeed, simplified rate-based large neuronal networks with random synaptic connections are known to exhibit a sharp transition from fixed-point to chaotic dynamics when the synaptic gain is increased. However, the existence of a similar transition in…
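The fixed-point-to-chaos transition described in the abstract can be illustrated with a minimal sketch of the classic random rate network (the Sompolinsky–Crisanti–Sommers model), in which the gain parameter g sets the standard deviation of the random couplings. The network size, integration step, and gain values below are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def simulate_rate_network(g, N=200, T=200.0, dt=0.1, seed=0):
    """Euler-integrate dx_i/dt = -x_i + sum_j J_ij * tanh(x_j),
    with random couplings J_ij ~ N(0, g^2 / N)."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 0.1, size=N)  # small random initial condition
    for _ in range(int(T / dt)):
        x = x + dt * (-x + J @ np.tanh(x))
    return x

# Below the critical gain g = 1 the activity decays to the zero fixed
# point; above it, self-sustained chaotic fluctuations persist.
x_sub = simulate_rate_network(g=0.5)
x_super = simulate_rate_network(g=1.5)
print(np.std(x_sub), np.std(x_super))
```

Running the sketch, the sub-critical network's activity shrinks toward zero while the super-critical network retains order-one fluctuations, which is the sharp transition the abstract refers to.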
How single neuron properties shape chaotic dynamics and signal transmission in random neural networks
TLDR
A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons, characterized by robust, narrow-band stochastic oscillations.
Is chaos making a difference? Synchronization transitions in chaotic and nonchaotic neuronal networks
TLDR
Transition dynamics of a medium-sized heterogeneous neural network of neurons connected by electrical coupling in a small-world topology are investigated, and it is shown that chaotic nodes can promote what is known as multi-stable behavior, where the network dynamically switches between a number of different semi-synchronized, metastable states.
Transient Chaotic Dimensionality Expansion by Recurrent Networks
TLDR
It is shown that microscopic chaos rapidly expands the dimensionality of the representation while the number of dimensions corrupted by noise lags behind, which translates to a transient peak in the networks' classification performance even deeply in the chaotic regime, challenging the view that computational performance is always optimal near the edge of chaos.
Chaos and Correlated Avalanches in Excitatory Neural Networks with Synaptic Plasticity.
TLDR
This analysis reveals a mechanism for the generation of irregular avalanches that emerges from the combination of disorder and deterministic underlying chaotic dynamics.
Coherent chaos in a recurrent neural network with structured connectivity
TLDR
A simple model for coherent, spatially correlated chaos in a recurrent neural network is examined, including the effects of network-size scaling; in this regime the dynamics depend qualitatively on the particular realization of the connectivity matrix: a complex leading eigenvalue can yield coherent oscillatory chaos, while a real leading eigenvalue can yield chaos with broken symmetry.
Optimal Sequence Memory in Driven Random Networks
TLDR
This work investigates the effect of a time-varying input on the onset of chaos and the resulting consequences for information processing, finding an exact condition that determines the transition from stable to chaotic dynamics and the sequential memory capacity in closed form.
Workshop Program Recent advances in recurrent network theory: fluctuating correlated dynamics across scales
TLDR
The dynamical properties of a continuous-time dynamic model of a fully connected network of N nonlinear elements interacting via random asymmetric couplings, described by a self-consistent dynamic mean-field theory, are discussed.
Chaotic dynamics in spatially distributed neuronal networks generate population-wide shared variability
TLDR
Spatiotemporal chaos in cortical networks can explain the shared variability observed in neuronal population responses and can be induced by globally correlated noisy inputs.
Input correlations impede suppression of chaos and learning in balanced rate networks
TLDR
A non-stationary dynamic mean-field theory is developed that determines how the activity statistics and largest Lyapunov exponent depend on frequency and amplitude of the input, recurrent coupling strength, and network size, for both common and independent input.
Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks
TLDR
This work extends an iterative single-neuron simulation scheme to homogeneous networks with strong recurrent inhibition and a synaptic filter, in which instabilities of the previous scheme are avoided by an averaging procedure, and extends it to heterogeneous networks in which different neural subpopulations have different cellular or connectivity parameters.
...

References

SHOWING 1-10 OF 127 REFERENCES
Asynchronous Rate Chaos in Spiking Neuronal Circuits
TLDR
This work investigates the dynamics of networks of excitatory-inhibitory spiking neurons with random sparse connectivity operating in the regime of balance of excitation and inhibition and shows that chaotic, asynchronous firing rate fluctuations emerge generically for sufficiently strong synapses.
Single cell dynamics determine strength of chaos in collective network dynamics
TLDR
It is shown that increasing the action-potential onset rapidness of single neurons strongly reduces the intensity of chaos in balanced networks, and could even induce a transition to stable irregular dynamics.
Beyond the edge of chaos: amplification and temporal integration by recurrent networks in the chaotic regime.
TLDR
This study analytically evaluates how well a small external input can be reconstructed from a sparse linear readout of network activity, and shows that, near the edge, decoding performance is characterized by a critical exponent that takes a different value on the two sides.
Very long transients, irregular firing, and chaotic dynamics in networks of randomly connected inhibitory integrate-and-fire neurons.
TLDR
It is reported that in this type of network, chaotic dynamics characterized by positive Lyapunov exponents can also be observed; chaos occurs when the decay time of the synaptic currents is long compared to the synaptic delay, provided that the network is sufficiently large.
Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks
TLDR
It is shown that only near the critical boundary can recurrent networks of threshold gates perform complex computations on time series, which strongly supports conjectures that dynamical systems that are capable of doing complex computational tasks should operate near the edge of chaos.
Stable irregular dynamics in complex neural networks.
TLDR
The microscopic irregular dynamics in finite networks of arbitrary connectivity are analytically investigated, keeping track of all individual spike times, highlighting that chaotic and stable dynamics may be equally irregular.
Chaotic Balanced State in a Model of Cortical Circuits
TLDR
The chaotic nature of the balanced state of this network model is revealed by showing that the evolution of the microscopic state of the network is extremely sensitive to small deviations in its initial conditions.
Large Deviations, Dynamics and Phase Transitions in Large Stochastic and Disordered Neural Networks
TLDR
A dynamical systems approach is proposed in order to address the qualitative nature of the solutions of these very complex equations, and this methodology is applied to three instances to show how networks with non-centered coefficients, interaction delays, and multiple populations are affected by disorder levels.
Stimulus-dependent suppression of chaos in recurrent neural networks.
TLDR
It is found that inputs not only drive network responses, but they also actively suppress ongoing activity, ultimately leading to a phase transition in which chaos is completely eliminated.
Chaos and synchrony in a model of a hypercolumn in visual cortex
TLDR
The results show that the cooperative dynamics of large neuronal networks are capable of generating variability and synchrony similar to those observed in cortex.
...