Transition to chaos in random networks with cell-type-specific connectivity.

@article{Aljadeff2015TransitionTC,
  title={Transition to chaos in random networks with cell-type-specific connectivity},
  author={Johnatan Aljadeff and Merav Stern and Tatyana O. Sharpee},
  journal={Physical Review Letters},
  year={2015},
  volume={114},
  number={8},
  pages={088101}
}
In neural circuits, statistical connectivity rules strongly depend on cell-type identity. We study the dynamics of neural networks with cell-type-specific connectivity by extending the dynamic mean-field method, and find that these networks exhibit a phase transition between silent and chaotic activity. By analyzing the locus of this transition, we derive a new result in random matrix theory: the spectral radius of a random connectivity matrix with block-structured variances. We apply our results to…
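The random-matrix result stated in the abstract can be illustrated numerically. The sketch below uses a hypothetical two-type network with made-up fractions `alpha` and gain matrix `g`; the prediction that the spectral radius equals the square root of the largest eigenvalue of M with entries M[c, d] = alpha[d] * g[c, d]**2 is our reading of the paper's result, not a verbatim reproduction of it.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
# Hypothetical two-type example: population fractions and per-block gains.
alpha = np.array([0.5, 0.5])
g = np.array([[1.0, 0.5],
              [0.5, 1.2]])  # g[c, d]: gain for connections from type d to type c

sizes = (alpha * N).astype(int)
types = np.repeat(np.arange(len(alpha)), sizes)  # cell-type label of each neuron

# Block-structured variances: Var(J_ij) = g[type_i, type_j]**2 / N.
std = g[types[:, None], types[None, :]] / np.sqrt(N)
J = rng.normal(0.0, 1.0, (N, N)) * std

empirical_radius = np.abs(np.linalg.eigvals(J)).max()

# Predicted radius: sqrt of the largest eigenvalue of M[c, d] = alpha[d] * g[c, d]**2.
M = g**2 * alpha[None, :]
predicted_radius = np.sqrt(np.linalg.eigvals(M).real.max())
```

For N = 1000 the empirical spectral radius should agree with the block-structure prediction to within a few percent, with the gap shrinking as N grows.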

Citations

Transition to chaos in random neuronal networks
Firing patterns in the central nervous system often exhibit strong temporal irregularity and heterogeneity in their time-averaged response properties. Previous studies suggested that these properties…
Edge of Chaos and Avalanches in Neural Networks with Heavy-Tailed Synaptic Weight Distribution.
This work proposes an analytically tractable neural connectivity model with power-law distributed synaptic strengths that features a continuous transition to chaos and suggests that heavy-tailed synaptic distribution may form a weakly informative sparse-connectivity prior that can be useful in biological and artificial adaptive systems.
Consequences of Dale's law on the stability-complexity relationship of random neural networks.
This paper parametrizes the network structure according to Dale's law and uses the Kac-Rice formalism to compute the change in the number of equilibria when a phase transition occurs, revealing the effects of different heterogeneous network connectivities on brain state transitions and providing insights into pathological brain dynamics.
Optimal Sequence Memory in Driven Random Networks
This work investigates the effect of a time-varying input on the onset of chaos and the resulting consequences for information processing, finding an exact condition that determines the transition from stable to chaotic dynamics and the sequential memory capacity in closed form.
Correlations between synapses in pairs of neurons slow down dynamics in randomly connected neural networks.
This work investigates the effects of partially symmetric connectivity on the dynamics in networks of rate units, considering the two dynamical regimes exhibited by random neural networks: the weak-coupling regime, where the firing activity decays to a single fixed point unless the network is stimulated, and the strong-coupling or chaotic regime, characterized by internally generated fluctuating firing rates.
Transient Chaotic Dimensionality Expansion by Recurrent Networks
It is shown that microscopic chaos rapidly expands the dimensionality of the representation while the number of dimensions corrupted by noise lags behind, which translates to a transient peak in the networks' classification performance even deeply in the chaotic regime, challenging the view that computational performance is always optimal near the edge of chaos.
Bifurcations of a neural network model with symmetry
We analyze a family of clustered excitatory-inhibitory neural networks and the underlying bifurcation structures that arise because of permutation symmetries in the network as the global coupling
Symmetries Constrain Dynamics in a Family of Balanced Neural Networks
This work examines a family of random firing-rate neural networks in which each neuron makes either excitatory or inhibitory connections onto its post-synaptic targets and finds that this constrained system may be described as a perturbation from a system with nontrivial symmetries.
Linking structure and activity in nonlinear spiking networks
This work presents a new relationship between connectivity and activity in networks of nonlinear spiking neurons by developing a diagrammatic fluctuation expansion based on statistical field theory, and explicitly shows how recurrent network structure produces pairwise and higher-order correlated activity, and how nonlinearities impact the networks' spiking activity.
Dynamics of random recurrent networks with correlated low-rank structure
An analytic framework is developed to establish the precise effect of the correlations on the eigenvalue spectrum of the joint connectivity and elucidates how correlations allow structured and random connectivity to synergistically extend the range of computations available to networks.

References

Showing 1-10 of 67 references
Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks
It is shown that only near the critical boundary can recurrent networks of threshold gates perform complex computations on time series, which strongly supports conjectures that dynamical systems that are capable of doing complex computational tasks should operate near the edge of chaos.
Large Deviations, Dynamics and Phase Transitions in Large Stochastic and Disordered Neural Networks
A dynamical systems approach is proposed to address the qualitative nature of the solutions of these very complex equations, and the methodology is applied to three instances to show how non-centered coefficients, interaction delays, and multiple-population networks are affected by disorder levels.
Beyond the edge of chaos: amplification and temporal integration by recurrent networks in the chaotic regime.
This study analytically evaluates how well a small external input can be reconstructed from a sparse linear readout of network activity, and shows that, near the edge, decoding performance is characterized by a critical exponent that takes a different value on the two sides.
Suppressing chaos in neural networks by noise.
Using dynamical mean-field equations, the activity and the maximal Lyapunov exponent of the network are calculated as functions of a nonlinearity (gain) parameter and the noise intensity.
Chaos in random neural networks.
A self-consistent mean-field theory predicts a transition from a stationary phase to a chaotic phase occurring at a critical value of the gain parameter.
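The gain-driven transition summarized here can be sketched numerically. A minimal Euler integration of the classic rate model dx/dt = -x + g J tanh(x), with i.i.d. Gaussian couplings of variance 1/N (all parameter values below are illustrative, not taken from the paper): activity decays to the silent fixed point for subcritical gain and self-sustains past the critical value g = 1.

```python
import numpy as np

def simulate(g, N=200, steps=500, dt=0.1, seed=1):
    """Euler-integrate dx/dt = -x + g * J @ tanh(x) for a random coupling matrix J."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # i.i.d. couplings, Var = 1/N
    x = rng.normal(0.0, 1.0, N)                    # random initial condition
    for _ in range(steps):
        x = x + dt * (-x + g * J @ np.tanh(x))
    return x

quiet = simulate(g=0.5)    # subcritical gain: activity decays toward zero
chaotic = simulate(g=2.0)  # supercritical gain: self-sustained fluctuations
```

With these settings the subcritical run ends with near-zero activity, while the supercritical run retains order-one fluctuating rates.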
Eigenvalue spectra of random matrices for neural networks.
Eigenvalue spectra of large random matrices with excitatory and inhibitory columns drawn from distributions with different means and equal or different variances are computed.
Characteristics of Random Nets of Analog Neuron-Like Elements
The dynamic behavior of randomly connected analog neuron-like elements that process pulse-frequency-modulated signals is investigated from the macroscopic point of view, and it is shown that a stable oscillation exists in such a net, in contrast with the fact that no stable oscillations exist in a net of statistically symmetric structure.
Stimulus-dependent suppression of chaos in recurrent neural networks.
It is found that inputs not only drive network responses, but they also actively suppress ongoing activity, ultimately leading to a phase transition in which chaos is completely eliminated.
The effect of network topology on the stability of discrete state models of genetic control
A general method for determining the stability of large Boolean networks of any specified network topology and predicting their steady-state behavior in response to small perturbations is presented.
Eukaryotic cells are dynamically ordered or critical but not chaotic.
Using the Boolean approach, this work provides what the authors believe to be the first direct evidence that the underlying genetic network of HeLa cells operates either in the ordered regime or at the border between order and chaos, but does not appear to be chaotic.