Statistical mechanics of phase space partitioning in large-scale spiking neuron circuits
  • Maximilian Puelma Touzel, Fred Wolf
Synaptic interactions structure the phase space of the dynamics of neural circuits and constrain neural computation. Understanding how requires methods that can handle these discrete interactions, yet few such methods exist. Recently, it was discovered that even random networks exhibit dynamics that partition the phase space into numerous attractor basins. Here we utilize this phenomenon to develop theory for the geometry of phase space partitioning in spiking neural circuits. We find basin boundaries…


Asynchronous Rate Chaos in Spiking Neuronal Circuits
This work investigates the dynamics of networks of excitatory-inhibitory spiking neurons with random sparse connectivity operating in the balanced regime of excitation and inhibition, and shows that chaotic, asynchronous firing-rate fluctuations emerge generically for sufficiently strong synapses.
Chaotic Dynamics in Networks of Spiking Neurons in the Balanced State
It is demonstrated that neural networks in the balanced state generally exhibit chaotic dynamics, and a novel approach is introduced to thoroughly characterize neural network dynamics and quantify information preservation and erasure.
Transition to chaos in random neuronal networks
Firing patterns in the central nervous system often exhibit strong temporal irregularity and heterogeneity in their time-averaged response properties. Previous studies suggested that these properties…
Statistical structure of neural spiking under non-Poissonian or other non-white stimulation
The results provide a framework for interpreting firing statistics measured in vivo in the brain by deriving simple formulas for the essential interspike-interval statistics of a canonical model of a tonically firing neuron subjected to arbitrarily correlated input from the network.
Commentary on "Structured chaos shapes spike-response noise entropy in balanced neural networks," by Lajoie, Thivierge, and Shea-Brown
  • P. Thomas
  • Front. Comput. Neurosci., 2015
The authors formulate a computationally tractable upper bound on the spike-train noise entropy, building on Monteforte and Wolf (2010) and Lajoie et al. (2013), and show convincingly that the KS entropy of the spike trains is roughly an order of magnitude smaller than a naive estimate based on the single-cell noise entropy would suggest.
Balanced Networks of Spiking Neurons with Spatially Dependent Recurrent Connections
The analysis of balanced networks is extended to include the known dependence of connection probability on the spatial separation between neurons, and it is derived that stable, balanced firing rate solutions require that the spatial spread of external inputs be broader than that of recurrent excitation.
The Asynchronous State in Cortical Circuits
It is shown theoretically that recurrent neural networks can generate an asynchronous state characterized by arbitrarily low mean spiking correlations despite substantial amounts of shared input; the network generates negative correlations in synaptic currents that cancel the effect of shared input.
Synaptic scaling rule preserves excitatory–inhibitory balance and salient neuronal network dynamics
It is shown that synaptic strength scales with the number of connections K as ∼1/√K, close to the ideal theoretical value, suggesting that the synaptic scaling rule and the resultant dynamics are emergent properties of networks in general.
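The scaling rule summarized above has a simple rationale in balanced-state theory: if synaptic strength scales as 1/√K, the mean excitatory and inhibitory drives cancel while the residual input fluctuations stay O(1) regardless of K. A minimal numpy sketch of this cancellation (the Bernoulli input model, trial counts, and base strength are illustrative assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
stds = []
for K in [100, 400, 1600]:
    J = 1.0 / np.sqrt(K)  # balanced-state scaling with hypothetical base strength 1
    exc = rng.binomial(1, 0.5, size=(10_000, K))  # K excitatory inputs, 10k trials
    inh = rng.binomial(1, 0.5, size=(10_000, K))  # K inhibitory inputs, 10k trials
    net = J * (exc.sum(axis=1) - inh.sum(axis=1))  # E and I cancel on average
    stds.append(float(net.std()))
print([round(s, 2) for s in stds])  # fluctuation size stays roughly 0.71 for every K
```

The net input's standard deviation is approximately √(2K · 0.25) · (1/√K) = √0.5 ≈ 0.71 at every K, so the 1/√K rule keeps the neuron in the fluctuation-driven regime as connectivity grows.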
Noise dynamically suppresses chaos in neural networks
Noise is ubiquitous in neural systems due to intrinsic stochasticity or external drive. For deterministic dynamics, neural networks of randomly coupled units display a transition to chaos at a critical coupling strength…
Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons
It is shown that an unstructured, sparsely connected network of model spiking neurons can display two fundamentally different types of asynchronous activity that imply vastly different computational properties.