Asymptotic Description of Neural Networks with Correlated Synaptic Weights

@article{Faugeras2015AsymptoticDO,
  title={Asymptotic Description of Neural Networks with Correlated Synaptic Weights},
  author={Olivier D. Faugeras and James N. MacLaurin},
  journal={Entropy},
  year={2015},
  volume={17},
  pages={4701-4743}
}
We study the asymptotic law of a network of interacting neurons as the number of neurons becomes infinite. Given a completely connected network in which the synaptic weights are correlated Gaussian random variables, we describe the limiting law of the network. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network and the averaged law (with respect to the…
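The abstract describes a fully connected network with correlated Gaussian synaptic weights and the process-level empirical measure of its trajectories. The sketch below is a toy illustration of that setup, not the paper's model: the rate dynamics dV = (-V + J tanh(V)) dt + noise dW, the geometric covariance of the weights, and all parameter values are assumptions chosen for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_weights(n, sigma=1.0, rho=0.5):
    """Gaussian weight matrix in which entries J[i, k] and J[j, k] are
    correlated through the neuron distance |i - j| (one simple stationary
    covariance; the paper allows more general correlation structures)."""
    idx = np.arange(n)
    cov = (sigma**2 / n) * rho ** np.abs(idx[:, None] - idx[None, :])
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))  # small jitter for stability
    return L @ rng.standard_normal((n, n))

def simulate(n, steps=200, dt=0.05, noise=0.1):
    """Euler-Maruyama integration of dV = (-V + J tanh(V)) dt + noise dW."""
    J = correlated_weights(n)
    V = np.zeros((steps, n))
    for t in range(1, steps):
        drift = -V[t - 1] + J @ np.tanh(V[t - 1])
        V[t] = V[t - 1] + dt * drift + noise * np.sqrt(dt) * rng.standard_normal(n)
    return V

# Process-level empirical measure: put mass 1/n on each neuron's trajectory.
# Here we summarise it by the empirical mean path and the final-state law.
V = simulate(n=500)
mean_path = V.mean(axis=1)                        # empirical average trajectory
hist, edges = np.histogram(V[-1], bins=30, density=True)
```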
Stationary-State Statistics of a Binary Neural Network Model with Quenched Disorder
By applying the Fisher-Tippett-Gnedenko theorem, asymptotic expressions for the stationary-state statistics of multi-population networks in the large-network-size limit are derived in terms of the Gumbel (double-exponential) distribution.
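As a reminder of the extreme-value mechanism invoked here, the following sketch (standard extreme-value theory, illustrative only and not taken from the cited paper) checks numerically that the suitably rescaled maximum of n i.i.d. Gaussians approaches the Gumbel law exp(-exp(-x)). Convergence is slow for Gaussian maxima, so the agreement at moderate n is only approximate.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 5_000, 1_000

# Standard normalising sequences for the maximum of n standard Gaussians.
b_n = np.sqrt(2 * np.log(n)) - (np.log(np.log(n)) + np.log(4 * np.pi)) / (2 * np.sqrt(2 * np.log(n)))
a_n = 1.0 / np.sqrt(2 * np.log(n))

maxima = rng.standard_normal((trials, n)).max(axis=1)
rescaled = (maxima - b_n) / a_n

# Compare the empirical CDF with the Gumbel CDF exp(-exp(-x)).
for x in (-1.0, 0.0, 1.0, 2.0, 3.0):
    emp = (rescaled <= x).mean()
    print(f"x={x:+.1f}  empirical={emp:.3f}  Gumbel={np.exp(-np.exp(-x)):.3f}")
```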
Large Deviations of a Spatially-Stationary Network of Interacting Neurons
This work determines a process-level Large Deviation Principle (LDP) for a model of interacting neurons indexed by the lattice Z^d, thereby presenting itself as a generalisation of traditional mean-field models.
Mean field system of a two-layers neural model in a diffusive regime
We study a model of interacting neurons. The structure of this neural system is composed of two layers of neurons, such that the neurons of the first layer send their spikes to the neurons of the…
The Complexity of Dynamics in Small Neural Circuits
This work develops a novel systematic analysis of the dynamics of arbitrarily small networks composed of homogeneous populations of excitatory and inhibitory firing-rate neurons, and reveals qualitative and previously unexplored differences between mesoscopic cortical circuits and their mean-field approximation.
Large Deviations of a Network of Neurons with Dynamic Sparse Random Connections
This work synthesizes the theory of large-size limits of interacting particles with that of random graphs and matrices, and should be relevant to neuroscience and social-network theory in particular.
Pattern Storage, Bifurcations and Higher-Order Correlation Structure of an Exactly Solvable Asymmetric Neural Network Model
This article finds exact analytical solutions of an asymmetric spin-glass-like model of arbitrary size and performs a complete study of its dynamical and statistical properties, deriving analytical expressions for the conditional and stationary joint probability distributions of the membrane potentials and the firing rates.
Pattern Storage, Bifurcations, and Groupwise Correlation Structure of an Exactly Solvable Asymmetric Neural Network Model
By manipulating the conditional probability distribution of the firing rates, the associative learning rule previously introduced by Personnaz and coworkers is extended to stochastic networks, allowing the safe storage, in the presence of noise, of point and cyclic attractors, with useful implications for content-addressable memories.
Bifurcation Analysis of a Sparse Neural Network with Cubic Topology
It is proved that, unlike the fully-connected model, in the cubic network the neural activity may undergo spontaneous symmetry-breaking even if the network is composed exclusively of excitatory neurons, and the sparseness of the synaptic connections may increase the complexity of dynamics compared to dense networks.
Large Deviations of a Network of Interacting Particles with Sparse Random Connections
This work synthesizes the theory of large-size limits of interacting particles with that of random graphs and matrices, and should be relevant to neuroscience and social-network theory in particular.
…

References

Showing 1-10 of 65 references.
Large deviations and mean-field theory for asymmetric random recurrent neural networks
This paper provides rigorous mean-field results for a large class of neural networks that is currently investigated in the neural-network literature, and shows that many classical distributions on the couplings fulfill the general condition.
Random recurrent neural networks dynamics
This paper is a review dealing with the study of large-size random recurrent neural networks. The connection weights vary according to a probability law, and it is possible to predict…
Theory of correlations in stochastic neural networks. Ginzburg and Sompolinsky, Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics, 1994.
The theory of neuronal correlation functions in large networks comprising several highly connected subpopulations and obeying stochastic dynamic rules is developed, and extended to networks with random connectivity, such as randomly diluted networks.
Mean-field description and propagation of chaos in networks of Hodgkin-Huxley and FitzHugh-Nagumo neurons
It is proved that a propagation of chaos phenomenon takes place, namely that in the mean-field limit, any finite number of neurons become independent and, within each population, have the same probability distribution.
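A hedged numerical sketch of the propagation-of-chaos effect summarised here (the cubic drift, the coupling strength k, and all parameter values are illustrative assumptions, not the cited construction): in a mean-field network, the correlation between two tagged neurons, measured across independent realisations, should shrink toward zero as the network size N grows.

```python
import numpy as np

rng = np.random.default_rng(2)

def final_pair(N, trials=200, steps=200, dt=0.02, k=1.0, noise=0.5):
    """Terminal values (V_0, V_1) of two tagged neurons in the mean-field
    system dV_i = (V_i - V_i^3/3 + k (mean(V) - V_i)) dt + noise dW_i,
    collected over independent realisations."""
    out = np.empty((trials, 2))
    for s in range(trials):
        V = rng.standard_normal(N)
        for _ in range(steps):
            drift = V - V**3 / 3 + k * (V.mean() - V)
            V = V + dt * drift + noise * np.sqrt(dt) * rng.standard_normal(N)
        out[s] = V[:2]
    return out

# Propagation of chaos: corr(V_0, V_1) should decay toward 0 as N grows
# (up to sampling noise from the finite number of trials).
for N in (10, 100, 1000):
    pairs = final_pair(N)
    r = np.corrcoef(pairs[:, 0], pairs[:, 1])[0, 1]
    print(f"N={N:>5}  corr across trials = {r:+.3f}")
```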
Stochastic Neural Field Theory and the System-Size Expansion
This work analyzes a master equation formulation of stochastic neurodynamics for a network of synaptically coupled homogeneous neuronal populations, each consisting of N identical neurons, to derive the lowest-order corrections to the corresponding rate equations for large but finite N.
From neuron to neural networks dynamics
An overview of techniques and concepts from dynamical systems theory and statistical physics used for the analysis of dynamical neural network models is given.
Time structure of the activity in neural network models. Gerstner, Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics, 1995.
It is shown that the stationary state of noiseless systems is "almost always" unstable, and the theory allows an estimation of the errors introduced in firing-rate or "graded-response" models.
Systematic Fluctuation Expansion for Neural Network Activity Equations
This work describes how a stochastic theory of neural networks that includes statistics at all orders yields a systematic extension to population rate equations by introducing equations for correlations and appropriate coupling terms.
Field-theoretic approach to fluctuation effects in neural networks. M. Buice and J. Cowan, Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, 2007.
The effective spike model is constructed, which describes both neural fluctuations and response, and it is argued that neural activity governed by this model exhibits a dynamical phase transition in the universality class of directed percolation.
Analysis of nonlinear noisy integrate & fire neuron models: blow-up and steady states
The results show how critical the balance between noise and excitatory/inhibitory interactions is with respect to the connectivity parameter. Several aspects of the NNLIF model are analysed: the number of steady states, a priori estimates, blow-up issues, and convergence toward equilibrium in the linear case.
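To make the blow-up phenomenon mentioned above concrete, here is a hedged particle-level sketch (the drift term b times the population firing rate, the reset rule, and all parameter values are illustrative assumptions about a generic noisy integrate-and-fire network, not the NNLIF analysis itself): for weak excitatory connectivity the population rate settles down, while for strong connectivity the rate runs away, the finite-N analogue of blow-up.

```python
import numpy as np

rng = np.random.default_rng(3)

def population_rate(b, N=2000, steps=2000, dt=1e-3, sigma=1.0, VF=1.0, VR=0.0):
    """Noisy integrate-and-fire particles whose drift contains b times the
    instantaneous population firing rate; neurons reset from VF to VR."""
    V = rng.uniform(VR, VF, N) * 0.5
    r = 0.0
    rates = np.empty(steps)
    for t in range(steps):
        V += (-V + b * r) * dt + sigma * np.sqrt(dt) * rng.standard_normal(N)
        fired = V >= VF
        V[fired] = VR
        r = fired.mean() / dt            # instantaneous population rate
        rates[t] = r
    return rates

# Weak coupling: the rate equilibrates. Strong coupling: runaway activity,
# the finite-N counterpart of the blow-up discussed in the paper.
for b in (0.5, 3.0):
    rates = population_rate(b)
    print(f"b={b}: mean rate over the last 100 steps = {rates[-100:].mean():.1f}")
```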
…