From neuron to neural networks dynamics

@article{Cessac2006FromNT,
  title={From neuron to neural networks dynamics},
  author={Bruno Cessac and Manuel Samuelides},
  journal={The European Physical Journal Special Topics},
  year={2006},
  volume={142},
  pages={7--88}
}
Abstract. This paper presents an overview of some techniques and concepts from dynamical systems theory used in the analysis of dynamical neural network models. In the first section, we describe the dynamics of the neuron, starting from the Hodgkin-Huxley description, which is in some sense the canonical description of the “biological neuron”. We discuss some models reducing the Hodgkin-Huxley model to a two-dimensional dynamical system, keeping one of the main features of the neuron: its…
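The reduction of the four-dimensional Hodgkin-Huxley equations to a planar system mentioned in the abstract can be illustrated with the FitzHugh-Nagumo model, a standard two-dimensional reduction. This is a generic sketch with conventional textbook parameter values (`a`, `b`, `tau`, `I_ext`), not values taken from the paper:

```python
import numpy as np

def fitzhugh_nagumo(I_ext=0.5, a=0.7, b=0.8, tau=12.5, dt=0.01, steps=20000):
    """Integrate the FitzHugh-Nagumo equations with forward Euler.

    v: fast voltage-like variable; w: slow recovery variable.
        dv/dt = v - v^3/3 - w + I_ext
        dw/dt = (v + a - b*w) / tau
    """
    v, w = -1.0, -0.5
    vs = np.empty(steps)
    for t in range(steps):
        dv = v - v**3 / 3.0 - w + I_ext
        dw = (v + a - b * w) / tau
        v += dt * dv
        w += dt * dw
        vs[t] = v
    return vs

trace = fitzhugh_nagumo()
# With this input current the resting state is unstable and the system
# settles onto a limit cycle: v oscillates between spiking excursions.
print(trace.min(), trace.max())
```

With `I_ext = 0.5` the unique fixed point is unstable, so the trajectory converges to a relaxation oscillation, the planar analogue of repetitive spiking in the full Hodgkin-Huxley model.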

Random recurrent neural networks dynamics

Abstract. This paper is a review dealing with the study of large-size random recurrent neural networks. The connection weights vary according to a probability law, and it is possible to predict

On Dynamics of Integrate-and-Fire Neural Networks with Conductance Based Synapses

TLDR
A model is proposed where spikes are effective at times that are multiples of a characteristic time scale δ, where δ can be arbitrarily small (in particular, well beyond the numerical precision), providing a relevant characterization of the computational capabilities of the network.
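The idea of spikes pinned to multiples of a time scale δ can be sketched as a discrete-time leaky integrate-and-fire network whose update step is δ. This is a minimal illustration with hypothetical parameters, using current-based rather than conductance-based synapses for brevity, and is not the model of the cited paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50          # number of neurons (illustrative)
delta = 0.1     # characteristic time scale: spikes occur at multiples of delta
tau = 1.0       # membrane time constant
theta = 1.0     # firing threshold
gamma = np.exp(-delta / tau)                   # leak factor per step of size delta
W = rng.normal(0.0, 1.5 / np.sqrt(N), (N, N))  # random synaptic weights
I_ext = 1.2     # constant suprathreshold external input

V = rng.uniform(0.0, theta, N)  # initial membrane potentials
spike_times = []
for step in range(200):
    spikes = V >= theta                  # neurons at or above threshold fire
    V = np.where(spikes, 0.0, V)         # reset fired neurons
    # leak + recurrent synaptic input + external drive, all per step delta
    V = gamma * V + W @ spikes.astype(float) + (1 - gamma) * I_ext
    if spikes.any():
        spike_times.append(step * delta)  # spike times land on multiples of delta

print(f"{len(spike_times)} time steps with at least one spike")
```

By construction every spike time is an integer multiple of δ; shrinking δ refines the spike-timing grid without changing the form of the update.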

A Mathematical Analysis of the Effects of Hebbian Learning Rules on the Dynamics and Structure of Discrete-Time Random Recurrent Neural Networks

TLDR
It is shown that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0.1, and how neural networks may take advantage of this regime of high functional interest is discussed.
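The largest Lyapunov exponent referred to above can be estimated numerically by evolving a tangent vector alongside the trajectory and averaging its log-growth. The sketch below uses a generic discrete-time random recurrent network `x_{t+1} = tanh(g W x_t)` with an assumed gain parameter `g`, not the specific model of the cited paper:

```python
import numpy as np

def largest_lyapunov(g, N=100, steps=2000, seed=1):
    """Estimate the largest Lyapunov exponent of x_{t+1} = tanh(g * W @ x_t)
    by renormalizing a tangent vector at each step (power-iteration style)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
    x = rng.normal(0.0, 0.5, N)
    u = rng.normal(size=N)
    u /= np.linalg.norm(u)
    acc = 0.0
    for _ in range(steps):
        x = np.tanh(g * W @ x)
        # Jacobian-vector product: J = diag(1 - x_new^2) @ (g W)
        J_u = (1.0 - x**2) * (g * W @ u)
        norm = np.linalg.norm(J_u)
        acc += np.log(norm)
        u = J_u / norm
    return acc / steps

# Below the transition (small g) the exponent is negative; for large g it is
# typically positive, signalling chaos.
print(largest_lyapunov(0.5), largest_lyapunov(2.0))
```

Sweeping `g` and locating where the estimate crosses zero identifies the edge-of-chaos regime the TLDR describes.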

Are neuronal networks that vicious? Or only their models?

TLDR
The present study considers networks with constant input and without time-dependent plasticity, though the framework has been designed for both extensions; it introduces an order parameter, easy to compute numerically and closely related to a natural notion of entropy, providing a relevant characterization of the computational capabilities of the network.

Control analysis of the action potential and its propagation in the Hodgkin-Huxley model

TLDR
This study tested the feasibility of an MCA approach to analyse the Hodgkin-Huxley model, identifying all the discernible model processes of the neuronal system and developing a method to quantify them.

A large deviation principle for networks of rate neurons with correlated synaptic weights

TLDR
This work develops a model of neural networks with inhomogeneous weights between the neurons and analyzes its behavior as the number of neurons tends to infinity, suggesting that mean-field behavior is insufficient to characterize the behavior of a population.

Large deviations for randomly connected neural networks: I. Spatially extended systems

Abstract. In a series of two papers, we investigate the large deviations and asymptotic behavior of stochastic models of brain neural networks with random interaction coefficients. In this first

Asymptotic Description of Neural Networks with Correlated Synaptic Weights

TLDR
The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum.

A discrete time neural network model with spiking neurons

  • B. Cessac
  • Computer Science
    Journal of Mathematical Biology
  • 2008
TLDR
Though the dynamics of the membrane potential is generically periodic, it has a weak form of initial-condition sensitivity due to the presence of a sharp threshold in the model definition, and it exhibits a dynamical regime indistinguishable from chaos in numerical experiments.

References

Showing 1-10 of 213 references

Random recurrent neural networks dynamics

Abstract. This paper is a review dealing with the study of large-size random recurrent neural networks. The connection weights vary according to a probability law, and it is possible to predict

On the cognitive function of deterministic chaos in neural networks

TLDR
The authors show some properties of a chaotic net with respect to more classical models, such as the Rosenblatt perceptron, Hopfield net, and Boltzmann machine, and suggest a first step toward the construction of a learning procedure founded on chaotic dynamics.

Model neurons: from Hodgkin-Huxley to Hopfield

TLDR
The methods establish direct connections between highly simplified models of neuronal dynamics and more realistic descriptions in terms of time- and voltage-dependent conductances, and linear and nonlinear integrate-and-fire models.

A Mathematical Foundation for Statistical Neurodynamics

TLDR
This work first elucidates the stochastic structures of random nerve nets and derives macroscopic state equations, shown to hold in a weak sense, which apply to a wide range of ensembles of random nets.

Self-organization and dynamics reduction in recurrent networks: stimulus presentation and learning

A discrete time neural network model with spiking neurons

  • B. Cessac
  • Computer Science
    Journal of Mathematical Biology
  • 2008
TLDR
Though the dynamics of the membrane potential is generically periodic, it has a weak form of initial-condition sensitivity due to the presence of a sharp threshold in the model definition, and it exhibits a dynamical regime indistinguishable from chaos in numerical experiments.

On bifurcations and chaos in random neural networks

Chaos in the nervous system is a fascinating but controversial field of investigation. To approach the role of chaos in the real brain, we theoretically and numerically investigate the occurrence of

Control of the transition to chaos in neural networks with random connectivity

The occurrence of chaos in recurrent neural networks is supposed to depend on the architecture and on the synaptic coupling strength. It is studied here for a randomly diluted architecture. We

Spontaneous dynamics and associative learning in an asymmetric recurrent random neural network

TLDR
This paper uses a mean-field theoretical statement to determine the spontaneous dynamics of an asymmetric recurrent neural network and proposes a Hebb-like learning rule to store a pattern as a limit cycle or strange attractor.

How brains make chaos in order to make sense of the world

TLDR
A model to describe the neural dynamics responsible for odor recognition and discrimination is developed and it is hypothesized that chaotic behavior serves as the essential ground state for the neural perceptual apparatus and a mechanism for acquiring new forms of patterned activity corresponding to new learned odors is proposed.
...