Bruno Cessac
We provide rigorous and exact results characterizing the statistics of spike trains in a network of leaky Integrate-and-Fire neurons, where time is discrete and where neurons are subject to noise, without restriction on the synaptic weights. We show the existence and uniqueness of an invariant measure of Gibbs type and discuss its properties. We also …
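A discrete-time noisy leaky Integrate-and-Fire network of the kind described above can be sketched in a few lines of simulation. This is a minimal illustration; all parameter values and variable names here are our own choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 5, 200                          # neurons, time steps
gamma, theta, sigma = 0.9, 1.0, 0.3    # leak factor, firing threshold, noise amplitude
W = rng.normal(0.0, 0.5, (N, N))       # synaptic weights, no restriction on sign or size

V = np.zeros(N)                        # membrane potentials
raster = np.zeros((T, N), dtype=int)   # spike trains ("raster plot")

for t in range(T):
    spikes = (V >= theta).astype(int)  # a neuron fires when its potential crosses threshold
    raster[t] = spikes
    # fired neurons reset to 0; the others leak, then receive synaptic input and noise
    V = gamma * V * (1 - spikes) + W @ spikes + sigma * rng.normal(size=N)
```

The object of study in the abstract is then the probability distribution of the rows of `raster`, for which the authors establish a unique invariant measure of Gibbs type.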
We present a mathematical analysis of a network of Integrate-and-Fire neurons with adaptive conductances. Taking into account the realistic fact that spike times are only known within some finite precision, we propose a model where spikes are effective at times that are multiples of a characteristic time scale δ, where δ can be arbitrarily small (in particular, …
We derive rigorous results describing the asymptotic dynamics of a discrete-time model of spiking neurons introduced in Soula et al. (Neural Comput. 18, 1, 2006). Using symbolic dynamics techniques, we show how the membrane-potential dynamics is in one-to-one correspondence with sequences of spike patterns ("raster plots"). Moreover, though the dynamics …
We deal with the problem of bridging the gap between two scales in neuronal modeling. At the first (microscopic) scale, neurons are considered individually and their behavior described by stochastic differential equations that govern the time variations of their membrane potentials. They are coupled by synaptic connections acting on their resulting …
It is assumed that complex perceptual or sensorimotor tasks are the result of neural network dynamics and are expressed by spike trains containing the neural code. Hence, in this context two main questions are: (i) how to characterize the statistical properties of the spike trains produced by neuronal networks, and (ii) what are the effects of …
We present a method to estimate Gibbs distributions with spatio-temporal constraints on spike train statistics. We apply this method to spike trains recorded from ganglion cells of the salamander retina, in response to natural movies. Our analysis, restricted to a few neurons, performs more accurately than pairwise synchronization models (Ising) or the …
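The starting ingredient of such an estimation is the table of empirical frequencies of spatio-temporal spike blocks, against which the Gibbs model's predictions are matched. A minimal sketch of that counting step (the function and variable names are ours, not the authors' code, and the raster is a surrogate, not retinal data):

```python
import numpy as np
from collections import Counter

def block_frequencies(raster, depth):
    """Empirical frequencies of blocks of `depth` consecutive spike
    patterns in a (time x neurons) binary raster."""
    T = raster.shape[0]
    # hash each spatio-temporal block by its raw bytes, then count occurrences
    counts = Counter(raster[t:t + depth].tobytes() for t in range(T - depth + 1))
    total = sum(counts.values())
    return {block: n / total for block, n in counts.items()}

rng = np.random.default_rng(1)
raster = (rng.random((1000, 3)) < 0.2).astype(np.uint8)  # surrogate spike trains
freqs = block_frequencies(raster, depth=2)               # depth-2 (one-step memory) blocks
```

A maximum-entropy (Gibbs) distribution is then fitted so that its predicted block probabilities reproduce these empirical frequencies; pairwise Ising models are the special case `depth=1` with only first- and second-order spatial constraints.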
We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule, including passive forgetting and different timescales, for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through …
We consider a conductance-based neural network inspired by the generalized Integrate-and-Fire model introduced by Rudolph and Destexhe in 1996. We show the existence and uniqueness of a Gibbs distribution characterizing spike train statistics. The corresponding Gibbs potential is explicitly computed. These results hold in the presence of a …
We apply the linear response theory developed by Ruelle [J. Stat. Phys. 95, 393 (1999)] to analyze how a periodic signal of weak amplitude, superimposed upon a chaotic background, is transmitted in a network of nonlinearly interacting units. We numerically compute the complex susceptibility and show the existence of specific poles (stable resonances) …
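Numerically, this kind of analysis amounts to driving a chaotic system with a weak periodic signal and reading off the response at the driving frequency in the Fourier spectrum of a trial-averaged observable. A toy single-map version (the paper studies a network of interacting units; the map, parameters, and averaging scheme here are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
T, M = 4096, 256                 # time steps, trials (averaging suppresses the chaotic background)
eps, k0 = 0.01, 32               # weak signal amplitude, driving-frequency Fourier bin
f0 = k0 / T                      # driving frequency, aligned with an FFT bin

x = rng.random(M)                # ensemble of random initial conditions
avg = np.empty(T)
for t in range(T):
    # fully chaotic logistic map, weakly driven; wrap to keep the state in [0, 1)
    x = (4.0 * x * (1.0 - x) + eps * np.sin(2 * np.pi * f0 * t)) % 1.0
    avg[t] = x.mean()            # trial average of the observable

spec = np.abs(np.fft.rfft(avg - avg.mean()))
```

The magnitude of `spec` at bin `k0`, relative to neighbouring bins, gives a rough estimate of the modulus of the susceptibility at `f0`; the paper's analysis goes further, locating the poles (resonances) of the complex susceptibility.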
We investigate the effect of electric synapses (gap junctions) on collective neuronal dynamics and spike statistics in a conductance-based Integrate-and-Fire neural network, driven by a Brownian noise, where conductances depend upon spike history. We compute explicitly the time evolution operator and show that, given the spike history of the network and the …