We deal with the problem of bridging the gap between two scales in neuronal modeling. At the first (microscopic) scale, neurons are considered individually and their behavior is described by stochastic differential equations that govern the time variations of their membrane potentials. They are coupled by synaptic connections acting on their resulting …
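As an illustration of what such a microscopic description typically looks like (a generic form, not necessarily the authors' exact equations), the membrane potential V_i of neuron i may obey a stochastic differential equation of the type

dV_i(t) = \Big( -\frac{V_i(t)}{\tau_i} + \sum_{j} J_{ij}\, S_j(t) + I_i^{\mathrm{ext}}(t) \Big)\, dt + \sigma_i\, dW_i(t),

where \tau_i is a membrane time constant, J_{ij} are synaptic weights, S_j(t) is the presynaptic activity, I_i^{\mathrm{ext}} an external input and the W_i are independent Brownian motions.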
We present a mathematical analysis of networks of integrate-and-fire (IF) neurons with conductance-based synapses. Taking into account the realistic fact that the spike time is only known within some finite precision, we propose a model where spikes are effective at times that are multiples of a characteristic time scale delta, where delta can be arbitrarily small …
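A minimal Python sketch of the idea, assuming a toy conductance-based integrate-and-fire unit (parameter names and values are placeholders, not taken from the paper): spike times are only reported as multiples of the resolution delta.

def simulate_if_delta(T=0.2, dt=1e-4, delta=1e-3, tau=0.02,
                      g_syn=100.0, e_rev=2.0, v_th=1.0, v_reset=0.0):
    """Toy conductance-based IF neuron; spike times are only known up to
    the precision delta (all parameters are hypothetical)."""
    v, spikes = v_reset, []
    for k in range(int(T / dt)):
        # leak term plus a constant synaptic conductance pulling V toward e_rev
        v += (-v / tau + g_syn * (e_rev - v)) * dt
        if v >= v_th:
            t = k * dt
            spikes.append(round(t / delta) * delta)  # spike aligned on the delta grid
            v = v_reset
    return spikes

print(simulate_if_delta()[:3])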
The aim of the present paper is to study the effects of Hebbian learning in random recurrent neural networks with biological connectivity, i.e. sparse connections and separate populations of excitatory and inhibitory neurons. We furthermore consider that the neuron dynamics may occur on a (shorter) time scale than synaptic plasticity and consider the …
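A toy rate-based sketch of this setting (not the model analyzed in the paper; all names and values are illustrative): a sparse random network with separate excitatory and inhibitory populations, fast neuron dynamics, and a slow Hebbian weight update.

import numpy as np

rng = np.random.default_rng(0)
n, n_exc, p = 100, 80, 0.1                        # network size, # excitatory, sparse connectivity
epsilon = 1e-3                                     # learning rate << 1: plasticity slower than dynamics

signs = np.where(np.arange(n) < n_exc, 1.0, -1.0)  # excitatory (+) / inhibitory (-) presynaptic sign
mask = (rng.random((n, n)) < p).astype(float)      # sparse random connectivity
np.fill_diagonal(mask, 0.0)
W = mask * rng.random((n, n)) * signs[None, :]     # W[i, j]: synapse from j to i

x = rng.standard_normal(n)
for t in range(1000):
    x = np.tanh(W @ x)                             # fast neuron dynamics (toy rate units)
    # slow Hebbian update, restricted to existing synapses and respecting E/I signs
    W += epsilon * np.outer(x, x) * mask * signs[None, :]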
It is assumed that complex perceptual or sensorimotor tasks are the result of neural network dynamics and are expressed by spike trains containing the neural code. Hence, in this context, two main questions are (i) how to characterize the statistical properties of the spike trains produced by neuronal networks and (ii) what are the effects of …
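For question (i), one elementary way to characterize spike-train statistics is to estimate the empirical probabilities of spatio-temporal spike patterns ("blocks") from a binary raster. The sketch below only illustrates that idea on synthetic data; it is not the method of the paper.

import numpy as np
from collections import Counter

rng = np.random.default_rng(1)
raster = (rng.random((5, 10000)) < 0.05).astype(int)  # 5 neurons x 10000 time bins (toy data)

def block_probabilities(raster, depth=2):
    """Empirical probabilities of spike patterns spanning `depth` consecutive
    time bins (toy estimator, illustrative only)."""
    n, T = raster.shape
    counts = Counter()
    for t in range(T - depth + 1):
        counts[tuple(raster[:, t:t + depth].flatten())] += 1
    total = sum(counts.values())
    return {block: c / total for block, c in counts.items()}

probs = block_probabilities(raster, depth=2)
print(len(probs), max(probs.values()))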
We discuss the ability of a network model with nonlinear units and chaotic dynamics to transmit signals, on the basis of a linear response theory developed by Ruelle [D. Ruelle, J. Stat. Phys. 95, 393 (1999)] for dissipative systems. We discuss in particular how the dynamics may interfere with the graph topology to produce an effective transmission …
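Schematically (and only as a reminder of the cited theory, in generic notation), Ruelle's linear response gives the first-order change of the average of an observable A under a small perturbation X of a dissipative map f with SRB measure \rho as

\delta_X \rho(A) \;=\; \sum_{n \ge 0} \int \rho(dx)\; X(x) \cdot \nabla_x\!\big(A \circ f^{\,n}\big)(x),

whose Fourier transform defines a frequency-dependent susceptibility; it is through such a susceptibility that the dynamics and the graph topology can combine into an effective transmission.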
We consider a conductance-based neural network inspired by the generalized Integrate-and-Fire model introduced by Rudolph and Destexhe in 1996. We show the existence and uniqueness of a Gibbs distribution characterizing spike train statistics. The corresponding Gibbs potential is explicitly computed. These results hold in the presence of a …
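Schematically (generic notation, not necessarily that of the paper), a Gibbs distribution \mu for spike trains can be characterized through its conditional probabilities: writing \omega(n) for the spiking pattern of the network at time n and \omega_{-\infty}^{\,n-1} for its past,

\mu\big[\,\omega(n) \,\big|\, \omega_{-\infty}^{\,n-1}\,\big] \;=\; e^{\phi\left(n,\, \omega_{-\infty}^{\,n}\right)},

where \phi is a normalized Gibbs potential depending on the spike history and on the model parameters.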
Although the spike trains in neural networks are mainly constrained by the neural dynamics itself, global temporal constraints (refractoriness, time precision, propagation delays, …) also have to be taken into account. These constraints are revisited in this paper in order to use them in event-based simulation paradigms. We first review these constraints, …
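A minimal event-driven sketch of how such constraints can be enforced (names, delays and the single-target connectivity are illustrative placeholders, not the paper's framework): events sit in a priority queue ordered by time, spike times are aligned on a finite-precision grid, refractoriness discards too-early events, and each emitted spike schedules a delayed event at its target.

import heapq

def event_driven_run(initial_events, t_max=0.05, delay=0.002,
                     refractory=0.001, precision=1e-4, n_neurons=3):
    """Toy event-driven loop with refractoriness, finite time precision and
    propagation delays (illustrative placeholders only)."""
    queue = [(round(t / precision) * precision, src) for t, src in initial_events]
    heapq.heapify(queue)
    last_spike = {}                        # last emission time of each neuron
    emitted = []
    while queue:
        t, neuron = heapq.heappop(queue)
        if t > t_max:
            break                          # stop the toy simulation
        if t - last_spike.get(neuron, -1.0) < refractory:
            continue                       # event falls inside the refractory period
        last_spike[neuron] = t
        emitted.append((t, neuron))
        target = (neuron + 1) % n_neurons  # single toy target per neuron
        arrival = round((t + delay) / precision) * precision
        heapq.heappush(queue, (arrival, target))
    return emitted

print(event_driven_run([(0.0, 0)])[:5])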