Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons

Lars Buesing, Johannes Bill, Bernhard Nessler, Wolfgang Maass. PLoS Computational Biology.
The organization of computations in networks of spiking neurons in the brain is still largely unknown, in particular in view of the inherently stochastic features of their firing activity and the experimentally observed trial-to-trial variability of neural responses. In principle there exists a powerful computational framework for stochastic computations, probabilistic inference by sampling, which can explain a large number of macroscopic experimental data in neuroscience and…


Probabilistic Inference in General Graphical Models through Sampling in Stochastic Networks of Spiking Neurons

Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are shown to be necessary ingredients of the underlying computational organization, and the approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models.

Ensembles of Spiking Neurons with Noise Support Optimal Probabilistic Inference in a Dynamically Changing Environment

The viability of this new approach towards neural coding and computation, which makes use of the inherent parallelism of generic neural circuits, is demonstrated by showing that this model can explain experimentally observed firing activity of cortical neurons for a variety of tasks that require rapid temporal integration of sensory information.

Oscillatory background activity implements a backbone for sampling-based computations in spiking neural networks

This work shows that both in current-based and conductance-based neuron models, the level of background activity effectively defines the sampling temperature of the network, and demonstrates that background oscillations can structure stochastic computations into discrete sampling episodes.
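The temperature analogy can be made concrete with a plain Boltzmann distribution (a minimal illustration only; the paper's analysis concerns current- and conductance-based neuron models, not this abstract form): raising the temperature T flattens the distribution over states, just as stronger background activity is reported to flatten the effectively sampled distribution.

```python
import math

def boltzmann(energies, T):
    """Boltzmann distribution p(s) ∝ exp(-E(s)/T); higher T flattens it."""
    ws = [math.exp(-e / T) for e in energies]
    Z = sum(ws)
    return [w / Z for w in ws]

E = [0.0, 1.0, 2.0]            # energies of three hypothetical states
cold = boltzmann(E, T=0.5)     # low temperature: mass on the best state
hot = boltzmann(E, T=5.0)      # high temperature: close to uniform
```

At high T the probabilities approach uniform, while at low T mass concentrates on the lowest-energy state; in the paper's framing, the level of background activity plays the role of T.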

Computing with noise in spiking neural networks

This work provides a novel analytical description of the neural response function with an unprecedented range of validity and implements interconnected sampling networks which exploit their own activity as a noise resource to maintain a stochastic firing regime.

Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition

This study shows how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model, which permits a rigorous analytical treatment of experience-dependent plasticity on the network level.

Natural gradient enables fast sampling in spiking neural networks

This work shows that two classes of spiking samplers—efficient balanced spiking networks that simulate Langevin sampling, and networks with probabilistic spike rules that implement Metropolis-Hastings sampling—can be unified within a common framework, and suggests design principles for algorithms for sampling-based probabilistic inference in spiking neural networks.
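As a minimal illustration of the second class, here is a generic Metropolis-Hastings sampler over binary unit states of a small Boltzmann-machine-style model (the coupling matrix W and biases b are invented for the example, and a "spike" is caricatured as a unit flipping into the active state; the paper's networks are spiking samplers, not this abstract binary chain):

```python
import math
import random

def energy(z, W, b):
    """Energy of binary state z under a Boltzmann-machine-style model."""
    e = -sum(b[i] * z[i] for i in range(len(z)))
    for i in range(len(z)):
        for j in range(i + 1, len(z)):
            e -= W[i][j] * z[i] * z[j]
    return e

def mh_step(z, W, b, rng):
    """One Metropolis-Hastings step: propose flipping one random unit."""
    i = rng.randrange(len(z))
    proposal = list(z)
    proposal[i] = 1 - proposal[i]
    delta = energy(proposal, W, b) - energy(z, W, b)
    # Accept with probability min(1, exp(-delta)).
    if delta <= 0 or rng.random() < math.exp(-delta):
        return proposal
    return z

rng = random.Random(0)
W = [[0.0, 1.5], [0.0, 0.0]]   # positive (excitatory) coupling, invented
b = [-0.5, -0.5]               # biases, invented
z = [0, 0]
samples = []
for _ in range(5000):
    z = mh_step(z, W, b, rng)
    samples.append(tuple(z))
```

With positive coupling the chain should visit the jointly active state (1, 1) more often than either singly active state, reflecting the correlation encoded in W.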

Causal Inference and Explaining Away in a Spiking Network

It is demonstrated that a family of high-dimensional quadratic optimization problems with non-negativity constraints can be solved exactly and efficiently by a network of spiking neurons.

Stochasticity from function - why the Bayesian brain may need no noise

Neurons as Monte Carlo Samplers: Bayesian Inference and Learning in Spiking Networks

The proposed spiking network model provides a functional explanation for the Poisson-like noise commonly observed in cortical responses and shows how such a neuronal network with synaptic plasticity can implement a form of Bayesian inference similar to Monte Carlo methods such as particle filtering.
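For readers unfamiliar with particle filtering, a generic bootstrap particle filter on a 1-D Gaussian random walk looks like the sketch below (a textbook illustration with invented noise parameters and observations, not the paper's neuronal implementation):

```python
import math
import random

def bootstrap_particle_filter(obs, n=1000, sigma_x=1.0, sigma_y=0.5, rng=None):
    """Bootstrap particle filter for a 1-D Gaussian random walk.

    State model:       x_t = x_{t-1} + N(0, sigma_x^2)
    Observation model: y_t = x_t + N(0, sigma_y^2)
    Returns the posterior-mean estimate of x_t at each step.
    """
    rng = rng or random.Random(0)
    particles = [0.0] * n
    estimates = []
    for y in obs:
        # Propagate: sample each particle from the transition prior.
        particles = [x + rng.gauss(0.0, sigma_x) for x in particles]
        # Weight each particle by the observation likelihood.
        weights = [math.exp(-(y - x) ** 2 / (2 * sigma_y ** 2)) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Posterior-mean estimate before resampling.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling to equalize the weights.
        particles = rng.choices(particles, weights=weights, k=n)
    return estimates

# Noisy observations of a latent state drifting from 0 toward 5 (invented).
obs = [0.2, 1.1, 1.9, 3.2, 4.1, 5.0]
est = bootstrap_particle_filter(obs)
```

The weighted particle cloud is the Monte Carlo representation of the posterior; the paper's claim is that a spiking network with synaptic plasticity can carry out an analogous propagate-weight-resample computation.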

A theoretical basis for efficient computations with noisy spiking neurons

A new theoretical framework for organizing computations of networks of spiking neurons is presented and it is shown that a suitable design enables them to solve hard constraint satisfaction problems from the domains of planning - optimization and verification - logical inference.

Belief Propagation in Networks of Spiking Neurons

This work shows in detailed simulations how the belief propagation algorithm on a factor graph can be embedded in a network of spiking neurons, and demonstrates good agreement between the performance of the networks and the direct numerical evaluation of belief propagation.
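The algorithm being embedded is the standard sum-product; on a tree-structured factor graph it computes exact marginals, which is what makes the comparison against direct numerical evaluation meaningful. A minimal two-variable example (potentials chosen arbitrarily for illustration; the paper's contribution is the spiking embedding, not this vanilla form):

```python
import itertools

# Sum-product on a tiny chain factor graph: x0 -- f01 -- x1,
# with unary potentials g0, g1 and pairwise potential f01[x0][x1].
g0 = [0.6, 0.4]
g1 = [0.3, 0.7]
f01 = [[0.9, 0.1],
       [0.2, 0.8]]

# Message into x1: sum over x0 of g0(x0) * f01(x0, x1), and symmetrically.
m_to_x1 = [sum(g0[a] * f01[a][b] for a in range(2)) for b in range(2)]
m_to_x0 = [sum(g1[b] * f01[a][b] for b in range(2)) for a in range(2)]

def normalize(p):
    s = sum(p)
    return [v / s for v in p]

belief_x0 = normalize([g0[a] * m_to_x0[a] for a in range(2)])
belief_x1 = normalize([g1[b] * m_to_x1[b] for b in range(2)])

# Brute-force marginal of x1 for comparison; on a tree, BP is exact.
joint = {(a, b): g0[a] * g1[b] * f01[a][b]
         for a, b in itertools.product(range(2), repeat=2)}
marg_x1 = normalize([sum(v for (a, b), v in joint.items() if b == t)
                     for t in range(2)])
```

On graphs with loops the same message updates are only approximate, which is part of why sampling-based alternatives (as in the main paper) are of interest.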

Probabilistic Computation in Spiking Populations

This work suggests a model based on standard neural architecture and activations for spiking neurons, and applies it to a sensorimotor integration task that provides a particularly compelling example of dynamic probabilistic computation.

Bayesian Spiking Neurons I: Inference

The dynamics of spiking neurons can be interpreted as a form of Bayesian inference in time, and firing statistics are close to Poisson, albeit providing a deterministic representation of probabilities.

Spatio-temporal correlations and visual signalling in a complete neuronal population

The functional significance of correlated firing in a complete population of macaque parasol retinal ganglion cells is analysed using a model of multi-neuron spike responses; this model-based approach reveals the role of correlated activity in the retinal coding of visual stimuli and provides a general framework for understanding the importance of correlated activity in populations of neurons.

Cortical Circuitry Implementing Graphical Models

Following previous work, which proposed relations between graphical models and the large-scale cortical anatomy, this work focuses on the cortical microcircuitry and proposes how anatomical and physiological aspects of the local circuitry may map onto elements of the graphical model implementation.

Stochastic Ion Channel Gating in Dendritic Neurons: Morphology Dependence and Probabilistic Synaptic Activation of Dendritic Spikes

The simulations suggest that a direct consequence of stochastic gating of intrinsic ion channels is that spike output may instead be a probabilistic function of patterns of synaptic input to dendrites.

The highly irregular firing of cortical cells is inconsistent with temporal integration of random EPSPs

W. Softky, C. Koch. The Journal of Neuroscience, 1993.
It is argued that neurons acting as temporal integrators over many synaptic inputs must fire very regularly; only in the presence of either fast, strong dendritic nonlinearities or strong synchronization among individual synaptic events will the predicted variability approach that of real cortical neurons.

Interpreting Neural Response Variability as Monte Carlo Sampling of the Posterior

This paper proposes an alternative view in which the variability of cortical sensory neurons is related to the uncertainty, about world parameters, which is inherent in the sensory stimulus, and provides simulations suggesting how some aspects of response variability might be understood in this framework.

Bayesian Brain: Probabilistic Approaches to Neural Coding

Bayesian Brain brings together contributions from both experimental and theoretical neuroscientists that examine the brain mechanisms of perception, decision making, and motor control according to the concepts of Bayesian estimation.