Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity

@article{Nessler2013BayesianCE,
  title={Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity},
  author={Bernhard Nessler and Michael Pfeiffer and Lars Buesing and W. Maass},
  journal={PLoS Computational Biology},
  year={2013},
  volume={9}
}
The principles by which networks of neurons compute, and how spike-timing-dependent plasticity (STDP) of synaptic weights generates and maintains their computational function, are unknown. Preceding work has shown that soft winner-take-all (WTA) circuits, in which pyramidal neurons inhibit each other via interneurons, are a common motif of cortical microcircuits. We show through theoretical analysis and computer simulations that Bayesian computation is induced in these network motifs through STDP…
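The mechanism sketched in the abstract, STDP in a soft WTA circuit acting as online expectation maximization, can be illustrated with a minimal discretized sketch. This is not the paper's exact spiking model: the time-binned soft-WTA choice, the exponential weight dependence of potentiation, and all names (`wta_step`, `stdp_update`) are simplifying assumptions in the spirit of a Nessler-style learning rule, with spike timing reduced to "presynaptic neuron active in the window before the postsynaptic spike".

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 20, 4   # input neurons, WTA output neurons
eta = 0.05            # learning rate
w = rng.normal(-1.0, 0.1, size=(n_out, n_in))  # log-probability-like weights

def wta_step(y, w):
    """One soft-WTA competition: exactly one output neuron spikes,
    chosen with probability softmax of the membrane potentials."""
    u = w @ y
    p = np.exp(u - u.max())
    p /= p.sum()
    return rng.choice(n_out, p=p)

def stdp_update(w, y, k, eta):
    """STDP for the winning neuron k: potentiate (by exp(-w) - 1)
    synapses whose presynaptic neuron fired just before the
    postsynaptic spike, depress all other synapses of k."""
    dw = np.where(y > 0, np.exp(-w[k]) - 1.0, -1.0)
    w[k] += eta * dw
    return w

# Toy input: two hidden "causes", each making one half of the inputs
# fire with high probability (0.9) and the other half rarely (0.1).
for _ in range(2000):
    cause = rng.integers(2)
    active = (np.arange(n_in) < n_in // 2) == (cause == 0)
    y = (rng.random(n_in) < np.where(active, 0.9, 0.1)).astype(float)
    k = wta_step(y, w)
    w = stdp_update(w, y, k, eta)
```

Under this rule the weights of a specializing output neuron drift toward the log-probability that each input fires given that neuron's preferred "cause", which is the sense in which the circuit implicitly performs Bayesian computation.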
Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition
TL;DR: This study shows how the spiking dynamics of a recurrent network with lateral excitation and local inhibition, responding to distributed spiking input, can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model, which permits a rigorous analytical treatment of experience-dependent plasticity at the network level.
Inhibitory networks orchestrate the self-organization of computational function in cortical microcircuit motifs through STDP
TL;DR: The interaction of pyramidal cells (PCs) with two types of inhibitory networks, which reflect salient properties of somatic-targeting and dendritic-targeting neurons, provides a good approximation to the theoretically optimal lateral inhibition needed for the self-organization of these network motifs.
A probabilistic model for learning in cortical microcircuit motifs with data-based divisive inhibition
TL;DR: This theoretical analysis corroborates a preceding modelling study which suggested that the learning dynamics of this layer 2/3 microcircuit motif extract a specific modular representation of the input and thus perform blind source separation on the input statistics.
Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity
TL;DR: It is shown that spike-timing-dependent plasticity, in combination with intrinsic plasticity, generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for probabilistic inference: probabilistic associations between neurons that represent, through their firing, current values of random variables.
Spike-Based Bayesian-Hebbian Learning of Temporal Sequences
TL;DR: A modular attractor memory network is proposed in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule, showing that distributed memories can be formed through plasticity.
Feedback Inhibition Shapes Emergent Computational Properties of Cortical Microcircuit Motifs
TL;DR: It is proposed that spike-timing-dependent plasticity enables this microcircuit motif to perform a fundamental computational operation on neural activity patterns. Simulations of this model predict that sparse assembly codes emerge in the motif under spike-timing-dependent plasticity, and show that different assemblies will represent different hidden sources of upstream firing activity.
Emergence of Dynamic Memory Traces in Cortical Microcircuit Models through STDP
TL;DR: It is shown here that stimulus-specific assemblies of neurons emerge automatically through spike-timing-dependent plasticity (STDP) in a simple cortical microcircuit model, and that the emergent assembly codes add an important computational capability to standard models of online computation in cortical microcircuits: the capability to integrate information from long-term memory with information from novel spike inputs.
Synaptic and nonsynaptic plasticity approximating probabilistic inference
TL;DR: The model provides a biophysical realization of Bayesian computation by reconciling, in concert, several observed neural phenomena whose functional effects are only partially understood, and supports the view that neurons can represent information in the form of probability distributions.
Hierarchical Bayesian Inference and Learning in Spiking Neural Networks
TL;DR: A hierarchical network of winner-take-all circuits that can carry out hierarchical Bayesian inference and learning through a spike-based variational expectation-maximization (EM) algorithm is proposed, and the utility of this spiking neural network is demonstrated on the MNIST benchmark for unsupervised classification of handwritten digits.
Dendritic error backpropagation in deep cortical microcircuits
TL;DR: A multi-area neuronal network model is introduced in which synaptic plasticity continuously adapts the network towards a global desired output; it approximates the classical error backpropagation algorithm and suggests a biological implementation of deep learning.

References

Showing 1–10 of 165 references
STDP enables spiking neurons to detect hidden causes of their inputs
TL;DR: It is shown here that STDP, in conjunction with a stochastic soft winner-take-all (WTA) circuit, induces spiking neurons to generate through their synaptic weights implicit internal models for subclasses (or "causes") of the high-dimensional spike patterns of hundreds of presynaptic neurons.
Homeostatic plasticity in Bayesian spiking networks as Expectation Maximization with posterior constraints
TL;DR: Homeostatic plasticity can be understood as the enforcement of a 'balancing' posterior constraint during probabilistic inference and learning with Expectation Maximization, and the theory provides a novel perspective on the interplay of homeostatic processes and synaptic plasticity in cortical microcircuits.
Hierarchical Bayesian Inference in Networks of Spiking Neurons
TL;DR: It is shown that recurrent networks of noisy integrate-and-fire neurons can perform approximate Bayesian inference for dynamic and hierarchical graphical models.
Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons
TL;DR: A neural network model is proposed, and it is shown by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, in both discrete and continuous time.
Probabilistic Inference in General Graphical Models through Sampling in Stochastic Networks of Spiking Neurons
TL;DR: Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization, and the approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models.
Computation with Spikes in a Winner-Take-All Network
TL;DR: This work extends previous theoretical results showing that a WTA recurrent network receiving regular spike inputs can select the correct winner within one interspike interval, and uses a simplified Markov model of the spiking network to examine analytically the ability of a spike-based WTA network to discriminate the statistics of inputs ranging from stationary regular to nonstationary Poisson events.
Spike timing-dependent plasticity as dynamic filter
TL;DR: A minimal model formulated in terms of differential equations predicts synaptic strengthening for synchronous rate modulations in STDP, and provides a general framework for investigating the joint dynamics of neuronal activity and the CD of STDP in both spike-based and rate-based neuronal network models.
Competitive STDP-Based Spike Pattern Learning
TL;DR: These results illustrate how the brain could encode and decode information in spike times, a theory referred to as temporal coding, and how STDP could play a key role by detecting repeating patterns and generating selective responses to them.
Spike timing-dependent plasticity: a Hebbian learning rule.
TL;DR: This work has examined the functional consequences of STDP directly in an increasing number of neural circuits in vivo, revealing several layers of complexity in STDP, including its dependence on dendritic location, the nonlinear integration of synaptic modification induced by complex spike trains, and the modulation of STDP by inhibitory and neuromodulatory inputs.
Connectivity reflects coding: a model of voltage-based STDP with homeostasis
TL;DR: A model of spike timing-dependent plasticity (STDP) is created in which synaptic changes depend on presynaptic spike arrival and the postsynaptic membrane potential, filtered with two different time constants; the plasticity rule leads not only to the development of localized receptive fields but also to connectivity patterns that reflect the neural code.
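Several of the entries above, notably the neural-sampling papers, interpret ongoing spiking activity as Markov chain Monte Carlo sampling from a posterior distribution. A minimal sketch of that idea, assuming a tiny symmetric-weight Boltzmann-style network in which each neuron fires with logistic probability of its membrane potential; the specific weights, biases, and function names here are illustrative, not taken from any of the papers:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

# Symmetric weights and biases of a tiny Boltzmann-style network;
# its stationary distribution is p(z) ∝ exp(z·b + z·W·z / 2).
W = np.array([[ 0.0, 1.5, -1.0],
              [ 1.5, 0.0,  0.5],
              [-1.0, 0.5,  0.0]])
b = np.array([-0.5, 0.2, 0.1])

def gibbs_step(z):
    """Update one randomly chosen neuron: it 'fires' (z_k = 1) with
    probability sigmoid(u_k), where u_k is its membrane potential."""
    k = rng.integers(len(z))
    u = b[k] + W[k] @ z
    z[k] = float(rng.random() < 1.0 / (1.0 + np.exp(-u)))
    return z

# Long-run statistics of the network state approximate the target.
n_steps = 50000
z = np.zeros(3)
counts = {}
for _ in range(n_steps):
    z = gibbs_step(z)
    key = tuple(z.astype(int))
    counts[key] = counts.get(key, 0) + 1

# Compare empirical state frequencies against the exact distribution.
states = [np.array(s, float) for s in product([0, 1], repeat=3)]
logp = np.array([s @ b + 0.5 * s @ W @ s for s in states])
p_exact = np.exp(logp - logp.max())
p_exact /= p_exact.sum()
p_emp = np.array([counts.get(tuple(s.astype(int)), 0) / n_steps
                  for s in states])
```

The sampling papers above make this correspondence precise for spiking neurons with refractory dynamics in continuous time; the sketch keeps only the core point that stochastic single-unit updates realize MCMC over network states.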