
- David Kappel, Bernhard Nessler, Wolfgang Maass
- PLoS Computational Biology
- 2014

In order to cross a street without being run over, we need to be able to extract very fast hidden causes of dynamically changing multi-modal sensory stimuli, and to predict their future evolution. We show here that a generic cortical microcircuit motif, pyramidal cells with lateral excitation and inhibition, provides the basis for this difficult but…

- David Kappel, Stefan Habenschuss, Robert Legenstein, Wolfgang Maass, Jeff Beck
- PLoS Computational Biology
- 2015

General results from statistical learning theory suggest understanding not only brain computations but also brain plasticity as probabilistic inference. A model for this has, however, been missing. We propose that inherently stochastic features of synaptic plasticity and spine motility enable cortical networks of neurons to carry out probabilistic inference by…

NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse…

We reexamine in this article the conceptual and mathematical framework for understanding the organization of plasticity in spiking neural networks. We propose that inherent stochasticity enables synaptic plasticity to carry out probabilistic inference by sampling from a posterior distribution of synaptic parameters. This view provides a viable alternative…

- Elmar Rueckert, David Kappel, Daniel Tanneberg, Dejan Pecevski, Jan Peters
- Scientific Reports
- 2016

A recurrent spiking neural network is proposed that implements planning as probabilistic inference for finite and infinite horizon tasks. The architecture splits this problem into two parts: the stochastic transient firing of the network embodies the dynamics of the planning task. With appropriately injected input, these dynamics are shaped to generate…

Experimental data show that synaptic connections are subject to stochastic processes, and that neural codes drift on larger time scales. These data suggest considering, besides maximum likelihood learning, also sampling models for network plasticity (synaptic sampling), where the current network connectivity and parameter values are viewed as a sample from a…

- David Kappel, Stefan Habenschuss, Robert Legenstein, Wolfgang Maass
- 2015

We provide here the full proof of Theorem 1 of the main text. For convenience, we first reiterate Eq. (12) and Theorem 1 from the main text. Consider the parameter dynamics (Eq. (12) in the main text)

$$d\theta_i = \left( b(\theta_i)\,\frac{\partial}{\partial \theta_i}\log p_S(\theta) \;+\; b(\theta_i)\,\frac{\partial}{\partial \theta_i}\log p_N(\mathbf{x}\,|\,\theta) \;+\; T\,b'(\theta_i) \right) dt \;+\; \sqrt{2\,T\,b(\theta_i)}\; dW_i \tag{S1}$$

for $i = 1, \ldots, M$. We show that the stochastic dynamics…
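The drift-plus-diffusion structure of Eq. (S1) can be simulated directly with an Euler–Maruyama scheme. Below is a minimal sketch, assuming $b(\theta) \equiv 1$ (so the $T\,b'(\theta_i)$ term vanishes) and toy Gaussian prior and likelihood terms; all function and variable names are illustrative and not taken from the paper's code.

```python
import numpy as np

def synaptic_sampling_step(theta, grad_log_prior, grad_log_lik, T, dt, rng):
    """One Euler-Maruyama step of the synaptic sampling SDE (Eq. S1),
    with b(theta) = 1, so the T*b'(theta) drift term vanishes."""
    drift = grad_log_prior(theta) + grad_log_lik(theta)
    noise = np.sqrt(2.0 * T * dt) * rng.standard_normal(theta.shape)
    return theta + drift * dt + noise

# Toy example: prior p_S = N(0, 1), likelihood p_N = N(1, 1) per parameter.
rng = np.random.default_rng(0)
theta = np.zeros(5)
grad_log_prior = lambda th: -th           # d/dtheta log N(theta; 0, 1)
grad_log_lik = lambda th: -(th - 1.0)     # d/dtheta log N(theta; 1, 1)
for _ in range(20000):
    theta = synaptic_sampling_step(theta, grad_log_prior, grad_log_lik,
                                   T=1.0, dt=1e-3, rng=rng)
# At stationarity, theta is sampled from the posterior, here N(0.5, 0.5):
# the parameters hover stochastically around 0.5 rather than converging.
```

This illustrates the point of the sampling view: with temperature $T > 0$ the parameters never settle to a maximum likelihood solution but keep sampling from the posterior $\propto p_S(\theta)\, p_N(\mathbf{x}\,|\,\theta)$; setting $T = 0$ recovers deterministic gradient ascent.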
