Corpus ID: 248693162

Dendritic predictive coding: A theory of cortical computation with spiking neurons

Fabian A. Mikulasch, Lucas Rudelt, Michael Wibral, Viola Priesemann
These authors contributed equally.

Top-down feedback in cortex is critical for guiding sensory processing, which has prominently been formalized in the theory of hierarchical predictive coding (hPC). However, experimental evidence for error units, which are central to the theory, is inconclusive, and it remains unclear how hPC can be implemented with spiking neurons. To address this, we connect hPC to existing work on efficient coding in balanced networks with lateral inhibition, and predictive…
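The efficient spike coding in balanced networks that the abstract refers to can be illustrated with a minimal toy sketch (an assumed, simplified Boerlin/Denève-style spike-coding network, not the paper's exact model; network size, time constants, and the greedy one-spike-per-step rule are all choices made here for illustration): each neuron's voltage is the projection of the reconstruction error onto its decoding weights, and a neuron spikes whenever doing so reduces that error.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, steps, dt = 20, 2, 1000, 1e-3
tau = 0.02                           # readout decay time constant (assumed)
D = rng.normal(size=(M, N))          # decoding weights, one column per neuron
D /= np.linalg.norm(D, axis=0)       # unit-norm decoders
thresh = 0.5 * np.sum(D**2, axis=0)  # spike threshold = half the decoder norm

x = np.array([1.0, -0.5])            # constant target signal to encode
r = np.zeros(N)                      # filtered spike trains (readout)
errs = []
for t in range(steps):
    x_hat = D @ r
    V = D.T @ (x - x_hat)            # voltage = error projected onto decoders
    i = np.argmax(V - thresh)
    if V[i] > thresh[i]:
        r[i] += 1.0                  # greedy spike reduces reconstruction error
    r *= np.exp(-dt / tau)           # leak between spikes
    errs.append(np.linalg.norm(x - D @ r))

print(round(errs[-1], 3))            # final error stays small, bounded by the threshold
```

The implicit lateral inhibition appears in `D.T @ (x - x_hat)`: a spike by one neuron lowers the voltage of neurons with similar decoders, which is the connection to balanced networks the abstract draws.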


Lateral predictive coding revisited: Internal model, symmetry breaking, and response time
It is found that learning will generally break the interaction symmetry between peer neurons, and that high input correlation between two neurons does not necessarily imply strong direct interactions between them.
The least-control principle for learning at equilibrium
The principle casts learning as a least-control problem, where it is shown that incorporating learning signals into the network dynamics as an optimal control enables transmitting credit assignment information in previously unknown ways, avoids storing intermediate states in memory, and does not rely on infinitesimal learning signals.


Learning prediction error neurons in a canonical interneuron circuit
A well-orchestrated interplay of three interneuron types shapes the development and refinement of negative prediction-error neurons in a computational model of mouse primary visual cortex, making a range of testable predictions that may shed light on the circuitry underlying the neural computation of prediction errors.
Predictive Coding of Dynamical Variables in Balanced Spiking Networks
The approach suggests that spikes do matter when considering how the brain computes, and that the reliability of cortical representations could have been strongly underestimated.
Dendritic cortical microcircuits approximate the backpropagation algorithm
A novel view of learning on dendritic cortical circuits and on how the brain may solve the long-standing synaptic credit assignment problem is introduced, in which error-driven synaptic plasticity adapts the network towards a global desired output.
Spike-Based Population Coding and Working Memory
It is proposed that probability distributions are inferred spike by spike in recurrently connected networks of integrate-and-fire neurons, which can combine sensory cues optimally, track the state of a time-varying stimulus, and memorize accumulated evidence over periods much longer than the time constant of single neurons.
Connectivity reflects coding: a model of voltage-based STDP with homeostasis
A model of spike timing-dependent plasticity (STDP) is presented in which synaptic changes depend on presynaptic spike arrival and the postsynaptic membrane potential, filtered with two different time constants; the plasticity rule leads not only to the development of localized receptive fields but also to connectivity patterns that reflect the neural code.
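A minimal sketch of such a voltage-based rule (a toy version loosely following the Clopath-style scheme the summary describes; all thresholds, amplitudes, time constants, and the toy voltage trace are assumptions, not the paper's fitted values): depression is gated by a slowly filtered postsynaptic voltage on presynaptic spikes, while potentiation requires the coincidence of a presynaptic trace with strong instantaneous depolarization.

```python
import numpy as np

rng = np.random.default_rng(1)
steps, dt = 5000, 1e-3
tau_minus, tau_plus = 0.010, 0.007       # two voltage-filter time constants (assumed)
tau_x = 0.015                            # presynaptic spike-trace time constant
theta_minus, theta_plus = -70.0, -45.0   # depression / potentiation thresholds (mV)
A_LTD, A_LTP = 1e-4, 1e-4                # learning-rate amplitudes (assumed)

w = 0.5                                  # synaptic weight, kept in [0, 1]
u_minus = u_plus = -70.0                 # two low-pass-filtered voltage traces
x_trace = 0.0                            # filtered presynaptic spike train
for t in range(steps):
    # toy postsynaptic voltage: noisy rectified oscillation around rest
    u = -70.0 + 30.0 * max(0.0, np.sin(2 * np.pi * 5 * t * dt)) + rng.normal(0, 1)
    u_minus += dt / tau_minus * (u - u_minus)
    u_plus += dt / tau_plus * (u - u_plus)
    pre_spike = rng.random() < 20 * dt   # ~20 Hz Poisson presynaptic input
    x_trace += -dt / tau_x * x_trace + (1.0 if pre_spike else 0.0)
    # LTD: presynaptic spike while the slowly filtered voltage is depolarized
    if pre_spike:
        w -= A_LTD * max(0.0, u_minus - theta_minus)
    # LTP: presynaptic trace coincides with strong instantaneous depolarization
    w += dt * A_LTP * x_trace * max(0.0, u - theta_plus) * max(0.0, u_plus - theta_minus)
    w = float(np.clip(w, 0.0, 1.0))

print(round(w, 3))
```

The two filtered traces `u_minus` and `u_plus` are what the summary means by the voltage being "filtered with two different time constants": they gate depression and potentiation separately.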
Predictive coding in balanced neural networks with noise, chaos and delays
This work provides and solves a general theoretical framework for dissecting the differential contributions of neural noise, synaptic disorder, chaos, synaptic delays, and balance to the fidelity of predictive neural codes, reveals the fundamental role that balance plays in achieving superclassical scaling, and unifies previously disparate models in theoretical neuroscience.
Modelling plasticity in dendrites: from single cells to networks
Emergence of synaptic organization and computation in dendrites
Recent experimental and theoretical research on the developmental emergence of this synaptic organization and its impact on neural computations are summarized.
Causal Inference and Explaining Away in a Spiking Network
It is demonstrated that a family of high-dimensional quadratic optimization problems with non-negativity constraints can be solved exactly and efficiently by a network of spiking neurons.
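The non-negative quadratic optimization the summary describes can be illustrated with a rate-based analogue (a sketch, not the paper's spiking mechanism; the dictionary, problem size, and iteration count are assumptions): projected gradient descent on ½‖y − Ar‖² subject to r ≥ 0, where the recurrent AᵀA term plays the role of lateral inhibition and competing causes are "explained away".

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 5, 8
A = np.abs(rng.normal(size=(M, N)))   # non-negative feature dictionary
true_r = np.zeros(N)
true_r[[1, 4]] = [1.0, 0.5]           # observation generated by two causes
y = A @ true_r

# Projected gradient descent on 0.5 * ||y - A r||^2 with r >= 0.
# The feedforward drive is A^T y; the A^T A r term acts like lateral
# inhibition, suppressing neurons whose features are already explained.
r = np.zeros(N)
lr = 0.9 / np.linalg.norm(A.T @ A, 2)  # step size below 1 / Lipschitz constant
for _ in range(5000):
    r = np.maximum(0.0, r + lr * A.T @ (y - A @ r))

print(round(float(np.linalg.norm(y - A @ r)), 4))
```

Because an exact non-negative solution exists here by construction, the residual shrinks toward zero; the non-negativity constraint is what makes the inferred causes sparse and competitive.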
Active dendrites enable strong but sparse inputs to determine orientation selectivity
It is predicted that dendritic excitability allows the 1% strongest synaptic inputs of a neuron to control the tuning of its output, which would allow smaller subcircuits consisting of only a few strongly connected neurons to achieve selectivity for specific sensory features.