Corpus ID: 233407801

Learning Bayes-optimal dendritic opinion pooling

@inproceedings{Jordan2021LearningBD,
  title={Learning Bayes-optimal dendritic opinion pooling},
  author={Jakob Jordan and Jo{\~a}o Sacramento and Willem A. M. Wybo and Mihai A. Petrovici and Walter Senn},
  booktitle={Advances in Neural Information Processing Systems},
  year={2021}
}
Pooling different opinions and weighting them according to their reliability is conducive to making good decisions. We demonstrate that single cortical neurons, through the biophysics of conductance-based coupling, perform such complex probabilistic computations via their natural dynamics. While the effective computation can be described as a feedforward process, the implementation critically relies on the bidirectional current flow along the dendritic tree. We suggest that dendritic membrane…
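The pooling principle summarized above can be illustrated with a minimal sketch (function names and parameter values are illustrative, not taken from the paper's code): for conductance-based inputs, the steady-state membrane potential is the conductance-weighted average of the inputs' reversal potentials, which coincides with the Bayes-optimal fusion of independent Gaussian "opinions" whose means are the reversal potentials and whose precisions are the conductances.

```python
import numpy as np

def pooled_potential(g, E):
    """Steady-state potential under conductance-based inputs:
    V = sum(g_i * E_i) / sum(g_i), a conductance-weighted average
    of the reversal potentials E_i."""
    g, E = np.asarray(g, float), np.asarray(E, float)
    return np.sum(g * E) / np.sum(g)

def bayes_fused_mean(mu, precision):
    """Posterior mean when fusing independent Gaussian estimates:
    precision-weighted average of the means."""
    mu, precision = np.asarray(mu, float), np.asarray(precision, float)
    return np.sum(precision * mu) / np.sum(precision)

# Hypothetical example: a weak (unreliable) and a strong (reliable) pathway.
g = [0.2, 1.0]       # conductances, playing the role of precisions
E = [-60.0, -50.0]   # reversal potentials (mV), each pathway's "opinion"

V = pooled_potential(g, E)
mu_post = bayes_fused_mean(E, g)
assert np.isclose(V, mu_post)  # both compute the same precision-weighted mean
```

Note how the pooled potential is pulled toward the more reliable opinion in proportion to its conductance, which is the sense in which the neuron's natural dynamics weight opinions by reliability.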
