Cortical credit assignment by Hebbian, neuromodulatory and inhibitory plasticity.
@article{Aljadeff2019CorticalCA,
  title   = {Cortical credit assignment by Hebbian, neuromodulatory and inhibitory plasticity.},
  author  = {Johnatan Aljadeff and James A. D’Amour and Rachel E. Field and Robert C. Froemke and Claudia Clopath},
  journal = {arXiv: Neurons and Cognition},
  year    = {2019}
}
The cortex learns to make associations between stimuli and the spiking activity that supports behaviour, and it does so by adjusting synaptic weights. The complexity of these transformations implies that synapses have to change without access to the full error information, a problem typically referred to as "credit assignment". However, it remains unknown how the cortex solves this problem. We propose that a combination of plasticity rules, 1) Hebbian, 2) acetylcholine-dependent and 3) noradrenaline…
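The abstract is truncated above, but the general idea it describes, local Hebbian plasticity gated by global neuromodulatory signals, can be illustrated with a small sketch. The following is a minimal, hypothetical three-factor update in NumPy; the gating form, the variable names (`ach`, `na`) and the parameter values are illustrative assumptions, not the authors' exact rule.

```python
import numpy as np

# Minimal illustrative sketch (not the paper's exact rule): a "three-factor"
# weight update in which a local Hebbian term (pre x post activity) is gated
# by global neuromodulatory signals.

rng = np.random.default_rng(0)

n_in, n_out = 50, 10
W = 0.1 * rng.random((n_out, n_in))   # excitatory feedforward weights
eta = 1e-3                            # learning rate (illustrative value)

def plasticity_step(W, pre, post, ach, na):
    """One weight update.

    pre  : presynaptic rates, shape (n_in,)
    post : postsynaptic rates, shape (n_out,)
    ach  : scalar acetylcholine level, here assumed to scale the Hebbian term
    na   : scalar noradrenaline level, here assumed to additionally gate potentiation
    """
    hebb = np.outer(post, pre)                      # local correlation term
    dW = eta * (ach * hebb + na * np.maximum(hebb, 0.0))
    return np.clip(W + dW, 0.0, None)               # keep excitatory weights non-negative

# Example: one update with modest cholinergic and strong noradrenergic gating
pre = rng.random(n_in)
post = W @ pre
W = plasticity_step(W, pre, post, ach=0.5, na=1.0)
```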
14 Citations
A solution to temporal credit assignment using cell-type-specific modulatory signals
- Biology, bioRxiv
- 2020
This work re-analyzes the mathematical basis of gradient descent learning in recurrent spiking neural networks (RSNNs) in light of the recent single-cell transcriptomic evidence for cell-type-specific local neuropeptide signaling in the cortex and suggests a computationally efficient on-chip learning method for bio-inspired artificial intelligence.
Correlation-invariant synaptic plasticity
- Biology
- 2021
This work develops a theory for synaptic plasticity that is invariant to second-order correlations in the input and demonstrates how correlation-invariance enables biologically realistic models to develop sparse population codes, despite diverse levels of variability and heterogeneity.
Cell-type-specific modulatory signaling promotes learning in spiking neural networks
- Biology
- 2021
This work re-analyzes the mathematical basis of gradient descent learning in recurrent spiking neural networks (RSNNs) in light of the recent single-cell transcriptomic evidence for cell-type-specific local neuropeptide signaling in the cortex.
Complementary Inhibitory Weight Profiles Emerge from Plasticity and Allow Flexible Switching of Receptive Fields
- Biology, Psychology, The Journal of Neuroscience
- 2020
This work emphasizes multiple roles of inhibition in cortical processing and provides a first mechanistic model for flexible receptive fields, showing how various synaptic plasticity rules allow for the emergence of diverse connectivity profiles and how their dynamic interaction creates a mechanism by which postsynaptic responses can quickly change.
Burst-dependent synaptic plasticity can coordinate learning in hierarchical circuits
- Biology
- 2020
It is shown that if synaptic plasticity is regulated by high-frequency bursts of spikes, then neurons higher in a hierarchical circuit can coordinate the plasticity of lower-level connections.
Burst-dependent synaptic plasticity can coordinate learning in hierarchical circuits
- Biology, Nature Neuroscience
- 2021
It is shown that if synaptic plasticity is regulated by high-frequency bursts of spikes, then pyramidal neurons higher in a hierarchical circuit can coordinate the plasticity of lower-level connections and solve challenging tasks that require deep network architectures (a minimal illustrative sketch of such a burst-gated update appears after this citation list).
Complementary inhibitory weight profiles emerge from plasticity and allow attentional switching of receptive fields
- Biology, Psychology
- 2019
It is confirmed that a neuron's receptive field doesn't follow directly from the weight profiles of its presynaptic afferents, in line with recent experimental findings showing dramatic context-dependent changes of neurons' receptive fields.
Reverse Differentiation via Predictive Coding
- Computer Science, AAAI
- 2022
This work generalizes predictive coding (PC) and Z-IL by defining them directly on computational graphs and shows that the result performs exact reverse differentiation, yielding the first PC algorithm whose parameter updates are equivalent to BP on any neural network.
Can the Brain Do Backpropagation? - Exact Implementation of Backpropagation in Predictive Coding Networks
- Computer Science, NeurIPS
- 2020
A BL model is proposed that produces exactly the same updates of the neural weights as BP while employing local plasticity, i.e., all neurons perform only local computations done simultaneously; the model is then modified into an alternative BL model that works fully autonomously.
Beyond accuracy: generalization properties of bio-plausible temporal credit assignment rules
- Computer Science, ArXiv
- 2022
It is demonstrated that state-of-the-art biologically-plausible learning rules for training RNNs exhibit worse and more variable generalization performance compared to their machine learning counterparts that follow the true gradient more closely, and a theorem is presented explaining this phenomenon.
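As referenced in the burst-dependent plasticity entry above, below is a minimal sketch of a burst-gated weight update, in the same NumPy style as the earlier sketch. The burst threshold, the gain values and the function name are illustrative assumptions, not the published rule.

```python
import numpy as np

# Minimal illustrative sketch of burst-gated plasticity: lower-level weights
# are potentiated when the postsynaptic neuron emits a high-frequency burst
# (a proxy for a top-down "instructive" signal) and depressed for isolated
# single spikes.

rng = np.random.default_rng(1)

def burst_gated_update(W, pre_rates, post_spike_counts, window=0.02,
                       burst_rate=100.0, eta=1e-3):
    """Update feedforward weights W (n_post x n_pre).

    pre_rates         : presynaptic firing rates, shape (n_pre,)
    post_spike_counts : postsynaptic spike counts within `window` seconds,
                        shape (n_post,)
    A postsynaptic event counts as a burst if its rate within the window
    exceeds `burst_rate`; bursts potentiate, single spikes depress.
    """
    post_rates = post_spike_counts / window
    gate = np.where(post_rates >= burst_rate, 1.0,          # burst -> LTP
                    np.where(post_rates > 0.0, -0.5, 0.0))  # lone spike -> LTD
    dW = eta * np.outer(gate, pre_rates)
    return np.clip(W + dW, 0.0, None)

# Example: 3 postsynaptic cells; the first bursts (3 spikes in 20 ms)
W = 0.1 * rng.random((3, 8))
pre = rng.random(8)
W = burst_gated_update(W, pre, post_spike_counts=np.array([3, 1, 0]))
```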
References
Showing 1-10 of 99 references
Cerebellar learning using perturbations
- Biology, bioRxiv
- 2018
This framework, stochastic gradient descent with estimated global errors, generates specific predictions for synaptic plasticity rules that contradict the current consensus, and in vitro plasticity experiments under physiological conditions verified the predictions, highlighting the sensitivity of plasticity studies to unphysiological conditions.
Optimal Properties of Analog Perceptrons with Excitatory Weights
- Computer Science, Biology, PLoS Comput. Biol.
- 2013
The optimal input has a sparse binary distribution, in good agreement with the burst firing of granule cells, and the weight distribution consists of a large fraction of silent synapses, as in previously studied binary perceptron models and as seen experimentally.
Learning to solve the credit assignment problem
- Computer Science, ICLR
- 2020
A hybrid learning approach is presented that learns to approximate the gradient and can match or exceed the performance of exact gradient-based learning in both feedforward and convolutional networks.
Balancing Feed-Forward Excitation and Inhibition via Hebbian Inhibitory Synaptic Plasticity
- Biology, PLoS Comput. Biol.
- 2012
Analysis of a model characterizing the stochastic dynamics of temporally anti-symmetric Hebbian spike-timing-dependent plasticity of feed-forward excitatory and inhibitory synaptic inputs to a single post-synaptic cell shows that inhibitory Hebbian plasticity generates ‘negative feedback’ that balances excitation and inhibition, in contrast with the ‘positive feedback’ of excitatory Hebbian plasticity.
Sequential neuromodulation of Hebbian plasticity offers mechanism for effective reward-based navigation
- Biology, Psychology, eLife
- 2017
It is demonstrated that sequential neuromodulation of STDP by acetylcholine and dopamine offers an efficacious model of reward-based navigation, and also provides a possible mechanism for aligning the time scales of cellular and behavioral learning.
Dendritic cortical microcircuits approximate the backpropagation algorithm
- Computer Science, Biology, NeurIPS
- 2018
A novel view of learning on dendritic cortical circuits and on how the brain may solve the long-standing synaptic credit assignment problem is introduced, in which error-driven synaptic plasticity adapts the network towards a global desired output.
An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity
- Biology, Computer Science, Neural Computation
- 2017
It is shown that a network developed in the predictive coding framework can efficiently perform supervised learning fully autonomously, employing only simple local Hebbian plasticity.
Spatio-Temporal Credit Assignment in Neuronal Population Learning
- Biology, Psychology, PLoS Comput. Biol.
- 2011
This work presents a model of plasticity induction for reinforcement learning in a population of leaky integrate-and-fire neurons which is based on a cascade of synaptic memory traces, and argues that, due to their comparative robustness, synaptic plasticity cascades are attractive basic models of reinforcement learning in the brain.
Synaptic scaling rule preserves excitatory–inhibitory balance and salient neuronal network dynamics
- Biology, Nature Neuroscience
- 2016
It is shown that synaptic strength scales with the number of connections K as ∼1/√K, close to the ideal theoretical value, suggesting that the synaptic scaling rule and resultant dynamics are emergent properties of networks in general.
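For the ∼1/√K scaling in the last reference, the short note below sketches the standard balanced-network argument for why this is the ideal theoretical value; it is a textbook-style estimate, not a derivation taken from that paper.

```latex
% Standard balanced-network estimate (not from the paper itself) for why
% J ~ 1/sqrt(K) is the ideal scaling of synaptic strength J with the number
% of connections K. With K inputs of strength J, mean rate \nu and
% single-input variability \sigma_\nu:
\begin{align*}
  \text{mean input}         &\sim K J \nu, \\
  \text{input fluctuations} &\sim \sqrt{K}\, J \sigma_\nu.
\end{align*}
% When excitation and inhibition are co-tuned, the O(K J \nu) mean terms
% cancel, so the residual drive is set by the fluctuations; choosing
% J \propto 1/\sqrt{K} keeps those fluctuations of order one as K grows,
% preserving excitatory-inhibitory balance and the network dynamics.
```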