Corpus ID: 207780248

Cortical credit assignment by Hebbian, neuromodulatory and inhibitory plasticity.

@article{Aljadeff2019CorticalCA,
  title={Cortical credit assignment by Hebbian, neuromodulatory and inhibitory plasticity.},
  author={Johnatan Aljadeff and James A. D’Amour and Rachel E. Field and Robert C. Froemke and Claudia Clopath},
  journal={arXiv: Neurons and Cognition},
  year={2019}
}
The cortex learns to make associations between stimuli and spiking activity which supports behaviour. It does this by adjusting synaptic weights. The complexity of these transformations implies that synapses have to change without access to the full error information, a problem typically referred to as "credit-assignment". However, it remains unknown how the cortex solves this problem. We propose that a combination of plasticity rules, 1) Hebbian, 2) acetylcholine-dependent and 3) noradrenaline… 
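
The abstract is cut off above, but the proposal it sketches, Hebbian plasticity gated by acetylcholine- and noradrenaline-dependent signals, belongs to the broader family of "three-factor" learning rules. The toy Python sketch below illustrates that general idea only; the gating function, parameter values and names (hebbian_update, ach, na) are assumptions made for illustration, not the paper's actual model.

import numpy as np

# Minimal sketch of a neuromodulator-gated Hebbian update (illustrative only).
# ach and na are scalar stand-ins for acetylcholine- and noradrenaline-dependent
# signals; the additive gating and all parameter values below are assumptions,
# not the paper's actual rule.

rng = np.random.default_rng(0)
n_inputs = 50
eta = 1e-3                               # learning rate
w = rng.uniform(0.0, 0.1, n_inputs)      # feedforward excitatory weights

def hebbian_update(w, pre, post, ach, na):
    """One gated-Hebbian step: dw = eta * (ach + na) * pre * post."""
    dw = eta * (ach + na) * pre * post
    return np.clip(w + dw, 0.0, None)    # keep excitatory weights non-negative

# One toy step: a random presynaptic rate pattern drives the postsynaptic rate.
pre = rng.poisson(5.0, n_inputs).astype(float)
post = float(w @ pre)
w = hebbian_update(w, pre, post, ach=1.0, na=0.2)

In the paper's framing, inhibitory plasticity would additionally act to keep excitation and inhibition balanced; that component is omitted from this sketch.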

A solution to temporal credit assignment using cell-type-specific modulatory signals

This work re-analyzes the mathematical basis of gradient descent learning in recurrent spiking neural networks (RSNNs) in light of the recent single-cell transcriptomic evidence for cell-type-specific local neuropeptide signaling in the cortex and suggests a computationally efficient on-chip learning method for bio-inspired artificial intelligence.

Correlation-invariant synaptic plasticity

This work develops a theory for synaptic plasticity that is invariant to second-order correlations in the input and demonstrates how correlation-invariance enables biologically realistic models to develop sparse population codes, despite diverse levels of variability and heterogeneity.

Cell-type-specific modulatory signaling promotes learning in spiking neural networks

This work re-analyzes the mathematical basis of gradient descent learning in recurrent spiking neural networks (RSNNs) in light of the recent single-cell transcriptomic evidence for cell-type-specific local neuropeptide signaling in the cortex.

Complementary Inhibitory Weight Profiles Emerge from Plasticity and Allow Flexible Switching of Receptive Fields

This work emphasizes multiple roles of inhibition in cortical processing and provides a first mechanistic model for flexible receptive fields, showing how various synaptic plasticity rules allow for the emergence of diverse connectivity profiles and how their dynamic interaction creates a mechanism by which postsynaptic responses can quickly change.

Burst-dependent synaptic plasticity can coordinate learning in hierarchical circuits

It is shown that if synaptic plasticity is regulated by high-frequency bursts of spikes, then pyramidal neurons higher in a hierarchical circuit can coordinate the plasticity of lower-level connections and solve challenging tasks that require deep network architectures.
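
As a hedged illustration of the mechanism summarized above, higher-level neurons steering lower-level plasticity through the burst statistics of their targets, the sketch below updates weights in proportion to the deviation of a burst fraction from its running average. The rule, names and values are assumptions for illustration, not the authors' published equations.

import numpy as np

# Illustrative burst-gated update: potentiate when the postsynaptic burst
# fraction exceeds its recent running average, depress when it falls below,
# scaled by presynaptic event counts. A sketch of the general idea only.

def burst_dependent_update(w, pre_events, burst_fraction, burst_avg, eta=1e-3):
    return w + eta * (burst_fraction - burst_avg) * pre_events

w = np.full(10, 0.5)                          # toy weight vector
pre_events = np.random.poisson(3.0, 10)       # presynaptic events in a window
w = burst_dependent_update(w, pre_events, burst_fraction=0.4, burst_avg=0.25)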

Complementary inhibitory weight profiles emerge from plasticity and allow attentional switching of receptive fields

It is confirmed that a neuron's receptive field doesn't follow directly from the weight profiles of its presynaptic afferents, in line with recent experimental findings showing dramatic context-dependent changes of neurons' receptive fields.

Reverse Differentiation via Predictive Coding

This work generalizes PC and Z-IL by defining them directly on computational graphs, and shows that they can perform exact reverse differentiation, yielding the first PC algorithm that is equivalent to BP in the way it updates parameters on any neural network.

Can the Brain Do Backpropagation? - Exact Implementation of Backpropagation in Predictive Coding Networks

A BL model is proposed that produces exactly the same updates of the neural weights as BP while employing only local plasticity, i.e., all neurons perform only local computations done simultaneously; this model is then modified into an alternative BL model that works fully autonomously.

Beyond accuracy: generalization properties of bio-plausible temporal credit assignment rules

It is demonstrated that state-of-the-art biologically-plausible learning rules for training RNNs exhibit worse and more variable generalization performance compared to their machine learning counterparts that follow the true gradient more closely, and a theorem is presented explaining this phenomenon.

References

SHOWING 1-10 OF 99 REFERENCES

Cerebellar learning using perturbations

This framework, stochastic gradient descent with estimated global errors, generates specific predictions for synaptic plasticity rules that contradict the current consensus, and in vitro plasticity experiments under physiological conditions verified the predictions, highlighting the sensitivity of plasticity studies to unphysiological conditions.

Dendritic solutions to the credit assignment problem

Optimal Properties of Analog Perceptrons with Excitatory Weights

The optimal input has a sparse binary distribution, in good agreement with the burst firing of the granule cells, and the weight distribution consists of a large fraction of silent synapses, as in previously studied binary perceptron models and as seen experimentally.

Learning to solve the credit assignment problem

A hybrid learning approach that learns to approximate the gradient, and can match or exceed the performance of exact gradient-based learning in both feedforward and convolutional networks.

Balancing Feed-Forward Excitation and Inhibition via Hebbian Inhibitory Synaptic Plasticity

Analysis of a model characterizing the stochastic dynamics of temporally anti-symmetric Hebbian spike-timing-dependent plasticity of feedforward excitatory and inhibitory synaptic inputs to a single postsynaptic cell shows that inhibitory Hebbian plasticity generates 'negative feedback' that balances excitation and inhibition, in contrast with the 'positive feedback' of excitatory Hebbian plasticity.

Sequential neuromodulation of Hebbian plasticity offers mechanism for effective reward-based navigation

It is demonstrated that sequential neuromodulation of STDP by acetylcholine and dopamine offers an efficacious model of reward-based navigation, and also provides a possible mechanism for aligning the time scales of cellular and behavioral learning.

Dendritic cortical microcircuits approximate the backpropagation algorithm

A novel view of learning in dendritic cortical circuits, and of how the brain may solve the long-standing synaptic credit assignment problem, is introduced, in which error-driven synaptic plasticity adapts the network towards a global desired output.

An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity

It is shown that a network developed in the predictive coding framework can efficiently perform supervised learning fully autonomously, employing only simple local Hebbian plasticity.

Spatio-Temporal Credit Assignment in Neuronal Population Learning

This work presents a model of plasticity induction for reinforcement learning in a population of leaky integrate-and-fire neurons which is based on a cascade of synaptic memory traces, and argues that, due to their comparative robustness, synaptic plasticity cascades are attractive basic models of reinforcement learning in the brain.

Synaptic scaling rule preserves excitatory–inhibitory balance and salient neuronal network dynamics

It is shown that synaptic strength scales with the number of connections K approximately as 1/√K, close to the ideal theoretical value, suggesting that the synaptic scaling rule and resultant dynamics are emergent properties of networks in general.
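
For context on the "ideal theoretical value" mentioned above: in standard balanced-network theory, a neuron receiving on the order of K excitatory and K inhibitory inputs of strength J has mean inputs of order K·J from each population, which cancel, while the residual fluctuations grow like √K·J; keeping those fluctuations of order one as K grows therefore requires synaptic strengths to shrink like 1/√K. A compact statement of that standard argument (background reasoning, not taken from the paper itself):

\mu \approx K J_E - K J_I \approx 0,
\qquad
\sigma \approx \sqrt{K}\, J
\quad\Rightarrow\quad
J \propto \frac{1}{\sqrt{K}}
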
...