# Towards a Biologically Plausible Backprop

```bibtex
@article{Scellier2016TowardsAB,
  title   = {Towards a Biologically Plausible Backprop},
  author  = {Benjamin Scellier and Yoshua Bengio},
  journal = {ArXiv},
  year    = {2016},
  volume  = {abs/1602.05179}
}
```

This work contributes several new elements to the quest for a biologically plausible implementation of backprop in brains. We introduce a very general and abstract framework for machine learning, in which the quantities of interest are defined implicitly through an energy function. In this framework, only one kind of neural computation is involved in both the first phase (when the prediction is made) and the second phase (after the target is revealed), like contrastive Hebbian learning…
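The two-phase scheme the abstract describes can be illustrated with a toy sketch. The network sizes, energy function, learning rate, and nudging strength `beta` below are all hypothetical choices, not the paper's experimental setup; the point is only that one Hebbian-style rule, applied at the free equilibrium and again after the target weakly clamps the output, suffices to train the weights:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5                       # toy network: units 0,1 input; 2,3 hidden; 4 output
x_idx, out_idx = [0, 1], [4]

# symmetric weights with zero diagonal (Hopfield-style energy)
W = rng.normal(scale=0.1, size=(n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

def energy_grad(s, W):
    # gradient of E(s) = 0.5*||s||^2 - 0.5*s^T W s
    return s - W @ s

def relax(s, W, x, target=None, beta=0.0, steps=200, eps=0.05):
    """Gradient descent on the energy; inputs stay clamped, and the
    output is weakly nudged toward the target when beta > 0."""
    s = s.copy()
    for _ in range(steps):
        g = energy_grad(s, W)
        if target is not None:
            g[out_idx] += beta * (s[out_idx] - target)
        s -= eps * g
        s[x_idx] = x            # re-clamp the inputs
    return s

x, y = np.array([1.0, -1.0]), np.array([0.5])   # one hypothetical training pair
lr, beta = 0.2, 0.5

for _ in range(200):
    s0 = np.zeros(n); s0[x_idx] = x
    s_free = relax(s0, W, x)                    # phase 1: free prediction
    s_nudged = relax(s_free, W, x, y, beta)     # phase 2: target revealed
    # contrastive update: the same Hebbian-style rule in both phases
    W += lr * (np.outer(s_nudged, s_nudged) - np.outer(s_free, s_free)) / beta
    np.fill_diagonal(W, 0.0)

s0 = np.zeros(n); s0[x_idx] = x
err = abs(relax(s0, W, x)[out_idx][0] - y[0])
print(err)
```

Note that the weight update uses only locally available quantities: the product of pre- and post-synaptic activities at each of the two equilibria.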

## 41 Citations

### Backpropagation and Biological Plausibility

- Computer Science, ArXiv
- 2018

It is shown that when supervised learning is framed in the Lagrangian framework, Backpropagation emerges naturally, but biologically plausible local algorithms can also be devised, based on the search for saddle points in the adjoint space composed of weights, neural outputs, and Lagrangian multipliers.

### Feedforward Initialization for Fast Inference of Deep Generative Networks is biologically plausible

- Computer Science, ArXiv
- 2016

This work finds conditions under which a simple feedforward computation is a very good initialization for inference, after the input units are clamped to observed values.

### MSTDP: A More Biologically Plausible Learning

- Computer Science, ArXiv
- 2019

MSTDP is a new framework that uses only STDP rules for supervised and unsupervised learning; it works like an auto-encoder by making each input neuron also an output neuron, so a single model can make predictions or generate patterns without additional configuration.

### Towards an integration of deep learning and neuroscience

- Computer Science, Biology, bioRxiv
- 2016

It is argued that a range of implementations of credit assignment through multiple layers of neurons are compatible with current knowledge of neural circuitry, and that the brain’s specialized systems can be interpreted as enabling efficient optimization for specific problem classes.

### Toward an Integration of Deep Learning and Neuroscience

- Computer Science, Biology, Front. Comput. Neurosci.
- 2016

It is argued that a range of implementations of credit assignment through multiple layers of neurons are compatible with current knowledge of neural circuitry, and that the brain's specialized systems can be interpreted as enabling efficient optimization for specific problem classes.

### A Constrained-Based Approach to Machine Learning

- Computer Science, 2018 14th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS)
- 2018

This paper promotes a constraint-based approach to machine learning as a natural evolution of the classic distinction between supervised, unsupervised, and semi-supervised learning, and proposes an algorithm that goes beyond the arguable biological plausibility of Backpropagation.

### Training the Hopfield Neural Network for Classification Using a STDP-Like Rule

- Computer Science, ICONIP
- 2017

It is shown that the well-known Hopfield neural network (HNN) can be trained in a biologically plausible way; several HNNs with one or two hidden layers are trained on the MNIST dataset, and all of them converge to low training errors.

### Biologically feasible deep learning with segregated dendrites

- Computer Science, Biology
- 2016

A spiking, continuous-time neural network model that learns to categorize images from the MNIST data-set with local synaptic weight updates and demonstrates that deep learning can be achieved within a biologically feasible framework using segregated dendritic compartments.

### Applications of the Free Energy Principle to Machine Learning and Neuroscience

- Computer Science, ArXiv
- 2021

This work focuses on predictive coding, a neurobiologically plausible process theory derived from the free energy principle, showing how it can be scaled up and extended to be more biologically plausible, and elucidating its close links with other methods such as Kalman filtering.

### Supervised Learning in Neural Networks: Feedback-Network-Free Implementation and Biological Plausibility

- Computer Science, IEEE Transactions on Neural Networks and Learning Systems
- 2022

A new learning algorithm is proposed that is mathematically equivalent to the backpropagation algorithm but does not require a feedback network, which eliminates the need for two-phase adaptation and means neurons can adapt asynchronously and concurrently in a way analogous to that of biological neurons.

## References

Showing 1–10 of 35 references

### Feedforward Initialization for Fast Inference of Deep Generative Networks is biologically plausible

- Computer Science, ArXiv
- 2016

This work finds conditions under which a simple feedforward computation is a very good initialization for inference, after the input units are clamped to observed values.

### Towards Biologically Plausible Deep Learning

- Computer Science, ArXiv
- 2015

The theory of the probabilistic interpretation of auto-encoders is extended to justify improved sampling schemes based on the generative interpretation of denoising auto-encoders, and these ideas are validated on generative learning tasks.

### Free-energy and the brain

- Biology, Synthese
- 2007

It is suggested that these perceptual processes are just one emergent property of systems that conform to a free-energy principle, and that the system’s state and structure encode an implicit and probabilistic model of the environment.

### Biologically Plausible Error-Driven Learning Using Local Activation Differences: The Generalized Recirculation Algorithm

- Computer Science, Neural Computation
- 1996

All known fully general error-driven learning algorithms that use local activation-based variables in deterministic networks can be considered variations of the GeneRec algorithm (and indirectly, of the backpropagation algorithm).

### Equivalence of Backpropagation and Contrastive Hebbian Learning in a Layered Network

- Computer Science, Neural Computation
- 2003

A special case in which they are identical is described: a multilayer perceptron with linear output units to which weak feedback connections have been added. This suggests that the functionality of backpropagation can be realized by a Hebbian-type learning algorithm, which is suitable for implementation in biological networks.

### Random feedback weights support learning in deep neural networks

- Computer Science, ArXiv
- 2014

A surprisingly simple algorithm is presented, which assigns blame by multiplying error signals by random synaptic weights, and it is shown that a network can learn to extract useful information from signals sent through these random feedback connections, in essence, the network learns to learn.
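The mechanism this summary describes, routing blame through a fixed random matrix instead of the transposed forward weights, can be sketched in a few lines. The layer sizes, learning rate, and random linear teacher below are hypothetical illustrations, not the cited paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(1)

# two-layer network x -> h = tanh(W1 x) -> y = W2 h, fit to a
# hypothetical random linear teacher T
W1 = rng.normal(scale=0.5, size=(20, 10))
W2 = rng.normal(scale=0.5, size=(5, 20))
B  = rng.normal(scale=0.5, size=(20, 5))    # fixed random feedback weights
T  = rng.normal(size=(5, 10))

X = rng.normal(size=(10, 256))              # fixed batch of inputs

def step(lr=0.01):
    global W1, W2
    H = np.tanh(W1 @ X)
    Y = W2 @ H
    E = Y - T @ X                            # output error
    # feedback alignment: blame is routed through B, not W2.T
    dH = (B @ E) * (1.0 - H**2)
    W2 -= lr * (E @ H.T) / X.shape[1]
    W1 -= lr * (dH @ X.T) / X.shape[1]
    return float(np.mean(E**2))

losses = [step() for _ in range(500)]
print(losses[0], losses[-1])
```

The only change from standard backprop is the single line computing `dH`; the forward weights gradually adapt so that the random feedback becomes a useful teaching signal.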

### Towards deep learning with spiking neurons in energy based models with contrastive Hebbian plasticity

- Computer Science, ArXiv
- 2016

Preliminary results indicate that it is possible to learn a non-linear regression task with hidden layers, spiking neurons and a local synaptic plasticity rule.

### A neuronal learning rule for sub-millisecond temporal coding

- Biology, Nature
- 1996

A modelling study based on computer simulations of a neuron in the laminar nucleus of the barn owl shows that the necessary degree of coherence in signal arrival times can be attained during ontogenetic development by virtue of an unsupervised Hebbian learning rule.

### Early Inference in Energy-Based Models Approximates Back-Propagation

- Computer Science, ArXiv
- 2015

We show that Langevin MCMC inference in an energy-based model with latent variables has the property that the early steps of inference, starting from a stationary point, correspond to propagating…