Activation Learning by Local Competitions
@article{Zhou2022ActivationLB,
  title   = {Activation Learning by Local Competitions},
  author  = {Hongchao Zhou},
  journal = {ArXiv},
  year    = {2022},
  volume  = {abs/2209.13400}
}
The backpropagation that drives the success of deep learning is most likely different from the learning mechanism of the brain. In this paper, we develop a biology-inspired learning rule that discovers features by local competitions among neurons, following the idea of Hebb’s famous proposal. It is demonstrated that the unsupervised features learned by this local learning rule can serve as a pre-training model to improve the performance of some supervised learning tasks. More importantly, this…
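The abstract describes feature discovery through local competition among neurons in the Hebbian spirit. The paper's actual rule is not given here, so the following is only an illustrative sketch of the general idea, using a hypothetical winner-take-all update with an Oja-style decay term to keep weights bounded:

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_wta_step(W, x, lr=0.01):
    """One local competitive update (illustrative, not the paper's rule).

    Each hidden unit computes a local pre-activation; the strongest unit
    wins the competition and moves its weights toward the input, with an
    Oja-like decay term bounding the weight norm.
    """
    activations = W @ x                    # local pre-activations, one per unit
    winner = int(np.argmax(activations))   # competition: strongest unit wins
    y = activations[winner]
    W[winner] += lr * (x - y * W[winner])  # Hebbian growth minus decay
    return W, winner

# Usage: 5 hidden units competing over random 8-dimensional inputs.
W = rng.normal(size=(5, 8))
for _ in range(100):
    x = rng.normal(size=8)
    W, _ = hebbian_wta_step(W, x)
```

Note that the update uses only quantities local to the winning unit (its input and its own output), which is the property that makes such rules biologically plausible compared with backpropagation.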
References
Showing 1-10 of 67 references
Unsupervised learning by competing hidden units
- Computer ScienceProceedings of the National Academy of Sciences
- 2019
A learning algorithm is designed that utilizes global inhibition in the hidden layer and is capable of learning early feature detectors in a completely unsupervised way; it is motivated by Hebb’s idea that changes in synapse strength should be local.
Hebbian Learning Meets Deep Convolutional Neural Networks
- Computer ScienceICIAP
- 2019
The use of the Hebbian learning rule is investigated when training Deep Neural Networks for image classification by proposing a novel weight update rule for shared kernels in DCNNs.
HebbNet: A Simplified Hebbian Learning Framework to do Biologically Plausible Learning
- Computer ScienceICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2021
This work introduces a new Hebbian-learning-based neural network, called HebbNet, that adds an updated activation threshold and gradient sparsity to the first principles of Hebbian learning, and improves training dynamics by reducing the number of training epochs and turning training from a two-step process into a one-step process.
Multi-layer Hebbian networks with modern deep learning frameworks
- Computer ScienceArXiv
- 2021
It is hypothesized that more advanced techniques (dynamic stimuli, trace learning, feedback connections, etc.), together with the massive computational boost offered by modern deep learning frameworks, could greatly improve the performance and biological relevance of multi-layer Hebbian networks.
Local Unsupervised Learning for Image Analysis
- Computer ScienceArXiv
- 2019
The design of a local algorithm that can learn convolutional filters at scale on large image datasets and a successful transfer of learned representations between CIFAR-10 and ImageNet 32x32 datasets hint at the possibility that local unsupervised training might be a powerful tool for learning general representations (without specifying the task) directly from unlabeled data.
Biologically Inspired Feedforward Supervised Learning for Deep Self-Organizing Map Networks
- Computer ScienceArXiv
- 2017
A novel deep neural network and its supervised learning method are proposed; the method uses a feedforward supervisory signal inspired by the human visual system and performs human-like association-based learning without any backward error propagation.
Hebbian Deep Learning Without Feedback
- Computer ScienceArXiv
- 2022
SoftHebb shows, with an approach radically different from BP, that deep learning over a few layers may be plausible in the brain, and it increases the accuracy of bio-plausible machine learning.
Biologically-Motivated Deep Learning Method using Hierarchical Competitive Learning
- Computer ScienceArXiv
- 2020
A novel biologically-motivated learning method for deep convolutional neural networks (CNNs) is proposed that requires only forward-propagating signals as a pre-training method for CNNs, and achieves state-of-the-art performance among biologically-motivated methods in the ImageNet experiment.
Hebbian Semi-Supervised Learning in a Sample Efficiency Setting
- Computer ScienceNeural Networks
- 2021