Activation Learning by Local Competitions

@article{Zhou2022ActivationLB,
  title={Activation Learning by Local Competitions},
  author={Hongchao Zhou},
  journal={ArXiv},
  year={2022},
  volume={abs/2209.13400}
}
The backpropagation that drives the success of deep learning is most likely different from the learning mechanism of the brain. In this paper, we develop a biology-inspired learning rule that discovers features through local competitions among neurons, following Hebb's famous proposal. It is demonstrated that the unsupervised features learned by this local learning rule can serve as a pre-training model that improves the performance of some supervised learning tasks. More importantly, this… 
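The abstract describes neurons competing locally, with the winning units strengthening their weights in a Hebbian fashion instead of receiving backpropagated errors. The paper's exact rule is not given in this excerpt, so the snippet below is only a generic winner-take-all Hebbian sketch (layer sizes, learning rate, and the specific update form are assumptions), meant to illustrate the kind of local, backprop-free update such a rule performs.

```python
import numpy as np

def competitive_hebbian_step(W, x, lr=0.01):
    """One local, backprop-free update for a single layer.

    W  : (n_hidden, n_input) weight matrix
    x  : (n_input,) input vector
    lr : learning rate (assumed value)

    Units compete through their activations; only the winner's weights
    move toward the input (Hebbian: strengthen co-active connections).
    """
    activations = W @ x               # local pre-activations
    winner = np.argmax(activations)   # competition: strongest unit wins
    # Hebbian update for the winner only; the subtraction keeps weights bounded
    W[winner] += lr * (x - W[winner])
    return W, winner

# Toy usage: learn feature detectors from random inputs (sizes are illustrative)
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(16, 64))   # 16 hidden units, 64 inputs
for _ in range(1000):
    x = rng.random(64)
    W, _ = competitive_hebbian_step(W, x)
```

Because each row of W is updated using only its own activation and the shared input, no global error signal is needed, which is the property the paper's local learning rule builds on.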

References

SHOWING 1-10 OF 67 REFERENCES

Unsupervised learning by competing hidden units

A learning algorithm is designed that uses global inhibition in the hidden layer to learn early feature detectors in a completely unsupervised way, motivated by Hebb's idea that changes in synapse strength should be local.
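This summary describes hidden units competing through global inhibition while each synapse is updated from locally available quantities only. The sketch below is a simplified illustration in that spirit (letting the strongest unit learn Hebbian-style and the k-th strongest anti-Hebbian-style, with the constants k and delta chosen arbitrarily); it is not the exact published update rule.

```python
import numpy as np

def inhibition_hebbian_step(W, x, lr=0.02, k=2, delta=0.4):
    """Unsupervised update with global inhibition (simplified sketch).

    The unit with the largest input current strengthens its weights
    toward x (Hebbian); the k-th ranked unit is pushed away
    (anti-Hebbian), implementing a soft form of global inhibition.
    Each row's update uses only that row's weights and current.
    """
    currents = W @ x                        # input current per hidden unit
    order = np.argsort(currents)[::-1]      # rank units by input current
    winner, runner_up = order[0], order[k - 1]
    W[winner]    += lr * (x - currents[winner] * W[winner])
    W[runner_up] -= lr * delta * (x - currents[runner_up] * W[runner_up])
    return W
```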

Hebbian Learning Meets Deep Convolutional Neural Networks

The use of the Hebbian learning rule when training deep neural networks for image classification is investigated, and a novel weight update rule for shared kernels in DCNNs is proposed.

HebbNet: A Simplified Hebbian Learning Framework to do Biologically Plausible Learning

This work introduces a new Hebbian-learning-based neural network, called HebbNet, that adds an updated activation threshold and gradient sparsity to first-principles Hebbian learning, and improves training dynamics by reducing the number of training epochs and turning training from a two-step process into a one-step process.

Multi-layer Hebbian networks with modern deep learning frameworks

It is hypothesized that more advanced techniques (dynamic stimuli, trace learning, feedback connections, etc.), together with the massive computational boost offered by modern deep learning frameworks, could greatly improve the performance and biological relevance of multi-layer Hebbian networks.

Local Unsupervised Learning for Image Analysis

The design of a local algorithm that can learn convolutional filters at scale on large image datasets and a successful transfer of learned representations between CIFAR-10 and ImageNet 32x32 datasets hint at the possibility that local unsupervised training might be a powerful tool for learning general representations (without specifying the task) directly from unlabeled data.

Biologically Inspired Feedforward Supervised Learning for Deep Self-Organizing Map Networks

A novel deep neural network and its supervised learning method are proposed that use a feedforward supervisory signal inspired by the human visual system and perform human-like, association-based learning without any backward error propagation.

Feature discovery by competitive learning

Hebbian Deep Learning Without Feedback

SoftHebb shows, with an approach radically different from BP, that deep learning over a few layers may be plausible in the brain, and it increases the accuracy of bio-plausible machine learning.

Biologically-Motivated Deep Learning Method using Hierarchical Competitive Learning

A novel biologically motivated learning method for deep convolutional neural networks (CNNs) is proposed that requires only forward-propagating signals as a pre-training method and achieves state-of-the-art performance among biologically motivated methods in the ImageNet experiment.
...