Hebbian Learning Meets Deep Convolutional Neural Networks

@inproceedings{Amato2019HebbianLM,
  title={Hebbian Learning Meets Deep Convolutional Neural Networks},
  author={Giuseppe Amato and Fabio Carrara and F. Falchi and Claudio Gennaro and Gabriele Lagani},
  booktitle={ICIAP},
  year={2019}
}
Neural networks are said to be biologically inspired because they mimic the behavior of real neurons. However, several processes in state-of-the-art neural networks, including Deep Convolutional Neural Networks (DCNNs), are far from those found in animal brains. One relevant difference is the training process: in state-of-the-art artificial neural networks, training is based on backpropagation and Stochastic Gradient Descent (SGD) optimization. However, studies in neuroscience…
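To make the training-rule contrast concrete, here is a minimal sketch (in NumPy; all names, shapes, and learning rates are illustrative, not from the paper) of the difference between an SGD step, which needs an error signal propagated back from a global loss, and a plain Hebbian step, which uses only locally available pre- and postsynaptic activity:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4)          # presynaptic activations
W = rng.normal(size=(3, 4))     # synaptic weights of one linear layer
y = W @ x                       # postsynaptic activations
lr = 0.01

# Backprop/SGD: the update needs a global error signal dL/dy
# propagated backward through the network from the loss.
grad_y = rng.normal(size=3)     # stand-in for dL/dy from upstream layers
W_sgd = W - lr * np.outer(grad_y, x)

# Plain Hebbian rule: the update is purely local, using only
# the pre- and postsynaptic activity of this layer.
W_hebb = W + lr * np.outer(y, x)
```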
Training Convolutional Neural Networks With Hebbian Principal Component Analysis
TLDR
The explored HPCA variant is used to train Convolutional Neural Networks to extract relevant features from the CIFAR-10 image dataset; it further improves on previous results, motivating further interest in biologically plausible learning algorithms.
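As a hedged illustration of Hebbian PCA, the sketch below implements Sanger's rule (the Generalized Hebbian Algorithm), a standard Hebbian principal-component learner; whether this matches the exact HPCA variant used in the paper is an assumption:

```python
import numpy as np

def sanger_update(W, x, lr=1e-3):
    """One Hebbian PCA step: rows of W converge toward the leading
    principal components of the input distribution.
    Dw_ij = lr * y_i * (x_j - sum_{k<=i} y_k * W_kj)"""
    y = W @ x
    L = np.tril(np.ones((W.shape[0], W.shape[0])))  # lower-triangular mask
    W += lr * (np.outer(y, x) - (L * np.outer(y, y)) @ W)
    return W

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(2, 8))   # 2 components from 8-dim inputs
for _ in range(5000):
    # toy data with two dominant variance directions
    x = rng.normal(size=8) * np.array([3, 2, 1, 1, 1, 1, 1, 1])
    W = sanger_update(W, x)
```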
Convolutional Neural Networks with Hebbian-Based Rules in Online Transfer Learning
TLDR
A combined technique uses pre-trained convolutional layers with a final classification stage trained by Hebbian-based rules (Basic Hebb, Covariance, Oja, and BCM); the results suggest that this combined strategy may be useful for designing online machine learning algorithms for image classification.
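For reference, here are hedged single-unit sketches of the four local rules named above (the exact formulations and constants used in the paper may differ):

```python
import numpy as np

def hebb(w, x, y, lr=0.01):
    # basic Hebb: strengthen weights by the pre * post correlation
    return w + lr * y * x

def covariance(w, x, y, x_mean, y_mean, lr=0.01):
    # covariance rule: correlate deviations from mean activity
    return w + lr * (y - y_mean) * (x - x_mean)

def oja(w, x, y, lr=0.01):
    # Oja's rule: Hebb plus a decay term that keeps |w| bounded
    return w + lr * y * (x - y * w)

def bcm(w, x, y, theta, lr=0.01):
    # BCM: a sliding threshold theta (~ E[y^2]) switches LTD/LTP
    return w + lr * y * (y - theta) * x

w = np.ones(3) * 0.1
x = np.array([1.0, 0.5, -0.2])
w = oja(w, x, y=float(w @ x))
```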
Multi-layer Hebbian networks with modern deep learning frameworks
TLDR
It is hypothesized that more advanced techniques (dynamic stimuli, trace learning, feedback connections, etc.), together with the massive computational boost offered by modern deep learning frameworks, could greatly improve the performance and biological relevance of multi-layer Hebbian networks.
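One way to get that computational boost is to express the local update with the framework's own GPU primitives: convolution gives the postsynaptic maps, unfold gives the presynaptic patches, and the weight change is their correlation. The PyTorch sketch below assumes a plain-Hebb rule and illustrative shapes, not the specific scheme of the paper:

```python
import torch
import torch.nn.functional as F

x = torch.randn(16, 3, 32, 32)        # batch of images
W = torch.randn(8, 3, 5, 5) * 0.1     # 8 convolutional filters
lr = 1e-3

y = F.conv2d(x, W)                    # postsynaptic maps: (16, 8, 28, 28)
patches = F.unfold(x, kernel_size=5)  # presynaptic patches: (16, 75, 784)
y_flat = y.flatten(2)                 # (16, 8, 784)

# plain Hebb: correlate post and pre, averaged over the batch
dW = torch.einsum('bop,bip->oi', y_flat, patches) / y_flat.shape[0]
W += lr * dW.view_as(W)
```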
Neuromodulated Dopamine Plastic Networks for Heterogeneous Transfer Learning with Hebbian Principle
TLDR
The proposed NDHTL algorithm, in which synaptic plasticity is controlled by dopamine signals, can enhance transfer-learning efficiency compared to existing methods when classifying images across source and target datasets.
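A common way to write such neuromodulated ("three-factor") plasticity is to gate the local pre*post term with a scalar reward-like signal. The sketch below is an assumption about the general shape of the rule, not NDHTL's exact gating:

```python
import numpy as np

def dopamine_gated_update(W, x, y, dopamine, lr=1e-3):
    # Dw = lr * dopamine * y x^T : plasticity happens only where the
    # neuromodulatory signal marks the outcome as relevant
    return W + lr * dopamine * np.outer(y, x)

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(10, 64))  # classifier head on frozen features
x = rng.normal(size=64)                   # transferred feature vector
y = W @ x
reward = 1.0                              # e.g. +1 if correct, -1 if wrong
W = dopamine_gated_update(W, x, y, reward)
```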
Improvement of Heterogeneous Transfer Learning Efficiency by Using Hebbian Learning Principle
TLDR
Experimental results show that the proposed HTL algorithm can improve the performance of transfer learning, especially when the source and target datasets are heterogeneous.
Hebbian Semi-Supervised Learning in a Sample Efficiency Setting
TLDR
The results show that, in regimes where the number of available labeled samples is low, the semi-supervised approach outperforms full backpropagation in almost all the cases.
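A minimal runnable sketch of this recipe follows: an Oja-rule Hebbian layer learns features from all inputs without labels, then a readout is fit on a small labeled subset. The toy data, layer sizes, and least-squares readout are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))             # unlabeled pool
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # hidden binary labels

# Phase 1: unsupervised Hebbian feature learning (Oja's rule) on all data
W = rng.normal(scale=0.1, size=(5, 20))
for x in X:
    h = W @ x
    W += 1e-3 * (np.outer(h, x) - (h**2)[:, None] * W)

# Phase 2: supervised readout trained on only 20 labeled samples
idx = rng.choice(len(X), size=20, replace=False)
H = X[idx] @ W.T
w_out = np.linalg.lstsq(H, y[idx], rcond=None)[0]
acc = np.mean(((X @ W.T @ w_out) > 0.5) == y)
```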
Advances in Soft Computing: 19th Mexican International Conference on Artificial Intelligence, MICAI 2020, Mexico City, Mexico, October 12–17, 2020, Proceedings, Part I
TLDR
This work explores how an RNN based on Long Short-Term Memory (LSTM) units behaves in a classification problem when the sequences in the dataset are organized with different orders and lengths, showing that good accuracies can be achieved for different sequence configurations.
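For orientation, a minimal PyTorch sketch of such a setup: an LSTM classifier over padded, variable-length sequences (sizes and the padding/packing choice are illustrative assumptions):

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=32, batch_first=True)
head = nn.Linear(32, 3)                           # 3 classes

seqs = [torch.randn(L, 8) for L in (5, 12, 9)]    # different lengths
padded = nn.utils.rnn.pad_sequence(seqs, batch_first=True)
packed = nn.utils.rnn.pack_padded_sequence(
    padded, lengths=[5, 12, 9], batch_first=True, enforce_sorted=False)

_, (h_n, _) = lstm(packed)    # final hidden state per sequence
logits = head(h_n[-1])        # (batch, classes) = (3, 3)
```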
Improving Reader Motivation with Machine Learning
This thesis focuses on the problem of increasing reading motivation with machine learning (ML). The act of reading is central to modern human life, and there is much to be gained by improving the…
Hebbian learning with gradients: Hebbian convolutional neural networks with modern deep learning frameworks

References

Showing 1-10 of 23 references
Stable spike-timing dependent plasticity rule for multilayer unsupervised and supervised learning
TLDR
This work presents a low-cost, simplified, yet stable STDP rule for layer-wise unsupervised and supervised training of a multilayer feed-forward SNN. It proposes approximating a Bayesian neuron with a Stochastic Integrate-and-Fire (SIF) neuron model and introduces a supervised learning approach that uses teacher neurons to train the classification layer with one neuron per class.
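For orientation, here is a hedged sketch of a standard pair-based STDP update, where pre-before-post spike pairs potentiate and post-before-pre pairs depress through exponential timing windows; the paper's simplified, stable rule may differ in detail:

```python
import numpy as np

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt >= 0:                           # pre fired before post
        return a_plus * np.exp(-dt / tau)     # potentiation
    return -a_minus * np.exp(dt / tau)        # depression

print(stdp_dw(10.0, 15.0))   # positive: potentiation
print(stdp_dw(15.0, 10.0))   # negative: depression
```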
A Hebbian/Anti-Hebbian Neural Network for Linear Subspace Learning: A Derivation from Multidimensional Scaling of Streaming Data
TLDR
A biologically plausible network for subspace learning on streaming data is derived by minimizing a multidimensional scaling cost function adapted to streaming data, relying only on biologically plausible Hebbian and anti-Hebbian local learning rules.
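A sketch of the general Hebbian/anti-Hebbian circuit this family of derivations yields: feedforward weights learn from pre*post correlations, lateral weights learn anti-Hebbian from post*post correlations, and the output settles through recurrent dynamics. Step sizes, iteration counts, and the decay terms are assumptions here:

```python
import numpy as np

rng = np.random.default_rng(0)
k, d, lr = 3, 10, 1e-2
W = rng.normal(scale=0.1, size=(k, d))   # feedforward, Hebbian
M = np.zeros((k, k))                     # lateral inhibition, anti-Hebbian

for _ in range(2000):
    x = rng.normal(size=d)
    y = np.zeros(k)
    for _ in range(30):                  # settle the recurrent dynamics
        y = W @ x - M @ y
    W += lr * (np.outer(y, x) - W)       # Hebbian update with decay
    M += lr * (np.outer(y, y) - M)       # anti-Hebbian update with decay
    np.fill_diagonal(M, 0.0)             # no self-inhibition
```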
Learning Sparse, Distributed Representations using the Hebbian Principle
TLDR
This paper proposes an unsupervised algorithm, termed Adaptive Hebbian Learning (AHL), illustrates the distributed nature of the learned representations via output entropy computations on synthetic data, and demonstrates superior performance in training a deep convolutional net on standard image datasets.
ImageNet classification with deep convolutional neural networks
TLDR
A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.
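A minimal sketch of the dropout regularizer mentioned above: during training each activation is zeroed with probability p, discouraging co-adaptation of features, while inference uses the full network. The framework and sizes are illustrative:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
h = torch.randn(4, 8)
h_train = drop(h)    # ~half the entries zeroed, rest scaled by 1/(1-p)
drop.eval()
h_eval = drop(h)     # identity at inference time
```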
Online Representation Learning with Multi-layer Hebbian Networks for Image Classification Tasks
TLDR
This study introduces a novel multi-layer Hebbian network trained by a rule derived from a non-negative classical multidimensional scaling cost function, and compares its performance to that of other fully unsupervised learning algorithms.
ReSuMe-New Supervised Learning Method for Spiking Neural Networks
In this report I introduce ReSuMe, a new supervised learning method for Spiking Neural Networks. The research on ReSuMe has been primarily motivated by the need for an efficient learning…
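A heavily hedged sketch of the ReSuMe idea as commonly described: the weight change correlates input spikes with the difference between the desired and the actual output spike trains through an exponential learning window. The constants and the discretization below are assumptions, not the report's exact formulation:

```python
import numpy as np

def resume_dw(pre, desired, actual, a=0.01, A=0.05, tau=10.0, dt=1.0):
    """Total weight change over binary spike trains sampled at dt (ms)."""
    trace, dw = 0.0, 0.0
    for t in range(len(pre)):
        trace = trace * np.exp(-dt / tau) + pre[t]  # filtered input spikes
        err = desired[t] - actual[t]                # +1 missing, -1 extra
        dw += err * (a + A * trace)
    return dw

pre     = np.array([0, 1, 0, 0, 1, 0, 0, 0])
desired = np.array([0, 0, 1, 0, 0, 0, 1, 0])
actual  = np.array([0, 0, 0, 0, 0, 1, 0, 0])
print(resume_dw(pre, desired, actual))
```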
Learning Multiple Layers of Features from Tiny Images
TLDR
It is shown how to train a multi-layer generative model that learns to extract meaningful features which resemble those found in the human visual cortex, using a novel parallelization algorithm to distribute the work among multiple machines connected on a network.
Online Representation Learning with Single and Multi-layer Hebbian Networks for Image Classification
TLDR
A new class of Hebbian-like, local unsupervised learning rules for neural networks that minimize a similarity-matching cost function is developed and applied to both single- and multi-layer architectures, suggesting its validity for the design of a new class of compact, online learning networks.
Adaptive pattern classification and universal recoding: I. Parallel development and coding of neural feature detectors
  S. Grossberg, Biological Cybernetics, 2004
TLDR
It is shown how experience can retune feature detectors to respond to a prescribed convex set of spatial patterns, and a classification of adult feature detector properties in terms of a small number of functional principles is suggested.
Feature discovery by competitive learning
TLDR
This paper shows how a set of feature detectors that capture important aspects of the stimulus input patterns is discovered, and how these feature detectors form the basis of a multilayer system that learns categorizations of stimulus sets which are not linearly separable.
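A minimal sketch of such competitive ("winner-take-all") feature learning: only the unit whose weight vector best matches the input moves toward it. Normalization and the learning rate are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 16))                  # 4 competing feature detectors
W /= np.linalg.norm(W, axis=1, keepdims=True)

for _ in range(1000):
    x = rng.normal(size=16)
    x /= np.linalg.norm(x)
    winner = np.argmax(W @ x)                 # competition: best match wins
    W[winner] += 0.05 * (x - W[winner])       # move winner toward the input
    W[winner] /= np.linalg.norm(W[winner])    # keep weights normalized
```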