Hebbian Learning Meets Deep Convolutional Neural Networks

Giuseppe Amato, Fabio Carrara, F. Falchi, Claudio Gennaro, Gabriele Lagani. International Conference on Image Analysis and Processing.

Neural networks are said to be biologically inspired because they mimic the behavior of real neurons. However, several processes in state-of-the-art neural networks, including Deep Convolutional Neural Networks (DCNNs), differ markedly from those found in animal brains. One relevant difference is the training process: state-of-the-art artificial neural networks are trained with backpropagation and Stochastic Gradient Descent (SGD) optimization. However, studies in neuroscience…

Training Convolutional Neural Networks With Hebbian Principal Component Analysis

The HPCA variant explored here is used to train Convolutional Neural Networks to extract relevant features from the CIFAR-10 image dataset; it further improves on previous results, motivating further interest in biologically plausible learning algorithms.
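Hebbian PCA can be illustrated with Sanger's generalized Hebbian algorithm (GHA), a classic linear relative of the HPCA rule explored in the paper; the paper's exact variant differs, so this is only an orientation sketch in NumPy:

```python
import numpy as np

def gha_step(W, x, lr):
    """One update of Sanger's generalized Hebbian algorithm (GHA).

    W : (k, d) weight matrix, one row per extracted component.
    x : (d,) zero-mean input sample.
    The lower-triangular term deflates earlier components from
    later rows, so the rows converge to successive principal
    components of the input distribution.
    """
    y = W @ x
    W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# toy data whose variance is dominated by the first axis
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 3)) * np.array([3.0, 1.0, 0.3])
W = rng.normal(scale=0.1, size=(2, 3))
for x in X:
    gha_step(W, x, lr=0.005)
w0 = W[0] / np.linalg.norm(W[0])
print(np.round(np.abs(w0), 2))  # first row aligns with the top principal direction
```

The update is purely local per output row, which is what makes rules of this family candidates for training convolutional feature extractors without backpropagation.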

Comparing the performance of Hebbian against backpropagation learning using convolutional neural networks

The results suggest that Hebbian learning is generally suitable for training early feature-extraction layers, or for retraining higher network layers in fewer training epochs than backprop.

Convolutional Neural Networks with Hebbian-Based Rules in Online Transfer Learning

A combined technique that uses pre-trained convolutional layers followed by a final classification stage trained with Hebbian-based rules (Basic Hebb, Covariance, Oja, and BCM), suggesting that this combined strategy might be useful for designing online machine learning algorithms for image classification.
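The four rules named above have compact single-neuron forms. A minimal NumPy sketch, with illustrative (not the paper's) parameter values, assuming rate-based pre-activity x and post-activity y:

```python
import numpy as np

def hebb(w, x, y, lr):
    """Basic Hebb: strengthen weights when pre- and post-activity co-occur."""
    return w + lr * y * x

def covariance_rule(w, x, y, lr, x_mean, y_mean):
    """Covariance rule: correlate deviations from mean activity,
    allowing depression as well as potentiation."""
    return w + lr * (y - y_mean) * (x - x_mean)

def oja(w, x, y, lr):
    """Oja's rule: Hebb plus a decay term that keeps ||w|| bounded."""
    return w + lr * y * (x - y * w)

def bcm(w, x, y, lr, theta):
    """BCM rule: a sliding threshold theta separates potentiation
    (y > theta) from depression (y < theta)."""
    return w + lr * y * (y - theta) * x

# Oja's rule on zero-mean 2-D data converges to the top principal
# direction with roughly unit norm
rng = np.random.default_rng(1)
X = rng.normal(size=(4000, 2)) * np.array([2.0, 0.5])
w = np.array([0.3, 0.3])
for x in X:
    w = oja(w, x, w @ x, lr=0.01)
print(np.round(w, 2))
```

Basic Hebb is unbounded (weights grow without limit); the other three rules each add a stabilizing mechanism, which is why they are the usual choices for the classification stage described above.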

Hebbian learning with gradients: Hebbian convolutional neural networks with modern deep learning frameworks

It is hypothesized that more advanced techniques (dynamic stimuli, trace learning, feedback connections, etc.), together with the massive computational boost offered by modern deep learning frameworks, could greatly improve the performance and biological relevance of multi-layer Hebbian networks.

Neuro-Inspired Deep Neural Networks with Sparse, Strong Activations

A promising neuro-inspired approach to DNNs with sparser, stronger activations, which exhibits more robustness to noise and adversarial perturbations.

Neuromodulated Dopamine Plastic Networks for Heterogeneous Transfer Learning with Hebbian Principle

The proposed NDHTL algorithm, in which synaptic plasticity is controlled by dopamine signals, can enhance transfer-learning efficiency over existing methods when classifying images across source-target datasets.

Hebbian Deep Learning Without Feedback

SoftHebb shows, with a radically different approach from BP, that deep learning over a few layers may be plausible in the brain, and it increases the accuracy of bio-plausible machine learning.

Layerwise Hebbian/anti-Hebbian (HaH) Learning In Deep Networks: A Neuro-inspired Approach To Robustness

We propose a neuro-inspired approach for engineering robustness into deep neural networks (DNNs), in which end-to-end cost functions are supplemented with layer-wise costs promoting Hebbian ("fire together, wire together") updates…

FastHebb: Scaling Hebbian Training of Deep Neural Networks to ImageNet Level

FastHebb is presented, an efficient and scalable solution for Hebbian learning which achieves higher efficiency by merging update computation and aggregation over a batch of inputs, and by leveraging efficient matrix-multiplication algorithms on GPUs.

Stable spike-timing dependent plasticity rule for multilayer unsupervised and supervised learning

This work presents a low-cost, simplified, yet stable STDP rule for layer-wise unsupervised and supervised training of a multilayer feed-forward SNN. It proposes approximating a Bayesian neuron with a Stochastic Integrate-and-Fire (SIF) neuron model, and introduces a supervised learning approach that uses teacher neurons to train the classification layer with one neuron per class.
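The paper's simplified stable STDP rule is not reproduced here; as orientation, the standard pair-based exponential STDP window (with illustrative amplitudes and time constant) can be sketched as:

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP window: potentiate when the presynaptic spike
    precedes the postsynaptic one (dt = t_post - t_pre > 0, in ms),
    depress otherwise. Amplitudes and tau are illustrative values,
    not taken from the paper."""
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))

print(float(stdp_dw(5.0)), float(stdp_dw(-5.0)))  # positive, then negative
```

Stability variants such as the one in the paper typically modify this window or its weight dependence so that repeated updates do not drive weights to their bounds.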

A Hebbian/Anti-Hebbian Neural Network for Linear Subspace Learning: A Derivation from Multidimensional Scaling of Streaming Data

A biologically plausible network for subspace learning on streaming data is derived by minimizing a principled cost function, namely a multidimensional scaling cost function for streaming data, relying only on biologically plausible Hebbian and anti-Hebbian local learning rules.
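The scheme can be loosely sketched as an online network whose feedforward weights follow a Hebbian update and whose lateral weights follow an anti-Hebbian one. This is a simplification in the spirit of the derivation, with made-up learning rates, not the paper's exact algorithm:

```python
import numpy as np

def sm_step(W, M, x, lr=0.01):
    """One online step of a Hebbian/anti-Hebbian network.

    W : (k, d) feedforward weights, Hebbian update with decay.
    M : (k, k) lateral weights, anti-Hebbian update with decay;
        the diagonal acts as self-inhibition that normalizes variance.
    The output y is the fixed point of the lateral dynamics
    y = W x - M y, solved here in closed form.
    """
    k = W.shape[0]
    y = np.linalg.solve(np.eye(k) + M, W @ x)
    W += lr * (np.outer(y, x) - W)   # Hebbian feedforward update
    M += lr * (np.outer(y, y) - M)   # anti-Hebbian lateral update
    return y

# streaming samples with two dominant directions
rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 4)) * np.array([2.0, 1.5, 0.2, 0.1])
W = rng.normal(scale=0.1, size=(2, 4))
M = np.zeros((2, 2))
for x in X:
    sm_step(W, M, x)
print(np.round(W, 2))
```

Both updates use only quantities available at the synapse (the activities of the two neurons it connects), which is the locality property the paper derives from the multidimensional scaling objective.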

Learning Sparse, Distributed Representations using the Hebbian Principle

This paper proposes an unsupervised algorithm, termed Adaptive Hebbian Learning (AHL), and illustrates the distributed nature of the learned representations via output entropy computations for synthetic data, and demonstrates superior performance in training a deep convolutional net on standard image datasets.

ImageNet classification with deep convolutional neural networks

A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.

Online Representation Learning with Multi-layer Hebbian Networks for Image Classification Tasks

This study introduces a novel multi-layer Hebbian network trained by a rule derived from a non-negative classical multidimensional scaling cost function, and compares its performance to that of other fully unsupervised learning algorithms.

ReSuMe: New Supervised Learning Method for Spiking Neural Networks

A thorough analysis of the ReSuMe method reveals its suitability not only to the task of movement control, but also to other real-life applications including modeling, identification, and control of diverse non-stationary, nonlinear objects.

Learning Multiple Layers of Features from Tiny Images

It is shown how to train a multi-layer generative model that learns to extract meaningful features which resemble those found in the human visual cortex, using a novel parallelization algorithm to distribute the work among multiple machines connected on a network.

Online Representation Learning with Single and Multi-layer Hebbian Networks for Image Classification

A new class of Hebbian-like and local unsupervised learning rules for neural networks that minimize a similarity-matching cost function is developed and applied to both single- and multi-layer architectures, suggesting their validity in the design of a new class of compact, online learning networks.

Feature discovery by competitive learning

Self-organization of orientation sensitive cells in the striate cortex

A nerve net model for the visual cortex of higher vertebrates is presented. A simple learning procedure is shown to be sufficient for the organization of some essential functional properties of…