Hebbian Semi-Supervised Learning in a Sample Efficiency Setting

  • Authors: Gabriele Lagani, F. Falchi, Claudio Gennaro, Giuseppe Amato
  • Published: 16 March 2021
  • Fields: Computer Science, Medicine
  • Journal: Neural Networks: the official journal of the International Neural Network Society

We propose to address the issue of sample efficiency in Deep Convolutional Neural Networks (DCNNs) with a semi-supervised training strategy that combines Hebbian learning with gradient descent: all internal layers (both convolutional and fully connected) are pre-trained using an unsupervised approach based on Hebbian learning, while the last fully connected layer (the classification layer) is trained using Stochastic Gradient Descent (SGD). In fact, as Hebbian learning is an unsupervised…
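The two-stage strategy in the abstract can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: all names, layer sizes, and learning rates are assumptions, Oja's rule stands in as one concrete Hebbian rule, and the "internal layers" are reduced to a single linear feature layer followed by a softmax classifier trained with plain SGD.

```python
import numpy as np

rng = np.random.default_rng(0)

def oja_pretrain(X, n_features, lr=0.01, epochs=10):
    """Unsupervised Hebbian pre-training with Oja's rule:
    dw_i = lr * y_i * (x - y_i * w_i), which keeps ||w_i|| bounded."""
    W = rng.normal(scale=0.1, size=(n_features, X.shape[1]))
    for _ in range(epochs):
        for x in X:
            y = W @ x
            W += lr * (np.outer(y, x) - (y ** 2)[:, None] * W)
    return W

def sgd_train_classifier(H, labels, n_classes, lr=0.1, epochs=20):
    """Supervised SGD on the final softmax classification layer only."""
    V = np.zeros((n_classes, H.shape[1]))
    for _ in range(epochs):
        for h, t in zip(H, labels):
            s = V @ h
            p = np.exp(s - s.max())
            p /= p.sum()
            p[t] -= 1.0                 # gradient of cross-entropy w.r.t. s
            V -= lr * np.outer(p, h)
    return V

# Toy data standing in for images: two Gaussian blobs in 10 dimensions.
X = np.vstack([rng.normal(-1.0, 1.0, (50, 10)),
               rng.normal(+1.0, 1.0, (50, 10))])
y = np.array([0] * 50 + [1] * 50)

W = oja_pretrain(X, n_features=4)            # unsupervised stage (no labels)
H = np.maximum(X @ W.T, 0.0)                 # ReLU features from Hebbian layer
H = np.hstack([H, np.ones((len(H), 1))])     # constant bias input for classifier
V = sgd_train_classifier(H, y, n_classes=2)  # supervised stage (labels used here)
accuracy = ((V @ H.T).argmax(axis=0) == y).mean()
```

The key point mirrored from the abstract is the division of labor: labels are consumed only by `sgd_train_classifier`, so the bulk of the weights (`W`) can be trained on unlabeled data.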

Augmenting Supervised Neural Networks with Unsupervised Objectives for Large-scale Image Classification
This work demonstrates that the intermediate activations of pretrained large-scale classification networks preserve almost all the information of input images except a portion of local spatial details, and investigates joint supervised and unsupervised learning in a large-scale setting by augmenting existing neural networks with decoding pathways for reconstruction.
Convolutional Neural Networks with Hebbian-Based Rules in Online Transfer Learning
A combined technique is proposed that uses pre-trained convolutional layers and a final classification layer trained with Hebbian-based rules (Basic Hebb, Covariance, Oja, and BCM); results suggest that this combined strategy might be useful for designing online machine learning algorithms for image classification.
Improvement of Heterogeneous Transfer Learning Efficiency by Using Hebbian Learning Principle
Experimental results show that the proposed HTL algorithm can improve the performance of transfer learning, especially when the source and target datasets are heterogeneous.
Semi-supervised Learning with Ladder Networks
This work builds on top of the Ladder network proposed by Valpola, extending it by combining the model with supervision, and shows that the resulting model reaches state-of-the-art performance in semi-supervised MNIST and CIFAR-10 classification, in addition to permutation-invariant MNIST classification with all labels.
Bottom-up Deep Learning using the Hebbian Principle
The "fire together, wire together" Hebbian learning model is a central principle in neuroscience, but, surprisingly, it has found limited applicability in modern machine learning.
Hebbian Learning Meets Deep Convolutional Neural Networks
The use of the Hebbian learning rule when training deep neural networks for image classification is investigated, and a novel weight update rule for shared kernels in DCNNs is proposed.
ImageNet classification with deep convolutional neural networks
A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.
Deep Residual Learning for Image Recognition
This work presents a residual learning framework to ease the training of networks that are substantially deeper than those used previously, and provides comprehensive empirical evidence showing that these residual networks are easier to optimize, and can gain accuracy from considerably increased depth.
Transfer Learning for Image Classification Using Hebbian Plasticity Principles
Extensive experiments verify that HTL, which exploits synaptic plasticity in heterogeneous transfer learning tasks, outperforms standard state-of-the-art transfer learning methods on the cross-domain image classification task.
Online Representation Learning with Single and Multi-layer Hebbian Networks for Image Classification
A new class of Hebbian-like and local unsupervised learning rules for neural networks that minimise a similarity matching cost-function is developed and applied to both single and multi-layer architectures, suggesting its validity in the design of a new class of compact, online learning networks.