Locally Connected Spiking Neural Networks for Unsupervised Feature Learning

@article{Saunders2019LocallyCS,
  title={Locally Connected Spiking Neural Networks for Unsupervised Feature Learning},
  author={Daniel J. Saunders and Devdhar Patel and Hananel Hazan and Hava T. Siegelmann and Robert Thijs Kozma},
  journal={Neural networks : the official journal of the International Neural Network Society},
  year={2019},
  volume={119},
  pages={332--340}
}

Spiking Inception Module for Multi-layer Unsupervised Spiking Neural Networks
TLDR
The proposed Spiking Inception (Sp-Inception) module is trained through STDP-based competitive learning; it outperforms the baseline modules in learning capability, learning efficiency, and robustness, and reaches state-of-the-art results among existing unsupervised SNNs on the MNIST dataset.
BioLCNet: Reward-modulated Locally Connected Spiking Neural Networks
TLDR
This work proposes a reward-modulated locally connected spiking neural network, BioLCNet, for visual learning tasks and assesses the robustness of the rewarding mechanism to varying target responses in a classical conditioning experiment.
Quantized STDP-based online-learning spiking neural network
TLDR
A spike-timing-dependent plasticity (STDP)-based, weight-quantized/binarized online-learning spiking neural network (SNN) is reported, which uses bio-plausible integrate-and-fire neurons and conductance-based synapses as its basic building blocks and realizes online learning through STDP and a winner-take-all (WTA) mechanism.
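The STDP-plus-WTA training loop common to several of the papers summarized here can be sketched in a few lines. This is an illustrative toy, not any cited paper's actual implementation; all constants, array sizes, and names are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 16, 4                 # input channels -> competing output neurons
w = rng.uniform(0.0, 0.3, size=(n_in, n_out))
x_pre, x_post = np.zeros(n_in), np.zeros(n_out)   # exponentially decaying spike traces
tau = 20.0                          # trace time constant (ms), assumed
a_plus, a_minus = 0.01, 0.012       # potentiation / depression amplitudes, assumed
w_max = 1.0

def stdp_step(pre, post, dt=1.0):
    """One timestep of pair-based STDP with soft weight bounds."""
    global w
    x_pre[:] += -x_pre * dt / tau + pre
    x_post[:] += -x_post * dt / tau + post
    # post spike paired with recent pre activity -> potentiation
    w += a_plus * np.outer(x_pre, post) * (w_max - w)
    # pre spike paired with recent post activity -> depression
    w -= a_minus * np.outer(pre, x_post) * w
    np.clip(w, 0.0, w_max, out=w)

def winner_take_all(drive):
    """Hard WTA: only the most strongly driven neuron spikes."""
    post = np.zeros_like(drive)
    post[np.argmax(drive)] = 1.0
    return post

for _ in range(200):
    pre = (rng.random(n_in) < 0.1).astype(float)  # sparse random input spikes
    stdp_step(pre, winner_take_all(pre @ w))
# weights stay inside [0, w_max] thanks to the soft bounds and clipping
```

The soft bounds (scaling potentiation by `w_max - w` and depression by `w`) are one common way to keep weights in range without hard clipping doing all the work; the WTA step is what drives different output neurons to specialize on different inputs.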
SPA: Stochastic Probability Adjustment for System Balance of Unsupervised SNNs
TLDR
An information-theory-inspired system called Stochastic Probability Adjustment (SPA) is proposed, which maps the synapses and neurons of SNNs into a probability space where a neuron and all of its connected pre-synapses are represented by a cluster.
BioSNet: A Fast-Learning and High-Robustness Unsupervised Biomimetic Spiking Neural Network
TLDR
In BioSNet, a new biomimetic spiking neuron model named MRON, inspired by 'recognition memory' in the human brain, is proposed; an efficient, strong, and highly biologically plausible unsupervised SNN handles image classification tasks; and the traditional voting mechanism is extended to a Vote-for-All decoding layer to reduce information loss during decoding.
CSNNs: Unsupervised, Backpropagation-Free Convolutional Neural Networks for Representation Learning
TLDR
This work replaces the learning of traditional convolutional layers in CNNs with the competitive learning procedure of SOMs, and simultaneously learns local masks between those layers with separate Hebbian-like learning rules, to overcome the problem of disentangling factors of variation when filters are learned through clustering.
On the Self-Repair Role of Astrocytes in STDP Enabled Unsupervised SNNs
TLDR
The degree of self-repair that can be enabled in such networks is characterized for fault levels ranging from 50% to 90%, and the proposal is evaluated on the MNIST and Fashion-MNIST datasets.
SpikeDyn: A Framework for Energy-Efficient Spiking Neural Networks with Continual and Unsupervised Learning Capabilities in Dynamic Environments
TLDR
The proposed SpikeDyn is a comprehensive framework for energy-efficient SNNs with continual and unsupervised learning capabilities in dynamic environments, for both the training and inference phases, and reduces the energy consumption on average by 51% for training and by 37% for inference.
...

References

Showing 1-10 of 38 references
STDP-based spiking deep convolutional neural networks for object recognition
TLDR
The results suggest that the combination of STDP with latency coding may be a key to understanding the way that the primate visual system learns, its remarkable processing speed and its low energy consumption.
Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning
TLDR
This paper proposes a pre-training scheme using biologically plausible unsupervised learning, namely Spike-Timing-Dependent-Plasticity (STDP), in order to better initialize the parameters in multi-layer systems prior to supervised optimization.
Recurrent Spiking Neural Network Learning Based on a Competitive Maximization of Neuronal Activity
TLDR
The basic principle of the proposed algorithm, referred to as "Family-Engaged Execution and Learning of Induced Neuron Groups" (FEELING), is believed to be practically applicable to constructing much more complicated and diverse task-solving SNNs.
Deep Learning With Spiking Neurons: Opportunities and Challenges
TLDR
This review addresses the opportunities that deep spiking networks offer and investigates in detail the challenges associated with training SNNs in a way that makes them competitive with conventional deep learning, but simultaneously allows for efficient mapping to hardware.
Unsupervised regenerative learning of hierarchical features in Spiking Deep Networks for object recognition
P. Panda and K. Roy, 2016 International Joint Conference on Neural Networks (IJCNN), 2016
TLDR
A spike-based unsupervised regenerative learning scheme that trains Spiking Deep Networks (SpikeCNN) for object recognition using biologically realistic leaky integrate-and-fire neurons is presented, resulting in computationally efficient learning.
Deep Learning in Spiking Neural Networks
Unsupervised learning of digit recognition using spike-timing-dependent plasticity
TLDR
An SNN for digit recognition is presented that is based on mechanisms with increased biological plausibility: conductance-based instead of current-based synapses, spike-timing-dependent plasticity with time-dependent weight change, lateral inhibition, and an adaptive spiking threshold.
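Two of the mechanisms listed in this summary, the adaptive spiking threshold and lateral inhibition, can be sketched together in a toy leaky integrate-and-fire model. This sketch simplifies conductance-based synapses to direct current injection, and every constant below is an assumption rather than a value from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 8                              # excitatory neurons
v = np.zeros(n)                    # membrane potentials
theta = np.zeros(n)                # adaptive per-neuron threshold offsets
v_thresh, v_reset = 1.0, 0.0
tau_v, tau_theta = 20.0, 1e4       # membrane / threshold decay constants (ms), assumed
theta_inc = 0.05                   # threshold bump whenever a neuron spikes
inhibit = 0.5                      # lateral inhibition strength

def lif_step(current, dt=1.0):
    """Leaky integrate-and-fire step with homeostatic threshold adaptation
    and all-to-all lateral inhibition (synapses simplified to currents)."""
    global v, theta
    v += (-v + current) * dt / tau_v
    spikes = v > (v_thresh + theta)
    if spikes.any():
        v[spikes] = v_reset
        theta[spikes] += theta_inc     # spiking neurons become harder to fire
        v[~spikes] -= inhibit          # suppress the losers of the competition
    theta -= theta * dt / tau_theta    # slow homeostatic decay
    return spikes

n_spikes = sum(int(lif_step(rng.random(n) * 2.0).sum()) for _ in range(500))
```

The threshold adaptation keeps any single neuron from dominating (frequent spikers get harder to fire), while lateral inhibition forces the population into competition; together they encourage different neurons to specialize on different inputs.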
Unsupervised Learning with Self-Organizing Spiking Neural Networks
TLDR
A hybridization of self-organizing map properties with spiking neural networks that retains many of the features of SOMs is presented; with the optimal choice of parameters, this approach produces improvements over state-of-the-art spiking neural networks.
A Minimal Spiking Neural Network to Rapidly Train and Classify Handwritten Digits in Binary and 10-Digit Tasks
TLDR
The simulation results show that although the proposed SNN is trained quickly, without error feedback and within a small number of iterations, it achieves desirable performance on the binary classification task (0 and 1) and acceptable recognition accuracy on 10-digit classification in comparison with statistical methods such as the support vector machine (SVM) and the multi-perceptron neural network.
...