Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning

@article{Lee2018TrainingDS,
  title={Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning},
  author={Chankyu Lee and Priyadarshini Panda and Gopalakrishnan Srinivasan and Kaushik Roy},
  journal={Frontiers in Neuroscience},
  year={2018},
  volume={12}
}
Spiking Neural Networks (SNNs) are fast becoming a promising candidate for brain-inspired neuromorphic computing because of their inherent power efficiency and impressive inference accuracy across several cognitive tasks such as image classification and speech recognition. The recent efforts in SNNs have been focused on implementing deeper networks with multiple hidden layers to incorporate exponentially more difficult functional representations. In this paper, we propose a pre-training scheme… 
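For orientation, the unsupervised pre-training stage relies on spike-timing-dependent plasticity (STDP). The sketch below is a generic pair-based STDP update with exponential spike traces, not the paper's exact learning rule; all constants are assumed values.

```python
import numpy as np

# Minimal pair-based STDP sketch (illustrative only; not the exact rule
# used in the paper). Synapses potentiate when a presynaptic spike
# precedes a postsynaptic one, and depress in the reverse order,
# tracked via exponentially decaying spike traces.

A_PLUS, A_MINUS = 0.01, 0.012   # learning rates (assumed values)
TAU_PRE, TAU_POST = 20.0, 20.0  # trace time constants in ms (assumed)
DT = 1.0                        # simulation time step in ms

def stdp_step(w, pre_spikes, post_spikes, pre_trace, post_trace):
    """One discrete-time STDP update for a weight matrix w
    (shape: post x pre). Spike vectors are 0/1 arrays."""
    # Decay the traces, then bump them where spikes occurred.
    pre_trace = pre_trace * np.exp(-DT / TAU_PRE) + pre_spikes
    post_trace = post_trace * np.exp(-DT / TAU_POST) + post_spikes
    # Potentiation: a postsynaptic spike reads the presynaptic trace.
    w += A_PLUS * np.outer(post_spikes, pre_trace)
    # Depression: a presynaptic spike reads the postsynaptic trace.
    w -= A_MINUS * np.outer(post_trace, pre_spikes)
    return np.clip(w, 0.0, 1.0), pre_trace, post_trace
```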
ReStoCNet: Residual Stochastic Binary Convolutional Spiking Neural Network for Memory-Efficient Neuromorphic Computing
TLDR
ReStoCNet, a residual stochastic multilayer convolutional Spiking Neural Network composed of binary kernels, is proposed to reduce the synaptic memory footprint and enhance the computational efficiency of SNNs for complex pattern recognition tasks.
STDP-based Unsupervised Feature Learning using Convolution-over-time in Spiking Neural Networks for Energy-Efficient Neuromorphic Computing
TLDR
This work proposes Spike-Timing-Dependent-Plasticity (STDP)-based unsupervised feature learning using convolution-over-time in Spiking Neural Networks (SNNs); shared weight kernels are convolved with the input patterns over time to encode representative input features, improving both the sparsity and the robustness of the learning model.
Training Deep Convolutional Spiking Neural Networks With Spike Probabilistic Global Pooling
TLDR
This work presents the spike probabilistic global pooling (SPGP) method, which mitigates the difficulty of training the large number of parameters introduced by multiple layers; this reduces the risk of overfitting and yields better performance for deep SNNs (DSNNs).
Training Energy-Efficient Deep Spiking Neural Networks with Single-Spike Hybrid Input Encoding
TLDR
This paper presents a training framework for low-latency, energy-efficient SNNs that uses a hybrid encoding scheme at the input layer, in which the analog pixel values of an image are applied directly during the first timestep and a novel variant of spike temporal coding is used during subsequent timesteps.
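As a rough illustration of such a hybrid scheme (a hypothetical sketch, not the paper's code): analog pixel intensities drive the network directly on the first timestep, while later timesteps carry a simple time-to-first-spike code in which brighter pixels fire earlier.

```python
import numpy as np

# Hypothetical hybrid input encoding (illustrative only). Timestep 0
# feeds the analog pixel values directly as input current; subsequent
# timesteps emit one spike per pixel at a latency inversely related to
# intensity (a simple time-to-first-spike code).

def hybrid_encode(image, num_steps):
    """image: float array in [0, 1]; returns (num_steps, *image.shape)."""
    frames = np.zeros((num_steps,) + image.shape, dtype=np.float32)
    frames[0] = image  # direct analog input on the first timestep
    # Brighter pixels spike earlier within the remaining timesteps.
    latency = np.ceil((1.0 - image) * (num_steps - 2)).astype(int) + 1
    for t in range(1, num_steps):
        frames[t] = (latency == t).astype(np.float32)
    return frames
```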
SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training
TLDR
A new learning algorithm, called SSTDP, bridges the gap between backpropagation (BP)-based learning and spike-timing-dependent plasticity (STDP)-based learning to train SNNs efficiently; it lowers the possibility of vanishing spikes during BP training and reduces the number of time steps, thereby reducing network latency.
An Unsupervised Spiking Neural Network Inspired By Biologically Plausible Learning Rules and Connections
TLDR
An adaptive synaptic plasticity rule is designed, and adaptive threshold balancing is introduced as a form of neuronal plasticity to enrich the representational ability of SNNs; an adaptive lateral inhibitory connection is also introduced to dynamically adjust the spike balance and help the network learn richer features.
Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures
TLDR
This work proposes an approximate derivative method that accounts for the leaky behavior of LIF neurons, enabling deep convolutional SNNs to be trained directly (with input spike events) using spike-based backpropagation; it also analyzes sparse event-based computations to demonstrate the efficacy of the proposed training method for inference in the spiking domain.
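The approximate-derivative idea can be pictured as a surrogate gradient: the forward pass keeps the hard spike threshold, while the backward pass substitutes a smooth pseudo-derivative of the membrane potential around the threshold. The triangular window below is an assumed shape for illustration, not the exact function derived in the paper.

```python
import torch

# Illustrative surrogate-gradient spike function (not the paper's exact
# approximate derivative). Forward: hard threshold on the membrane
# potential. Backward: a triangular pseudo-derivative centered on the
# firing threshold, so gradients can flow through spike events.

THRESHOLD = 1.0  # firing threshold (assumed)

class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= THRESHOLD).float()  # spike where v crosses threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Triangular window: maximal at v == THRESHOLD, zero beyond +/- 1.
        surrogate = torch.clamp(1.0 - torch.abs(v - THRESHOLD), min=0.0)
        return grad_output * surrogate

spike_fn = SurrogateSpike.apply  # usable inside a LIF forward pass
```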
Biologically Plausible Learning with Spiking Neural Networks (original title: Biologisch Plausibles Lernen mit Spiking Neural Networks)
TLDR
The investigation of this alternative network architecture, based on the principle of Spike-Timing-Dependent Plasticity together with STDP-based learning rules, presents an opportunity to derive new insights about the robustness and energy efficiency of neural network implementations, both in specialized hardware and in general.
Exploring Optimized Spiking Neural Network Architectures for Classification Tasks on Embedded Platforms
TLDR
This work refines the SNN-based dropout technique with surrogate gradient descent and proposes customized VGG- and ResNet-style architectures for training deep convolutional spiking neural networks, with layer structures similar to those of deep artificial neural networks.

References

Showing 1-10 of 55 references
Deep Spiking Convolutional Neural Network Trained With Unsupervised Spike-Timing-Dependent Plasticity
TLDR
A deep SpiCNN, consisting of two convolutional layers trained using the unsupervised Convolutional STDP learning methodology, achieved classification accuracies of 91.1% and 97.6%, respectively, for inferring handwritten digits from the MNIST data set and a subset of natural images from the Caltech data set.
Multi-layer unsupervised learning in a spiking convolutional neural network
  • A. Tavanaei, A. Maida
  • Computer Science
    2017 International Joint Conference on Neural Networks (IJCNN)
  • 2017
TLDR
This paper explores a novel, bio-inspired spiking convolutional neural network (CNN) that is trained in a greedy, layer-wise fashion, enabling it to support a multi-layer learning architecture.
Bio-Inspired Spiking Convolutional Neural Network using Layer-wise Sparse Coding and STDP Learning
TLDR
A novel bio-inspired spiking CNN is trained in a greedy, layer-wise fashion; experimental results show that the convolutional layer is stack-admissible, enabling it to support multi-layer learning.
Unsupervised regenerative learning of hierarchical features in Spiking Deep Networks for object recognition
  • P. Panda, K. Roy
  • Computer Science
    2016 International Joint Conference on Neural Networks (IJCNN)
  • 2016
TLDR
A spike-based unsupervised regenerative learning scheme is presented for training Spiking Deep Networks (SpikeCNN) on object recognition problems, using biologically realistic leaky integrate-and-fire neurons and resulting in computationally efficient learning.
Convolutional Spike Timing Dependent Plasticity based Feature Learning in Spiking Neural Networks
TLDR
This work presents convolutional spike-timing-dependent-plasticity-based feature learning with biologically plausible leaky integrate-and-fire neurons in Spiking Neural Networks (SNNs); shared weight kernels are trained to encode representative features underlying the input patterns, improving both the sparsity and the robustness of the learning model.
Training Deep Spiking Neural Networks Using Backpropagation
TLDR
A novel technique is introduced that treats the membrane potentials of spiking neurons as differentiable signals, with discontinuities at spike times treated as noise; this enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks but works directly on spike signals and membrane potentials.
Spiking Deep Convolutional Neural Networks for Energy-Efficient Object Recognition
TLDR
A novel approach is presented for converting a deep CNN into an SNN, enabling the mapping of CNNs to spike-based hardware architectures; the resulting SNN is evaluated on the publicly available Defense Advanced Research Projects Agency (DARPA) Neovision2 Tower and CIFAR-10 datasets and shows object recognition accuracy similar to that of the original CNN.
Spiking Deep Networks with LIF Neurons
TLDR
This work demonstrates that biologically plausible spiking LIF neurons integrated into deep networks can perform as well as other spiking models (e.g., integrate-and-fire), and it provides new methods for training deep networks to run on neuromorphic hardware.
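For context, the leaky integrate-and-fire (LIF) model referenced throughout these works reduces to a few lines in discrete time; the constants below are illustrative.

```python
import numpy as np

# Discrete-time leaky integrate-and-fire (LIF) update. Constants are
# illustrative, not taken from any of the referenced papers.

V_THRESH, V_RESET = 1.0, 0.0  # threshold and reset potential (assumed)
LEAK = 0.9                    # membrane leak factor per timestep (assumed)

def lif_step(v, input_current):
    """Advance membrane potentials v by one timestep; returns (v, spikes)."""
    v = LEAK * v + input_current          # leaky integration
    spikes = (v >= V_THRESH).astype(float)
    v = np.where(spikes > 0, V_RESET, v)  # reset neurons that fired
    return v, spikes
```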
Supervised Learning Based on Temporal Coding in Spiking Neural Networks
  • H. Mostafa
  • Computer Science
    IEEE Transactions on Neural Networks and Learning Systems
  • 2018
TLDR
This work shows that in a feedforward spiking network using a temporal coding scheme, where information is encoded in spike times instead of spike rates, the network input–output relation is differentiable almost everywhere and becomes piecewise linear after a transformation of variables.
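Concretely (with the firing threshold and synaptic time constant normalized to 1, and up to notation), substituting z_i = exp(t_i) for each input spike time t_i in the causal set C yields a linear relation for the output spike time:

```latex
% Output spike time in the transformed z-domain (z = e^t), for a
% non-leaky integrate-and-fire neuron with exponentially decaying
% synaptic current; C is the set of input spikes arriving before
% the output spike (threshold and time constant normalized to 1).
\[
  z_{\text{out}} \;=\; \frac{\sum_{i \in C} w_i\, z_i}{\sum_{i \in C} w_i \;-\; 1},
  \qquad z_i = e^{t_i}.
\]
```

Because the causal set C changes only at discrete boundaries, the relation is piecewise linear in the z variables, which is what makes gradient-based training tractable in this setting.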