Training Energy-Efficient Deep Spiking Neural Networks with Single-Spike Hybrid Input Encoding

@inproceedings{datta2021training,
  title={Training Energy-Efficient Deep Spiking Neural Networks with Single-Spike Hybrid Input Encoding},
  author={Gourav Datta and Souvik Kundu and Peter A. Beerel},
  booktitle={2021 International Joint Conference on Neural Networks (IJCNN)},
  year={2021}
}
Spiking Neural Networks (SNNs) have emerged as an attractive alternative to traditional deep learning frameworks, since they provide higher computational efficiency on event-driven neuromorphic hardware. However, state-of-the-art (SOTA) SNNs suffer from high inference latency, resulting from inefficient input encoding and training techniques. The most widely used input coding schemes, such as Poisson-based rate coding, do not leverage the temporal learning capabilities of SNNs. This paper…
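The distinction the abstract draws can be made concrete with a minimal NumPy sketch (function names are illustrative, not the paper's code): Poisson rate coding emits many spikes whose *count* encodes intensity, while single-spike time-to-first-spike coding, the temporal style the paper's hybrid scheme builds on, emits exactly one spike whose *timing* encodes intensity.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_rate_encode(pixels, num_steps):
    """Rate coding: at each time step, a neuron fires with probability
    equal to its normalized pixel intensity, so spike COUNT carries
    the value. Many spikes per neuron -> more energy, higher latency."""
    pixels = np.asarray(pixels, dtype=float)  # intensities in [0, 1]
    # Spike train of shape (num_steps, num_pixels).
    return (rng.random((num_steps, pixels.size)) < pixels).astype(np.uint8)

def time_to_first_spike_encode(pixels, num_steps):
    """Temporal coding: each neuron fires exactly once; brighter pixels
    fire EARLIER, so spike TIME carries the value. One spike per neuron
    -> far fewer synaptic operations."""
    pixels = np.asarray(pixels, dtype=float)
    spike_times = np.round((1.0 - pixels) * (num_steps - 1)).astype(int)
    train = np.zeros((num_steps, pixels.size), dtype=np.uint8)
    train[spike_times, np.arange(pixels.size)] = 1
    return train

pixels = [0.9, 0.5, 0.1]
rate_train = poisson_rate_encode(pixels, num_steps=100)
ttfs_train = time_to_first_spike_encode(pixels, num_steps=100)
print(rate_train.sum(axis=0))  # spike counts roughly track intensity
print(ttfs_train.sum(axis=0))  # exactly one spike per neuron
```

Note the trade-off this sketch illustrates: the rate code needs many time steps for an accurate count estimate (hence the latency criticism in the abstract), whereas the single-spike code conveys the same ordering information with one spike per input neuron.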


HYPER-SNN: Towards Energy-efficient Quantized Deep Spiking Neural Networks for Hyperspectral Image Classification
This work proposes the use of Spiking Neural Networks (SNNs) that are generated from iso-architecture CNNs and trained with quantization-aware gradient descent to optimize their weights, membrane leak, and firing thresholds.


Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation
The proposed training methodology converges in less than 20 epochs of spike-based backpropagation for most standard image classification datasets, thereby greatly reducing the training complexity compared to training SNNs from scratch.
Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures
This work proposes an approximate derivative method that accounts for the leaky behavior of LIF neurons, enabling deep convolutional SNNs to be trained directly (with input spike events) using spike-based backpropagation, and analyzes sparse event-based computations to demonstrate the efficacy of the proposed training method for inference in the spiking domain.
Training Deep Spiking Neural Networks Using Backpropagation
A novel technique is introduced that treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise; this enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks but works directly on spike signals and membrane potentials.
Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning
This paper proposes a pre-training scheme using biologically plausible unsupervised learning, namely Spike-Timing-Dependent Plasticity (STDP), in order to better initialize the parameters in multi-layer systems prior to supervised optimization.
Deep Learning With Spiking Neurons: Opportunities and Challenges
This review addresses the opportunities that deep spiking networks offer and investigates in detail the challenges associated with training SNNs in a way that makes them competitive with conventional deep learning while simultaneously allowing for efficient mapping to hardware.
Direct Training for Spiking Neural Networks: Faster, Larger, Better
This work proposes a neuron normalization technique to adjust neural selectivity, develops a direct learning algorithm for deep SNNs, and presents a PyTorch-based implementation method for training large-scale SNNs.
Spike-Thrift: Towards Energy-Efficient Deep Spiking Neural Networks by Limiting Spiking Activity via Attention-Guided Compression
This paper proposes a novel two-step SNN compression technique that reduces spiking activity while maintaining accuracy: it compresses specifically-designed artificial neural networks (ANNs) that are then converted into the target SNNs.
Deep Learning in Spiking Neural Networks
The emerging picture is that SNNs still lag behind ANNs in terms of accuracy, but the gap is decreasing and can even vanish on some tasks, while SNNs typically require many fewer operations and are the better candidates to process spatio-temporal data.
T2FSNN: Deep Spiking Neural Networks with Time-to-first-spike Coding
T2FSNN is presented, which introduces the concept of time-to-first-spike coding into deep SNNs using a kernel-based dynamic threshold and dendrite to overcome the aforementioned drawback, and proposes gradient-based optimization and early-firing methods to further increase the efficiency of T2FSNN.
Training Deep Spiking Neural Networks
This work directly trains deep SNNs using backpropagation with a surrogate gradient, finds that due to the implicitly recurrent nature of feed-forward SNNs the exploding or vanishing gradient problem severely hinders their training, and shows that this problem can be solved by tuning the surrogate gradient function.