Deep Learning With Spiking Neurons: Opportunities and Challenges

@article{Pfeiffer2018DeepLW,
  title={Deep Learning With Spiking Neurons: Opportunities and Challenges},
  author={Michael Pfeiffer and Thomas Pfeil},
  journal={Frontiers in Neuroscience},
  year={2018},
  volume={12}
}
Spiking neural networks (SNNs) are inspired by information processing in biology, where sparse and asynchronous binary signals are communicated and processed in a massively parallel fashion. SNNs on neuromorphic hardware exhibit favorable properties such as low power consumption, fast inference, and event-driven information processing. This makes them interesting candidates for the efficient implementation of deep neural networks, the method of choice for many machine learning tasks. In this… 
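To make the event-driven picture concrete, here is a minimal sketch of the leaky integrate-and-fire (LIF) dynamics underlying most SNNs discussed below; all constants are illustrative choices, not values from the paper.

```python
import numpy as np

def simulate_lif(input_current, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """Simulate one LIF neuron and return its binary spike train."""
    v = 0.0
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        # Leaky integration: membrane decays toward rest while absorbing input.
        v += (dt / tau) * (-v + i_t)
        if v >= v_th:        # threshold crossing emits a binary spike event
            spikes[t] = 1.0
            v = v_reset      # hard reset after the spike
    return spikes

rng = np.random.default_rng(0)
spikes = simulate_lif(rng.uniform(0.0, 2.0, size=100))
print(int(spikes.sum()), "spikes in 100 steps")  # sparse, binary output
```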

A Tandem Learning Rule for Efficient and Rapid Inference on Deep Spiking Neural Networks.
TLDR
Considers the spike count as the discrete neural representation and designs an ANN activation function that effectively approximates the spike count of the coupled SNN, within a tandem learning framework in which an SNN and an artificial neural network share weights.
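A minimal sketch of the spike-count approximation at the heart of this idea, assuming a non-leaky integrate-and-fire neuron driven by a constant input for T time steps; the function name and constants are illustrative, not from the paper.

```python
import numpy as np

def spike_count_activation(x, T=10, v_th=1.0):
    """ANN activation approximating the spike count of an IF neuron.

    A non-leaky integrate-and-fire neuron driven by a constant input x for
    T steps fires roughly floor(x * T / v_th) times, clipped to [0, T].
    """
    return np.clip(np.floor(x * T / v_th), 0.0, T)

# The ANN uses this activation during training while the coupled SNN
# (sharing the same weights) produces the actual spike trains.
print(spike_count_activation(np.array([-0.3, 0.05, 0.5, 2.0])))  # counts: 0, 0, 5, 10
```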
A Hybrid Learning Rule for Efficient and Rapid Inference with Spiking Neural Networks
TLDR
A novel learning rule is proposed based on a hybrid neural network with shared weights, wherein a rate-based SNN is used during forward propagation to determine precise spike counts and spike trains, and an equivalent ANN is used during error backpropagation to approximate the gradients for the coupled SNN.
Training Energy-Efficient Deep Spiking Neural Networks with Single-Spike Hybrid Input Encoding
TLDR
This paper presents a training framework for low-latency, energy-efficient SNNs that uses a hybrid encoding scheme at the input layer, in which the analog pixel values of an image are directly applied during the first timestep and a novel variant of spike temporal coding is used during subsequent timesteps.
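A rough sketch of such a hybrid input encoding, with an assumed linear intensity-to-latency rule for the temporal part; the exact coding variant in the paper may differ.

```python
import numpy as np

def hybrid_encode(pixels, T=5):
    """Encode an image over T timesteps: analog values at t=0, spikes after.

    Timestep 0 carries the raw analog pixel intensities (direct encoding);
    the remaining T-1 steps carry an intensity-to-latency spike code in
    which brighter pixels spike earlier. The latency rule is illustrative.
    """
    pixels = np.asarray(pixels, dtype=float)            # values in [0, 1]
    frames = np.zeros((T,) + pixels.shape)
    frames[0] = pixels                                  # analog first step
    # Map intensity to a spike time in {1, ..., T-1}: high intensity -> early.
    spike_t = 1 + np.floor((1.0 - pixels) * (T - 2)).astype(int)
    spike_t = np.clip(spike_t, 1, T - 1)
    for t in range(1, T):
        frames[t] = (spike_t == t).astype(float)        # one spike per pixel
    return frames

print(hybrid_encode(np.array([0.1, 0.5, 0.9]), T=5))
```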
Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks
TLDR
A novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition, referred to as progressive tandem learning of deep SNNs, which opens up a myriad of opportunities for pervasive mobile and embedded devices with a limited power budget.
Temporal-Coded Deep Spiking Neural Network with Easy Training and Robust Performance
TLDR
It is shown that a deep temporal-coded SNN can be trained easily and directly on the benchmark datasets CIFAR-10 and ImageNet, with testing accuracy within 1% of that of a DNN of equivalent size and architecture.
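Temporal coding of the kind referenced here maps each analog input to a single spike time; below is a minimal time-to-first-spike sketch with an assumed linear mapping, not the paper's specific scheme.

```python
import numpy as np

def ttfs_encode(x, T=10):
    """Time-to-first-spike code: larger values spike earlier.

    Each input in [0, 1] maps to a single spike time in {0, ..., T-1};
    the linear mapping is an illustrative choice.
    """
    return np.round((1.0 - np.asarray(x, dtype=float)) * (T - 1)).astype(int)

print(ttfs_encode([0.0, 0.25, 0.9, 1.0]))  # [9 7 1 0]
```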
Can Deep Neural Networks be Converted to Ultra Low-Latency Spiking Neural Networks?
  • G. Datta, P. Beerel
  • 2022 Design, Automation & Test in Europe Conference & Exhibition (DATE)
TLDR
It is determined that SOTA conversion strategies cannot yield ultra-low latency because they incorrectly assume that the DNN and SNN pre-activation values are uniformly distributed, and a new training algorithm is proposed that accurately captures these distributions, minimizing the error between the DNN and the converted SNN.
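Conversion pipelines of this kind typically calibrate per-layer thresholds from observed pre-activation statistics; the percentile heuristic below is a generic illustration of that step, not the authors' proposed algorithm.

```python
import numpy as np

def calibrate_threshold(pre_activations, percentile=99.7):
    """Pick a firing threshold from the observed pre-activation distribution.

    Using a high percentile instead of the maximum makes the threshold
    robust to outliers; the 99.7 value is an illustrative choice.
    """
    return np.percentile(pre_activations, percentile)

# Pre-activations gathered by running the trained DNN on calibration data;
# note the distribution is skewed, not uniform.
rng = np.random.default_rng(1)
samples = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)
print(f"max={samples.max():.2f}  p99.7={calibrate_threshold(samples):.2f}")
```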
Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization
TLDR
Novel algorithmic techniques that modify the SNN configuration with backward residual connections, stochastic softmax, and hybrid artificial-and-spiking neuronal activations improve the learning ability of the training methodologies, yielding competitive accuracy along with large efficiency gains over their artificial counterparts.
Scaling Deep Spiking Neural Networks with Binary Stochastic Activations
TLDR
This work presents scalable deep spiking neural networks that achieve performance comparable to DNNs with substantial energy benefits, and investigates an extremely quantized version of these networks with binary weights, showing an energy benefit of 28x over full-precision neural networks.
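Binary stochastic activations are usually trained by sampling in the forward pass and passing gradients straight through in the backward pass; the sketch below illustrates that generic trick, not the paper's exact scheme.

```python
import numpy as np

def binary_stochastic_forward(p, rng):
    """Sample a binary activation with firing probability p in [0, 1]."""
    return (rng.random(p.shape) < p).astype(float)

def binary_stochastic_backward(grad_out):
    """Straight-through estimator: treat the sampler as identity.

    Passing the gradient through the stochastic binarization unchanged is
    the standard way to train such units with backpropagation.
    """
    return grad_out

rng = np.random.default_rng(3)
print(binary_stochastic_forward(np.array([0.1, 0.5, 0.9]), rng))
```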
Deep Spiking Neural Network with Spike Count based Learning Rule
TLDR
A novel spike-based learning rule for rate-coded deep SNNs is introduced, whereby the spike count of each neuron is used as a surrogate for gradient backpropagation, allowing direct deployment to neuromorphic hardware and supporting efficient inference.

References

Showing 1-10 of 227 references
Deep Learning in Spiking Neural Networks
Training Deep Spiking Neural Networks Using Backpropagation
TLDR
A novel technique is introduced that treats the membrane potentials of spiking neurons as differentiable signals, with discontinuities at spike times considered as noise, enabling an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks but works directly on spike signals and membrane potentials.
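One common way to realize this idea is a surrogate gradient: keep the hard threshold in the forward pass and replace its derivative with a smooth function of the membrane potential. The sigmoid shape and constants below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def spike_forward(v, v_th=1.0):
    """Hard threshold: the non-differentiable spike generation."""
    return (v >= v_th).astype(float)

def spike_surrogate_grad(v, v_th=1.0, beta=5.0):
    """Treat the membrane potential as a differentiable signal.

    The step's true derivative (a Dirac impulse at threshold) is replaced
    by the derivative of a sigmoid centered on v_th; beta sets sharpness.
    """
    s = 1.0 / (1.0 + np.exp(-beta * (v - v_th)))
    return beta * s * (1.0 - s)

v = np.linspace(0.0, 2.0, 5)
print(spike_forward(v))          # [0. 0. 1. 1. 1.]
print(spike_surrogate_grad(v))   # peaked near v_th, smooth elsewhere
```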
Conversion of artificial recurrent neural networks to spiking neural networks for low-power neuromorphic hardware
TLDR
Surprisingly, it is found that short synaptic delays are sufficient to implement the dynamic (temporal) aspect of the RNN in the question classification task, and that the discretization of the neural activities is beneficial to the train-and-constrain approach.
Going Deeper in Spiking Neural Networks: VGG and Residual Architectures
TLDR
A novel algorithmic technique is proposed for generating a deep-architecture SNN with significantly better accuracy than the state of the art, and its effectiveness is demonstrated on complex visual recognition problems such as CIFAR-10 and ImageNet.
Deep Spiking Convolutional Neural Network Trained With Unsupervised Spike-Timing-Dependent Plasticity
TLDR
A deep SpiCNN, consisting of two convolutional layers trained using the unsupervised convolutional STDP learning methodology, achieved classification accuracies of 91.1% and 97.6% for inferring handwritten digits from the MNIST dataset and a subset of natural images from the Caltech dataset, respectively.
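For reference, the classic pair-based STDP window that such unsupervised rules build on can be sketched as follows; the constants are illustrative, not the paper's.

```python
import numpy as np

def stdp_update(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change for a pre/post spike-time difference.

    delta_t = t_post - t_pre. Causal pairs (pre fires before post)
    potentiate the synapse; anti-causal pairs depress it, with an
    exponential dependence on the timing difference.
    """
    return np.where(delta_t >= 0,
                    a_plus * np.exp(-delta_t / tau),
                    -a_minus * np.exp(delta_t / tau))

dts = np.array([-40.0, -10.0, 5.0, 30.0])
print(stdp_update(dts))   # depression for dt < 0, potentiation for dt >= 0
```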
Event-driven contrastive divergence for spiking neuromorphic systems
TLDR
This work presents an event-driven variation of CD to train an RBM constructed with integrate-and-fire neurons, constrained by the limitations of existing and near-future neuromorphic hardware platforms, and contributes to a machine learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality.
Learning to be efficient: algorithms for training low-latency, low-compute deep spiking neural networks
TLDR
The results suggest that SNNs can be optimized to dramatically decrease the latency as well as the computation requirements of deep neural networks, making them particularly attractive for applications like robotics, where real-time restrictions on producing outputs and low energy budgets are common.
Spiking Deep Residual Network
TLDR
This work is the first to build an SNN deeper than 40 layers with performance comparable to ANNs on a large-scale dataset, and proposes a shortcut conversion model to appropriately scale continuous-valued activations to match firing rates in the SNN.
Spiking Deep Networks with LIF Neurons
TLDR
This work demonstrates that biologically plausible spiking LIF neurons can be integrated into deep networks that perform as well as those using other spiking models (e.g., integrate-and-fire), and provides new methods for training deep networks to run on neuromorphic hardware.
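The smoothing used in this line of work ("soft LIF") makes the LIF steady-state rate curve differentiable near threshold by replacing the hard rectification with a softplus; the sketch below captures that idea with illustrative constants.

```python
import numpy as np

def soft_lif_rate(j, tau_rc=0.02, tau_ref=0.002, gamma=0.02):
    """Smoothed LIF steady-state firing rate (Hz) for normalized input j.

    The hard max(j - 1, 0) in the standard LIF rate curve is replaced by
    a softplus, removing the derivative discontinuity at threshold so the
    neuron can be used with gradient-based training.
    """
    # softplus(j - 1): smooth version of the above-threshold current
    z = gamma * np.log1p(np.exp((j - 1.0) / gamma))
    return 1.0 / (tau_ref + tau_rc * np.log1p(1.0 / z))

print(soft_lif_rate(np.array([0.5, 1.0, 1.5, 3.0])))  # rises smoothly with j
```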
Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing
TLDR
The method for converting an ANN into an SNN enables low-latency classification with high accuracy already after the first output spike, and compared with previous SNN approaches it yields improved performance without increased training time.
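A generic sketch of the weight-balancing step: rescale each layer by the largest activation observed on calibration data so that no converted neuron needs a firing rate above its maximum; this illustrates the balancing idea rather than reproducing the paper's exact procedure.

```python
import numpy as np

def normalize_weights(weights, activations):
    """Data-based weight normalization for ANN-to-SNN conversion.

    Each layer's weights are divided by the maximum activation that layer
    produces on a calibration set (and multiplied by the previous layer's
    scale, to keep the end-to-end function unchanged), so thresholds can
    stay fixed at 1.
    """
    scaled, prev_scale = [], 1.0
    for w, a in zip(weights, activations):
        scale = max(a.max(), 1e-9)
        scaled.append(w * prev_scale / scale)
        prev_scale = scale
    return scaled

rng = np.random.default_rng(2)
ws = [rng.normal(size=(4, 4)) for _ in range(2)]            # layer weights
acts = [rng.uniform(0, 3, size=(100, 4)) for _ in range(2)] # calibration activations
print([w.max() for w in normalize_weights(ws, acts)])
```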