EXODUS: Stable and Efficient Training of Spiking Neural Networks

@article{Bauer2022EXODUSSA,
  title={EXODUS: Stable and Efficient Training of Spiking Neural Networks},
  author={F. Bauer and Gregor Lenz and Saeid Haghighatshoar and Sadique Sheik},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.10242}
}
Spiking Neural Networks (SNNs) are gaining significant traction in machine learning tasks where energy efficiency is of utmost importance. Training such networks with the state-of-the-art back-propagation through time (BPTT) algorithm is, however, very time-consuming. Previous work by Shrestha and Orchard [2018] employs an efficient GPU-accelerated back-propagation algorithm called SLAYER, which speeds up training considerably. SLAYER, however, does not take into account the neuron reset mechanism while…
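The point the abstract raises is that the neuron's membrane reset should stay inside the computational graph during BPTT. Below is a minimal sketch, assuming a discrete-time leaky integrate-and-fire (LIF) neuron with subtractive reset and a boxcar surrogate gradient; the names and constants (`SurrogateSpike`, `lif_forward`, `alpha`, `v_th`) are illustrative and this is not the authors' EXODUS implementation.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; boxcar surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Boxcar surrogate: pass gradient only near the firing threshold.
        return grad_output * (v.abs() < 0.5).float()


def lif_forward(inputs, alpha=0.9, v_th=1.0):
    """Unrolled LIF dynamics with subtractive reset:
    v[t] = alpha * v[t-1] + i[t] - v_th * s[t-1].

    inputs: (T, batch, features) input currents.
    Returns the (T, batch, features) output spike train.
    """
    v = torch.zeros_like(inputs[0])
    s = torch.zeros_like(inputs[0])
    spikes = []
    for i_t in inputs:
        # The reset term stays in the autograd graph, so BPTT
        # propagates gradients through it (the effect SLAYER omits).
        v = alpha * v + i_t - v_th * s
        s = SurrogateSpike.apply(v - v_th)
        spikes.append(s)
    return torch.stack(spikes)


x = torch.randn(50, 4, 8, requires_grad=True)
out = lif_forward(x)
out.sum().backward()  # BPTT through membrane dynamics and reset
```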

References

Showing 1-10 of 26 references
Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization
TLDR
Novel algorithmic techniques that modify the SNN configuration with backward residual connections, stochastic softmax, and hybrid artificial-and-spiking neuronal activations improve trainability and yield competitive accuracy, along with large efficiency gains over artificial counterparts.
SLAYER: Spike Layer Error Reassignment in Time
TLDR
A new general back-propagation mechanism for learning synaptic weights and axonal delays is introduced; it overcomes the non-differentiability of the spike function and uses a temporal credit-assignment policy to backpropagate error to preceding layers.
Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks
TLDR
This article elucidates, step by step, the problems typically encountered when training SNNs, guides the reader through the key concepts of synaptic plasticity and data-driven learning in the spiking setting, and introduces surrogate gradient methods as a particularly flexible and efficient way to overcome these challenges (a minimal sketch follows).
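The core trick keeps the hard threshold in the forward pass and substitutes a smooth pseudo-derivative in the backward pass. Below is a minimal sketch, assuming the fast-sigmoid surrogate popularized by SuperSpike; the class name and `beta` value are illustrative choices, not prescribed by the article.

```python
import torch

class FastSigmoidSpike(torch.autograd.Function):
    """Heaviside step forward; fast-sigmoid pseudo-derivative backward."""
    beta = 10.0  # illustrative steepness of the surrogate

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Derivative of v / (1 + beta*|v|), i.e. 1 / (1 + beta*|v|)^2:
        # nonzero everywhere, peaking at the threshold.
        return grad_output / (1.0 + FastSigmoidSpike.beta * v.abs()) ** 2


v = torch.linspace(-1.0, 1.0, 5, requires_grad=True)
s = FastSigmoidSpike.apply(v)
s.sum().backward()
print(v.grad)
```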
Fast and energy-efficient neuromorphic deep learning with first-spike times
TLDR
A rigorous derivation of a learning rule for such first-spike times in networks of leaky integrate-and-fire neurons, relying solely on input and output spike times, is described, and it is shown how this mechanism can implement error backpropagation in hierarchical spiking networks.
The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks
TLDR
Two spike-based classification data sets, broadly applicable for benchmarking both software and neuromorphic hardware implementations of spiking neural networks, are introduced, and it is shown that leveraging spike timing information within these data sets is essential for good classification accuracy.
Deep learning incorporating biologically inspired neural dynamics and in-memory computing
TLDR
The biologically inspired dynamics of spiking neurons are incorporated into conventional recurrent neural network units, combined with in-memory computing, and it is shown how this enables accurate and energy-efficient deep learning.
Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning
TLDR
This paper proposes a pre-training scheme using biologically plausible unsupervised learning, namely spike-timing-dependent plasticity (STDP), in order to better initialize the parameters in multi-layer systems prior to supervised optimization.
Convolutional networks for fast, energy-efficient neuromorphic computing
TLDR
This approach allows the algorithmic power of deep learning to be merged with the efficiency of neuromorphic processors, bringing the promise of embedded, intelligent, brain-inspired computing one step closer.
Spatial Properties of STDP in a Self-Learning Spiking Neural Network Enable Controlling a Mobile Robot
TLDR
A simple SNN equipped with a Hebbian rule in the form of spike-timing-dependent plasticity (STDP) is proposed, and it is shown that a LEGO robot controlled by the SNN can exhibit classical and operant conditioning.
Unsupervised learning of digit recognition using spike-timing-dependent plasticity
TLDR
An SNN for digit recognition based on mechanisms with increased biological plausibility, i.e., conductance-based instead of current-based synapses, spike-timing-dependent plasticity with time-dependent weight change, lateral inhibition, and an adaptive spiking threshold, is presented.
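The STDP entries above rest on the same pair-based update: potentiation when a presynaptic spike shortly precedes a postsynaptic one, depression in the opposite order, tracked with exponentially decaying spike traces. A minimal single-synapse sketch; the time constant, learning rates, and weight bounds are illustrative and not taken from either paper.

```python
import numpy as np

def stdp_traces(pre_spikes, post_spikes, dt=1.0, tau=20.0,
                a_plus=0.01, a_minus=0.012, w=0.5):
    """Trace-based pair STDP for a single synapse.

    pre_spikes, post_spikes: binary arrays of length T.
    Returns the weight trajectory over time.
    """
    x_pre = x_post = 0.0  # exponentially decaying spike traces
    decay = np.exp(-dt / tau)
    ws = []
    for s_pre, s_post in zip(pre_spikes, post_spikes):
        x_pre = x_pre * decay + s_pre
        x_post = x_post * decay + s_post
        # Post spike while the pre trace is high -> potentiation;
        # pre spike while the post trace is high -> depression.
        w += a_plus * x_pre * s_post - a_minus * x_post * s_pre
        w = float(np.clip(w, 0.0, 1.0))
        ws.append(w)
    return np.array(ws)


rng = np.random.default_rng(0)
pre = (rng.random(1000) < 0.05).astype(float)
post = (rng.random(1000) < 0.05).astype(float)
print(stdp_traces(pre, post)[-1])  # final weight after 1000 steps
```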
...