The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks

@article{Cramer2022TheHS,
  title={The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks},
  author={Benjamin Cramer and Yannik Stradmann and Johannes Schemmel and Friedemann Zenke},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  year={2022},
  volume={33},
  pages={2744-2757}
}
Spiking neural networks are the basis of versatile and power-efficient information processing in the brain. Although we currently lack a detailed understanding of how these networks compute, recently developed optimization techniques allow us to instantiate increasingly complex functional spiking neural networks in silico. These methods hold the promise of enabling more efficient non-von Neumann computing hardware and will offer new vistas in the quest to unravel brain circuit function. To…
Neuromorphic Algorithm-hardware Codesign for Temporal Pattern Learning
TLDR
This work derives an efficient training algorithm for leaky integrate-and-fire neurons, which is capable of training an SNN to learn complex spatiotemporal patterns, and develops a CMOS circuit implementation for a memristor-based network of neurons and synapses that retains critical neural dynamics with reduced complexity.
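As a rough sketch of the common substrate behind such training algorithms (a discrete-time leaky integrate-and-fire neuron with a surrogate-gradient spike function; all constants and names here are illustrative, not this paper's circuit-level method):

```python
import torch

class SurrGradSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate gradient in the backward pass."""
    scale = 10.0  # surrogate steepness (hypothetical value)

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0.0).float()  # spike when the potential exceeds threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # d(spike)/dv is approximated by 1 / (scale * |v| + 1)^2
        return grad_output / (SurrGradSpike.scale * v.abs() + 1.0) ** 2

def lif_step(v, syn, inp, alpha=0.9, beta=0.85, v_th=1.0):
    """One discrete-time leaky integrate-and-fire update (illustrative constants)."""
    syn = alpha * syn + inp             # leaky synaptic current driven by input
    v = beta * v + syn                  # leaky membrane integration
    spk = SurrGradSpike.apply(v - v_th)
    v = v - spk * v_th                  # soft reset on spiking
    return v, syn, spk
```

Unrolling `lif_step` over time and backpropagating through the surrogate is the standard recipe most of the works listed here build on or improve.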
Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks
TLDR
It is demonstrated how a novel surrogate gradient combined with recurrent networks of tunable and adaptive spiking neurons yields state-of-the-art performance for SNNs on challenging time-domain benchmarks such as speech and gesture recognition.
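The adaptation mechanism can be sketched as a threshold that rises with every spike and relaxes back slowly, reusing the surrogate spike function from the sketch above (the decay constants are placeholders, not the paper's values):

```python
def adaptive_lif_step(v, b, inp, beta=0.9, rho=0.97, b_inc=0.2, v_th=1.0):
    """LIF step with an adaptive threshold v_th + b that each spike raises and
    that decays back on a slower time scale (illustrative constants)."""
    v = beta * v + inp
    spk = SurrGradSpike.apply(v - (v_th + b))  # surrogate spike from the sketch above
    v = v - spk * (v_th + b)                   # reset by the crossed threshold
    b = rho * b + b_inc * spk                  # adaptation variable tracks recent spiking
    return v, b, spk
```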
SpikE: spike-based embeddings for multi-relational graph data
  • D. Dold, J. Garrido
  • Computer Science
    2021 International Joint Conference on Neural Networks (IJCNN)
  • 2021
TLDR
A spike-based algorithm is proposed in which nodes in a graph are represented by single spike times of neuron populations and relations by spike-time differences between populations; the approach is compatible with recently proposed frameworks for training spiking neural networks.
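A toy rendering of that representation (the variable names and the L1 scoring rule are illustrative assumptions, not the paper's exact model):

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_rel, dim = 100, 5, 16
t = rng.uniform(0.0, 1.0, size=(n_nodes, dim))  # node = a pattern of spike times
d = rng.uniform(-0.5, 0.5, size=(n_rel, dim))   # relation = spike-time offsets

def score(s, r, o):
    """A triple (s, r, o) is plausible when the object's spikes lag the
    subject's by roughly the relation's offset pattern."""
    return -np.abs((t[o] - t[s]) - d[r]).sum()
```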
Accelerating spiking neural network training
TLDR
This work proposes a new technique for directly training single-spike-per-neuron SNNs that eliminates all sequential computation and relies exclusively on vectorised operations, solving certain tasks with over a 95.68% reduction in spike counts relative to a conventionally trained SNN.
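The key enabler is that a single spike time can often be written in closed form; for instance, assuming a non-leaky integrate-and-fire neuron under constant current (a simplification for illustration, not necessarily the paper's neuron model):

```python
import torch

# v(t) = I * t crosses v_th at t = v_th / I, so every neuron's single spike
# time follows from one vectorised division rather than a time-stepped loop.
I = torch.rand(32, 128)        # hypothetical batch of constant input currents
v_th = 1.0
t_spike = torch.where(I > 0, v_th / I, torch.full_like(I, float("inf")))
```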
EXODUS: Stable and Efficient Training of Spiking Neural Networks
TLDR
An algorithm called EXODUS is designed that accounts for the neuron reset mechanism and applies the implicit function theorem to calculate the correct gradients (equivalent to those computed by BPTT), eliminating the need for ad hoc scaling of gradients and thus reducing training complexity tremendously.
The fine line between dead neurons and sparsity in binarized spiking neural networks
TLDR
This paper proposes ‘threshold annealing’ as a warm-up method for firing thresholds and shows that it enables the propagation of spikes across multiple layers where neurons would otherwise cease to fire, achieving highly competitive results on four diverse datasets despite using binarized weights.
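One plausible reading of such a warm-up (the linear schedule and the values below are assumptions for illustration):

```python
def annealed_threshold(step, warmup_steps=1000, v_start=0.2, v_final=1.0):
    """Hypothetical warm-up schedule: start with a permissive threshold so
    spikes reach deep layers early in training, then ramp to the target."""
    frac = min(step / warmup_steps, 1.0)
    return v_start + frac * (v_final - v_start)
```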
WaveSense: Efficient Temporal Convolutions with Spiking Neural Networks for Keyword Spotting
TLDR
The results show that the proposed network beats the state of the art among spiking neural networks and approaches the state-of-the-art performance of artificial neural networks such as CNNs and LSTMs.
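WaveSense-style temporal backbones resemble WaveNet's stacks of dilated 1-D convolutions, whose receptive field doubles per layer; a skeletal non-spiking version (channel count and dilation schedule are arbitrary here, and the ReLU stands in for a spiking activation):

```python
import torch.nn as nn

layers = []
for dilation in (1, 2, 4, 8):  # receptive field doubles with every layer
    layers += [nn.Conv1d(16, 16, kernel_size=2, dilation=dilation, padding="same"),
               nn.ReLU()]      # a spiking neuron layer would go here instead
tcn = nn.Sequential(*layers)
```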
Temporal-wise Attention Spiking Neural Networks for Event Streams Classification
TLDR
A temporal-wise attention SNN (TA-SNN) model is proposed to learn frame-based representations for processing event streams, improving the accuracy of event-stream classification tasks, and the impact of multiple-scale temporal resolutions on the frame-based representation is studied.
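Temporal attention of this flavour can be sketched as a squeeze-and-excitation-style block applied along the time axis (the module structure and sizes are illustrative assumptions):

```python
import torch
import torch.nn as nn

class TemporalAttention(nn.Module):
    """Score each time step of an event-frame sequence and reweight it."""
    def __init__(self, n_steps, hidden=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_steps, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_steps), nn.Sigmoid())

    def forward(self, x):              # x: (batch, time, channels, H, W)
        s = x.mean(dim=(2, 3, 4))      # squeeze each frame to one statistic
        w = self.net(s)                # one attention weight per time step
        return x * w[:, :, None, None, None]
```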
Hardware calibrated learning to compensate heterogeneity in analog RRAM-based Spiking Neural Networks
TLDR
It is shown that by taking the measured heterogeneity characteristics into account in the off-chip learning phase, the NHC SNN self-corrects its hardware non-idealities and learns to solve benchmark tasks with high accuracy.
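The basic trick can be sketched as sampling fixed per-neuron parameters from measurement statistics rather than using one shared nominal value (the numbers below are stand-ins, not real RRAM calibration data):

```python
import torch

n_neurons, dt = 256, 1e-3
# Stand-in for per-neuron time constants measured on the chip.
measured_tau = (20e-3 + 5e-3 * torch.randn(n_neurons)).clamp(min=2e-3)
beta = torch.exp(-dt / measured_tau)  # heterogeneous per-neuron leak factors,
                                      # held fixed while weights are trained off-chip
```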
Sparse Spiking Gradient Descent
TLDR
This work presents the first sparse SNN backpropagation algorithm, which achieves the same or better accuracy than current state-of-the-art methods while being significantly faster and more memory-efficient.
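The sparsity can be illustrated by letting gradients flow only through neurons whose membrane potential lies near threshold, via a masked straight-through estimator (the band width and the rule itself are illustrative, not the paper's exact formulation):

```python
import torch

def sparse_spike(v, v_th=1.0, band=0.5):
    """Forward: ordinary threshold spikes. Backward: gradient only through
    the sparse set of neurons within `band` of threshold."""
    spk = (v > v_th).float()
    mask = ((v - v_th).abs() < band).float()
    # value equals spk; gradient w.r.t. v equals mask
    return spk.detach() + mask * (v - v.detach())
```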
...
