Corpus ID: 59336289

Surrogate Gradient Learning in Spiking Neural Networks

@article{Neftci2019SurrogateGL,
  title={Surrogate Gradient Learning in Spiking Neural Networks},
  author={Emre O. Neftci and Hesham Mostafa and Friedemann Zenke},
  journal={ArXiv},
  year={2019},
  volume={abs/1901.09948}
}
Spiking neural networks are nature's versatile solution to fault-tolerant and energy-efficient signal processing. To translate these benefits into hardware, a growing number of neuromorphic spiking neural network processors attempt to emulate biological neural networks. These developments have created an imminent need for methods and tools to enable such systems to solve real-world signal processing problems. Like conventional neural networks, spiking neural networks can be trained on real…
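
The core idea the paper surveys is compact enough to sketch: apply the hard spiking nonlinearity on the forward pass, but substitute a smooth pseudo-derivative on the backward pass. Below is a minimal PyTorch sketch; the fast-sigmoid surrogate and the steepness constant beta are illustrative assumptions, not a single prescribed form.

import torch

class SurrGradSpike(torch.autograd.Function):
    """Heaviside spike nonlinearity with a surrogate gradient."""
    beta = 10.0  # surrogate steepness; an illustrative hyperparameter

    @staticmethod
    def forward(ctx, u):
        # u is the membrane potential minus the firing threshold
        ctx.save_for_backward(u)
        return (u > 0).float()  # hard, non-differentiable spike

    @staticmethod
    def backward(ctx, grad_output):
        (u,) = ctx.saved_tensors
        # fast-sigmoid pseudo-derivative, 1 / (beta*|u| + 1)^2, stands in
        # for the ill-defined derivative of the step function
        sg = 1.0 / (SurrGradSpike.beta * u.abs() + 1.0) ** 2
        return grad_output * sg

spike_fn = SurrGradSpike.apply  # drop-in spiking nonlinearity for an SNN layer
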
Citations

An Introduction to Probabilistic Spiking Neural Networks.
TLDR
This article adopts discrete-time probabilistic models for networked spiking neurons and derives supervised and unsupervised learning rules from first principles via variational inference, leveraging the unique time-encoding capabilities of SNNs.
An Introduction to Spiking Neural Networks: Probabilistic Models, Learning Rules, and Applications.
TLDR
This paper adopts discrete-time probabilistic models for networked spiking neurons, and it derives supervised and unsupervised learning rules from first principles by using variational inference.
Multiple-Timescale Spiking Recurrent Neural Networks
The emergence of brain-inspired neuromorphic computing as a paradigm for edge AI is motivating the search for high-performance and efficient spiking neural networks to run on this hardware. However, …
Training Deep Spiking Neural Networks for Energy-Efficient Neuromorphic Computing
TLDR
This paper presents biologically plausible, Spike-Timing-Dependent-Plasticity-based deterministic and stochastic algorithms for unsupervised representation learning in SNNs, and proposes a conversion methodology to map off-the-shelf trained ANNs to SNNs for energy-efficient inference.
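
For orientation, unsupervised STDP of the kind this entry mentions is often implemented with exponentially decaying spike traces. The following is a generic trace-based sketch, not the paper's specific deterministic or stochastic algorithms; all constants are assumptions.

import torch

def stdp_step(w, pre_spk, post_spk, pre_trace, post_trace,
              a_plus=0.01, a_minus=0.012, decay=0.95):
    # update low-pass traces of recent pre-/postsynaptic spikes
    pre_trace = decay * pre_trace + pre_spk
    post_trace = decay * post_trace + post_spk
    # potentiate pre-before-post pairings, depress post-before-pre
    dw = (a_plus * torch.outer(pre_trace, post_spk)
          - a_minus * torch.outer(pre_spk, post_trace))
    w = torch.clamp(w + dw, 0.0, 1.0)  # keep weights bounded
    return w, pre_trace, post_trace
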
Explicitly Trained Spiking Sparsity in Spiking Neural Networks with Backpropagation
TLDR
This work proposes the explicit inclusion of spike counts in the loss function, alongside a traditional error loss, causing the backpropagation learning algorithm to optimize weight parameters for both accuracy and spiking sparsity.
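
A combined objective of that shape might look as follows; the penalty weight lam and the mean-spike-count form are illustrative assumptions rather than the paper's exact formulation.

import torch
import torch.nn.functional as F

def accuracy_and_sparsity_loss(logits, targets, spike_counts, lam=1e-4):
    # traditional error loss on the network readout
    task = F.cross_entropy(logits, targets)
    # explicit penalty on spiking activity, traded off against accuracy via lam
    return task + lam * spike_counts.float().mean()
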
An Introduction to Probabilistic Spiking Neural Networks: Probabilistic Models, Learning Rules, and Applications
TLDR
This work has shown that the sparsity of the synaptic spiking inputs and the corresponding event-driven nature of neural processing can be leveraged by energy-efficient hardware implementations, which can offer significant energy reductions as compared to conventional artificial neural networks.
A Tandem Learning Rule for Efficient and Rapid Inference on Deep Spiking Neural Networks.
TLDR
The spike count is considered as the discrete neural representation, and an ANN neuronal activation function is designed to effectively approximate the spike count of the coupled SNN in a tandem learning framework consisting of an SNN and an Artificial Neural Network that share weights.
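
A hedged sketch of such an activation: for an integrate-and-fire neuron with threshold theta driven over T time steps, the spike count is roughly a clamped linear function of the aggregated input. T, theta, and the clamped-linear form are assumptions for illustration; in the tandem framework the coupled SNN supplies the actual counts on the forward pass.

import torch

def approx_spike_count(x, T=10, theta=1.0):
    # differentiable ANN-side stand-in for an IF neuron's spike count:
    # about x*T/theta spikes, bounded between 0 and T
    return torch.clamp(x * T / theta, min=0.0, max=float(T))
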
Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation
TLDR
The proposed training methodology converges in less than 20 epochs of spike-based backpropagation for most standard image classification datasets, thereby greatly reducing the training complexity compared to training SNNs from scratch.
Minibatch Processing in Spiking Neural Networks
TLDR
To the authors' knowledge, this is the first general-purpose implementation of minibatch processing in a spiking neural network simulator, which works with arbitrary neuron and synapse models and shows the effectiveness of large batch sizes in two SNN application domains.
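
Mechanically, minibatching an SNN simulation amounts to carrying a batch dimension through all neuron state. A minimal LIF update under that convention (tau and v_th are illustrative constants):

import torch

def lif_step(v, spikes_in, w, tau=0.9, v_th=1.0):
    # v: (batch, neurons); spikes_in: (batch, inputs); w: (inputs, neurons)
    v = tau * v + spikes_in @ w        # leaky integration, batched
    spikes_out = (v >= v_th).float()   # per-sample threshold crossings
    v = v * (1.0 - spikes_out)         # reset the neurons that fired
    return v, spikes_out
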
Stochasticity and Robustness in Spiking Neural Networks
TLDR
It is demonstrated that noise can be used to make the behavior of IF neurons more robust to synaptic inaccuracy, and it is shown that a noisy network can tolerate the inaccuracy expected when hafnium-oxide-based resistive random-access memory is used to encode synaptic weights.
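
As a sketch of this kind of setup (the noise model and magnitudes are assumptions): synaptic inaccuracy can be modeled as multiplicative weight noise, with membrane noise injected during simulation to probe robustness.

import torch

def perturb_weights(w, rel_std=0.10):
    # e.g. RRAM conductance variability as multiplicative Gaussian noise
    return w * (1.0 + rel_std * torch.randn_like(w))

def noisy_if_step(v, x, w, v_th=1.0, noise_std=0.05):
    # integrate-and-fire update with additive membrane noise
    v = v + x @ w + noise_std * torch.randn_like(v)
    s = (v >= v_th).float()
    return v * (1.0 - s), s
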

References

Showing 1-10 of 42 references
Deep Networks Incorporating Spiking Neural Dynamics
TLDR
An alternative perspective on the spiking neuron as a particular ANN construct called the Spiking Neural Unit (SNU) is proposed, which provides a systematic methodology for implementing and training deep networks incorporating spiking dynamics that achieve accuracies as high as, or better than, state-of-the-art ANNs.
Deep Learning in Spiking Neural Networks
TLDR
The emerging picture is that SNNs still lag behind ANNs in terms of accuracy, but the gap is decreasing and can even vanish on some tasks, while SNNs typically require many fewer operations and are the better candidates for processing spatio-temporal data.
Gradient Descent for Spiking Neural Networks
TLDR
A gradient descent method for optimizing spiking network models is presented, introducing a differentiable formulation of spiking networks and deriving the exact gradient calculation; this offers a general-purpose supervised learning algorithm for spiking neural networks, advancing further investigations of spike-based computation.
Supervised Learning Based on Temporal Coding in Spiking Neural Networks
  • H. Mostafa
  • Computer Science, Medicine
  • IEEE Transactions on Neural Networks and Learning Systems
  • 2018
TLDR
This work shows that in a feedforward spiking network that uses a temporal coding scheme, where information is encoded in spike times instead of spike rates, the network input–output relation is differentiable almost everywhere, and that this relation is piecewise linear after a transformation of variables.
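
The change of variables in question maps spike times t to z = exp(t). Under a non-leaky integrate-and-fire model with exponentially decaying synaptic current, the output spike time obeys (up to details of how the causal set is determined) z_out = sum_{i in C} w_i * z_i / (sum_{i in C} w_i - 1), where C is the set of inputs that spike before the output; for a fixed C this is linear in the z_i, which is the piecewise linearity the summary refers to. A sketch, with the causal mask supplied externally for illustration:

import torch

def z_domain_output(z_in, w, causal):
    # causal: boolean mask over inputs that spike before the output
    wc = torch.where(causal, w, torch.zeros_like(w))
    # linear in z_in once the causal set is fixed -> piecewise linear overall
    return (wc * z_in).sum() / (wc.sum() - 1.0)
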
Training Deep Spiking Neural Networks Using Backpropagation
TLDR
A novel technique is introduced which treats the membrane potentials of spiking neurons as differentiable signals, with discontinuities at spike times considered as noise; this enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks but works directly on spike signals and membrane potentials.
Spiking Deep Networks with LIF Neurons
TLDR
This work demonstrates that biologically plausible spiking LIF neurons can be integrated into deep networks that perform as well as other spiking models (e.g., integrate-and-fire), and it provides new methods for training deep networks to run on neuromorphic hardware.
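
The training route in this line of work smooths the LIF rate response so ordinary backprop applies, then substitutes spiking LIF neurons at inference. A sketch of such a "soft" LIF rate curve; the softplus smoothing and all constants are assumptions in the spirit of that approach.

import torch

def soft_lif_rate(j, tau_ref=0.002, tau_rc=0.02, v_th=1.0, gamma=0.1):
    # j: input current above the firing threshold
    # smooth the non-differentiable onset at j = 0 with a softplus
    rho = gamma * torch.log1p(torch.exp(j / gamma))
    # LIF steady-state firing rate, evaluated on the smoothed input
    return 1.0 / (tau_ref + tau_rc * torch.log1p(v_th / rho))
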
SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks
TLDR
SuperSpike is derived, a nonlinear voltage-based three-factor learning rule capable of training multilayer networks of deterministic integrate-and-fire neurons to perform nonlinear computations on spatiotemporal spike patterns.
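
Schematically, the three factors are a filtered presynaptic spike trace, a surrogate derivative of the postsynaptic membrane potential, and a top-down error signal. A heavily simplified per-step sketch, omitting SuperSpike's additional temporal filtering of the eligibility trace; shapes and the learning rate are assumptions.

import torch

def superspike_like_update(w, pre_trace, sg_post, error, lr=1e-3):
    # w: (inputs, neurons); pre_trace: (inputs,); sg_post, error: (neurons,)
    # Hebbian eligibility: presynaptic trace x postsynaptic surrogate
    eligibility = torch.outer(pre_trace, sg_post)
    # third factor: a per-neuron error signal gates the update
    return w + lr * eligibility * error[None, :]
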
Biologically inspired alternatives to backpropagation through time for learning in recurrent neural nets
TLDR
It is shown that an online merging of locally available information during a computation with suitable top-down learning signals in real time provides highly capable approximations to backpropagation through time (BPTT).
Supervised learning in spiking neural networks with FORCE training
TLDR
The direct applicability of the FORCE method to spiking neural networks is demonstrated, and it is shown that these networks can be trained to exhibit different dynamic behaviours.
Synaptic Plasticity Dynamics for Deep Continuous Local Learning (DECOLLE)
TLDR
DECOLLE networks provide continuously learning machines that are relevant to biology and supportive of event-based, low-power computer vision architectures, matching the accuracies of conventional computers on tasks where temporal precision and speed are essential.