# Training Multilayer Spiking Neural Networks using NormAD based Spatio-Temporal Error Backpropagation

```bibtex
@article{Anwani2020TrainingMS,
  title   = {Training Multilayer Spiking Neural Networks using NormAD based Spatio-Temporal Error Backpropagation},
  author  = {Navin Anwani and Bipin Rajendran},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/1811.10678}
}
```

## 7 Citations

Back-Propagation Learning in Deep Spike-By-Spike Networks

- Computer Science, Frontiers Comput. Neurosci.
- 2019

A learning rule for feed-forward SbS networks is derived that approaches the benchmark results of ANNs without extensive parameter optimization and is envisioned to provide a new basis for research in neuroscience and for technical applications, especially when they become implemented on specialized computational hardware.

Back-propagation learning in deep Spike-By-Spike networks

- Computer Science, bioRxiv
- 2019

A learning rule for hierarchically organized SbS networks is derived, inspired by the error back-propagation algorithms used in feed-forward neural networks, which reaches values achieved by standard deep networks that have a similar structure and also use only simple gradient descent.

Biologically plausible learning in a deep recurrent spiking network

- Computer Science, Biology, bioRxiv
- 2019

A framework consisting of mutually coupled local circuits of spiking neurons that meets a fundamental property of the brain and will enable investigations of very large network architectures far beyond current DCNs, including also large scale models of cortex where areas consisting of many local circuits form a complex cyclic network.

Exploring the Effects of Caputo Fractional Derivative in Spiking Neural Network Training

- Computer Science, Electronics
- 2022

An extensive investigation of performance improvements via a case study of small-scale networks using derivative orders in the unit interval; the statistics show that a range of derivative orders can be determined where the Caputo derivative outperforms first-order gradient descent with high confidence.
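As a toy illustration of the idea (assumptions: a scalar quadratic loss and Caputo base point 0, not the networks studied in the paper), fractional-order gradient descent replaces the ordinary derivative with a Caputo derivative of order alpha in (0, 1]:

```python
import math

def caputo_grad_quadratic(w, alpha):
    """Caputo derivative of L(w) = w**2 of order alpha with base point 0:
    D^alpha w^2 = 2 * w**(2 - alpha) / Gamma(3 - alpha).
    alpha = 1 recovers the ordinary gradient 2*w."""
    return 2.0 * w ** (2.0 - alpha) / math.gamma(3.0 - alpha)

# Fractional-order descent on the quadratic, starting from w = 2.0.
w = 2.0
for _ in range(50):
    w -= 0.1 * caputo_grad_quadratic(w, alpha=0.8)
```

For alpha below 1 the effective step size depends on the distance from the base point, which is the knob the cited study tunes against plain first-order descent.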

Memristors—From In‐Memory Computing, Deep Learning Acceleration, and Spiking Neural Networks to the Future of Neuromorphic and Bio‐Inspired Computing

- Computer Science, Adv. Intell. Syst.
- 2020

The case for a novel beyond‐complementary metal–oxide–semiconductor (CMOS) technology, memristors, as a potential solution for the implementation of power‐efficient in‐memory computing, DL accelerators, and spiking neural networks is reviewed.

Opportunities for neuromorphic computing algorithms and applications

- Computer Science, Nature Computational Science
- 2022

Spiking neural network dynamic system modeling for computation of quantum annealing and its convergence analysis

- Computer Science, Quantum Inf. Process.
- 2021

## References

Showing 1–10 of 36 references

NormAD - Normalized Approximate Descent based supervised learning rule for spiking neurons

- Computer Science, 2015 International Joint Conference on Neural Networks (IJCNN)
- 2015

It is shown that NormAD provides faster convergence than state-of-the-art supervised learning algorithms for spiking neurons, often the gain in the rate of convergence being more than a factor of 10.
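The NormAD update can be sketched as follows (a hedged, illustrative discretization: `e` and `d_hat` are assumed names for the spike-train error and the filtered presynaptic contributions, not code from the paper):

```python
import numpy as np

def normad_update(e, d_hat, lr=0.1, eps=1e-12):
    """NormAD-style weight update (illustrative sketch).

    e:     (T,) spike-train error, e.g. desired minus observed spike trace
    d_hat: (T, N) filtered presynaptic contributions for N synapses

    The update direction at each instant is normalized by ||d_hat(t)||,
    which is what gives Normalized Approximate Descent its name; the
    per-instant contributions are then accumulated over the epoch."""
    norms = np.linalg.norm(d_hat, axis=1, keepdims=True) + eps
    return lr * np.sum(e[:, None] * d_hat / norms, axis=0)
```

The per-time-step normalization keeps the step size insensitive to the scale of the input activity, which is the mechanism the authors credit for the faster convergence.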

An Efficient Supervised Training Algorithm for Multilayer Spiking Neural Networks

- Computer Science, PLoS ONE
- 2016

A new training algorithm, the Normalized Spiking Error Back Propagation (NSEBP), is proposed; it outperforms traditional multilayer SNN algorithms in learning efficiency and parameter sensitivity, as demonstrated by comprehensive experimental results.

A supervised multi-spike learning algorithm based on gradient descent for spiking neural networks

- Computer Science, Neural Networks
- 2013

Training Deep Spiking Neural Networks Using Backpropagation

- Computer Science, Front. Neurosci.
- 2016

A novel technique is introduced that treats the membrane potentials of spiking neurons as differentiable signals, with discontinuities at spike times treated as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks but works directly on spike signals and membrane potentials.
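The idea can be sketched minimally with a surrogate-gradient stand-in (illustrative names and constants; not the paper's exact implementation):

```python
import numpy as np

def lif_forward(x, w, v_th=1.0, beta=0.9):
    """Run a single leaky integrate-and-fire neuron over T time steps.

    Returns the spike train and the pre-reset membrane potentials; the
    potentials are the differentiable signals backpropagation acts on."""
    v = 0.0
    spikes, potentials = [], []
    for xt in x:
        v = beta * v + w * xt           # leaky integration of weighted input
        potentials.append(v)            # record potential before reset
        s = 1.0 if v >= v_th else 0.0   # hard threshold: non-differentiable
        v *= (1.0 - s)                  # reset membrane on spike
        spikes.append(s)
    return np.array(spikes), np.array(potentials)

def surrogate_grad(v, v_th=1.0, slope=5.0):
    """Fast-sigmoid surrogate for d(spike)/d(membrane potential): a smooth
    stand-in for the Dirac delta at the threshold, i.e. the discontinuity
    at spike times is smoothed over as if it were noise."""
    return 1.0 / (slope * np.abs(v - v_th) + 1.0) ** 2
```

During the backward pass, `surrogate_grad(potentials)` replaces the undefined derivative of the threshold, so the chain rule can flow through spike times exactly as in a conventional deep network.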

SPAN: Spike Pattern Association Neuron for Learning Spatio-Temporal Spike Patterns

- Computer Science, Int. J. Neural Syst.
- 2012

SPAN is presented - a spiking neuron that is able to learn associations of arbitrary spike trains in a supervised fashion allowing the processing of spatio-temporal information encoded in the precise timing of spikes.

Supervised learning in multilayer spiking neural networks with inner products of spike trains

- Computer Science, Neurocomputing
- 2017

Supervised Learning in Multilayer Spiking Neural Networks

- Computer Science, Neural Computation
- 2013

A supervised learning algorithm for multilayer spiking neural networks that can be applied to neurons firing multiple spikes in artificial neural networks with hidden layers and results in faster convergence than existing algorithms for similar tasks such as SpikeProp.

A supervised learning approach based on STDP and polychronization in spiking neuron networks

- Computer Science, ESANN
- 2007

We propose a novel network model of spiking neurons, without preimposed topology and driven by STDP (Spike-Time-Dependent Plasticity), a temporal Hebbian unsupervised learning mode, based on…

SWAT: A Spiking Neural Network Training Algorithm for Classification Problems

- Computer Science, IEEE Transactions on Neural Networks
- 2010

A synaptic weight association training (SWAT) algorithm for spiking neural networks (SNNs) that merges the Bienenstock-Cooper-Munro (BCM) learning rule with spike timing dependent plasticity (STDP) and yields a unimodal weight distribution.
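The combination described above can be sketched roughly (an illustrative toy, with assumed parameter names and a simplified BCM term, not SWAT's actual rule):

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Classic exponential STDP window: potentiate when the presynaptic
    spike precedes the postsynaptic one (dt = t_post - t_pre > 0),
    depress otherwise."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

def bcm_modulated_stdp(dt, rate, theta=10.0, **kw):
    """BCM-style modulation of the STDP window: the sliding threshold
    theta on postsynaptic activity scales the update and flips its sign
    when the firing rate falls below theta, pushing weights toward a
    stable operating point."""
    return stdp_dw(dt, **kw) * rate * (rate - theta)
```

This sign-flipping modulation is one plausible reading of how merging BCM with STDP can stabilize learning and yield a unimodal rather than bimodal weight distribution.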

Error-backpropagation in temporally encoded networks of spiking neurons

- Computer Science, Neurocomputing
- 2002