NormAD - Normalized Approximate Descent based supervised learning rule for spiking neurons

@inproceedings{Anwani2015NormADN,
  title={NormAD - Normalized Approximate Descent based supervised learning rule for spiking neurons},
  author={Anwani, Navin and Rajendran, Bipin},
  booktitle={2015 International Joint Conference on Neural Networks (IJCNN)},
  year={2015},
  pages={1-8}
}
  • Navin Anwani, Bipin Rajendran
  • Published 12 July 2015
  • Computer Science
  • 2015 International Joint Conference on Neural Networks (IJCNN)
NormAD is a novel supervised learning algorithm to train spiking neurons to produce a desired spike train in response to a given input. A variant of stochastic gradient descent, combined with normalization, is used to derive the synaptic weight update rule. NormAD uses leaky integration of the input to determine the synaptic weight change. Since leaky integration is fundamental to all integrate-and-fire models of spiking neurons, the authors claim universal applicability of the learning rule to other…
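The abstract's description — an error signal between desired and observed spike trains, scaled by the normalized leaky-integrated input — can be illustrated with a minimal NumPy sketch. The time constant, learning rate, and discretization below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def normad_update(inputs, desired_spikes, observed_spikes, dt=0.1, tau=10.0, lr=0.1):
    """NormAD-style weight update sketch (illustrative, not the paper's exact code).

    inputs:          (n_syn, T) synaptic input currents
    desired_spikes:  (T,) binary desired spike train
    observed_spikes: (T,) binary observed spike train
    """
    n_syn, T = inputs.shape
    # Leaky integration of each synaptic input (first-order low-pass filter),
    # approximating convolution with the LIF membrane impulse response.
    d_hat = np.zeros_like(inputs, dtype=float)
    decay = np.exp(-dt / tau)
    for t in range(1, T):
        d_hat[:, t] = decay * d_hat[:, t - 1] + dt * inputs[:, t]
    # Error signal: +1 where a spike is missing, -1 where a spike is spurious.
    err = desired_spikes.astype(float) - observed_spikes.astype(float)
    # Normalized approximate descent: at each error instant, normalize the
    # leaky-integrated input vector before accumulating the weight update.
    dw = np.zeros(n_syn)
    for t in range(T):
        if err[t] != 0.0:
            norm = np.linalg.norm(d_hat[:, t])
            if norm > 0:
                dw += lr * err[t] * d_hat[:, t] / norm
    return dw
```

Because the update direction is normalized at every error instant, the step size depends only on the learning rate and the number of spike-timing errors, not on the input magnitude — the property the "Normalized" in NormAD refers to.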
Deep Learning in Spiking Neural Networks
Optimization of Output Spike Train Encoding for a Spiking Neuron Based on its Spatio–Temporal Input Pattern
A method is proposed to adaptively adjust an initial suboptimal output encoding during different learning epochs to find the optimal output encoding; the method increases accuracy in a classification task from 90% to 100%.
Unsupervised regenerative learning of hierarchical features in Spiking Deep Networks for object recognition
  • P. Panda, K. Roy
  • Computer Science
    2016 International Joint Conference on Neural Networks (IJCNN)
  • 2016
A spike-based unsupervised regenerative learning scheme to train Spiking Deep Networks (SpikeCNN) for object recognition problems using biologically realistic leaky integrate-and-fire neurons resulting in computationally efficient learning is presented.
Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks
This article elucidates step-by-step the problems typically encountered when training SNNs, guides the reader through the key concepts of synaptic plasticity and data-driven learning in the spiking setting, and introduces surrogate gradient methods as a particularly flexible and efficient way to overcome these challenges.
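The core idea of surrogate gradients — keep the hard spike threshold in the forward pass but substitute a smooth derivative in the backward pass — can be sketched as follows. The fast-sigmoid surrogate and its sharpness parameter `beta` are common choices but are assumptions here, not prescribed by any single paper:

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    # Forward pass: hard threshold (non-differentiable Heaviside step).
    return (v >= threshold).astype(float)

def spike_surrogate_grad(v, threshold=1.0, beta=10.0):
    # Backward pass: replace the Heaviside derivative (zero almost everywhere)
    # with the derivative of a fast sigmoid centered at the threshold, so
    # gradient-based optimization can propagate error through spike events.
    return beta / (2.0 * (1.0 + beta * np.abs(v - threshold)) ** 2)
```

The surrogate derivative peaks at the threshold and decays away from it, so weights feeding neurons near their firing threshold receive the largest gradient signal.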
Training Probabilistic Spiking Neural Networks with First-to-Spike Decoding
A novel training method is proposed here for a first-to-spike decoding rule, whereby the SNN can perform an early classification decision once spike firing is detected at an output neuron.
Tuning Convolutional Spiking Neural Network with Biologically-plausible Reward Propagation
The introduction of biologically plausible learning rules to the training procedure of biologically realistic SNNs will give more hints and inspiration toward a better understanding of the biological system's intelligent nature.
Deep Networks Incorporating Spiking Neural Dynamics
An alternative perspective on the spiking neuron as a particular ANN construct called the Spiking Neural Unit (SNU) is proposed, providing a systematic methodology for implementing and training deep networks incorporating spiking dynamics that achieve accuracies as high as, or better than, state-of-the-art ANNs.
…

References

Showing 1-10 of 15 references
A New Supervised Learning Algorithm for Spiking Neurons
A new supervised learning method for spiking neurons with temporal encoding is proposed, which first transforms the supervised learning into a classification problem and then solves the problem by using the perceptron learning rule.
Span: Spike Pattern Association Neuron for Learning Spatio-Temporal Spike Patterns
SPAN is presented - a spiking neuron that is able to learn associations of arbitrary spike trains in a supervised fashion allowing the processing of spatio-temporal information encoded in the precise timing of spikes.
Supervised Learning in Spiking Neural Networks with ReSuMe: Sequence Learning, Classification, and Spike Shifting
A model of supervised learning for biologically plausible neurons is presented that enables spiking neurons to reproduce arbitrary template spike patterns in response to given synaptic stimuli even in the presence of various sources of noise and shows that the learning rule can also be used for decision-making tasks.
A supervised learning approach based on STDP and polychronization in spiking neuron networks
We propose a novel network model of spiking neurons, without pre-imposed topology and driven by STDP (Spike-Timing-Dependent Plasticity), a temporal Hebbian unsupervised learning mode, based on…
Error-backpropagation in temporally encoded networks of spiking neurons
Precise-Spike-Driven Synaptic Plasticity: Learning Hetero-Association of Spatiotemporal Spike Patterns
Experimental results show that the PSD rule is capable of spatiotemporal pattern classification, and can even outperform a well studied benchmark algorithm with the proposed relative confidence criterion.
The Chronotron: A Neuron That Learns to Fire Temporally Precise Spike Patterns
This work introduces two new supervised learning rules for spiking neurons with temporal coding of information (chronotrons), one that provides high memory capacity (E-learning), and one that has a higher biological plausibility (I-learning).
Adaptive exponential integrate-and-fire model as an effective description of neuronal activity.
The authors' simple model predicts correctly the timing of 96% of the spikes of the detailed model in response to injection of noisy synaptic conductances and has enough expressive power to reproduce qualitatively several electrophysiological classes described in vitro.
Networks of Spiking Neurons: The Third Generation of Neural Network Models
  • W. Maass
  • Computer Science
    Electron. Colloquium Comput. Complex.
  • 1996
…