An Introduction to Probabilistic Spiking Neural Networks: Probabilistic Models, Learning Rules, and Applications

@article{Jang2019AnIT,
  title={An Introduction to Probabilistic Spiking Neural Networks: Probabilistic Models, Learning Rules, and Applications},
  author={Hyeryung Jang and Osvaldo Simeone and Brian Gardner and Andr{\'e} Gr{\"u}ning},
  journal={IEEE Signal Processing Magazine},
  year={2019},
  volume={36},
  pages={64--77}
}
Spiking neural networks (SNNs) are distributed trainable systems whose computing elements, or neurons, are characterized by internal analog dynamics and by digital and sparse synaptic communications. The sparsity of the synaptic spiking inputs and the corresponding event-driven nature of neural processing can be leveraged by energy-efficient hardware implementations, which can offer significant energy reductions as compared to conventional artificial neural networks (ANNs). The design of… 
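The abstract above describes neurons with analog internal dynamics and sparse binary (spiking) outputs. As a hedged illustration of how such a neuron is often formalized in the probabilistic setting, the sketch below implements a discrete-time generalized linear model (GLM) neuron: the membrane potential is a leaky filtered sum of weighted presynaptic spikes, and each output spike is a Bernoulli sample. The function name, trace filter, and parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate_glm_neuron(pre_spikes, weights, bias, tau=5.0):
    """Discrete-time probabilistic neuron: the membrane potential is an
    exponentially filtered sum of weighted presynaptic spikes, and the
    output at each step is a Bernoulli sample whose probability is a
    sigmoid of that potential."""
    T, n_pre = pre_spikes.shape
    trace = np.zeros(n_pre)            # exponential synaptic traces
    decay = np.exp(-1.0 / tau)
    out = np.zeros(T, dtype=int)
    for t in range(T):
        trace = decay * trace + pre_spikes[t]
        u = weights @ trace + bias     # membrane potential
        out[t] = rng.random() < sigmoid(u)
    return out

# Example: 3 presynaptic neurons firing randomly for 100 steps
pre = (rng.random((100, 3)) < 0.2).astype(float)
spikes = simulate_glm_neuron(pre, weights=np.array([1.5, -0.5, 2.0]), bias=-2.0)
```

Because spiking is stochastic rather than deterministic, the log-likelihood of an observed spike train is differentiable in the weights, which is what makes the variational and maximum-likelihood learning rules surveyed below possible.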
BiSNN: Training Spiking Neural Networks with Binary Weights via Bayesian Learning
TLDR
An SNN model is introduced that combines the benefits of temporally sparse binary activations and of binary weights, and the advantage of the Bayesian paradigm in terms of accuracy and calibration.
Multi-Compartment Variational Online Learning for Spiking Neural Networks
TLDR
It is demonstrated that learning rules based on probabilistic generalized linear neural models can leverage the presence of multiple compartments through modern variational inference based on importance weighting or generalized expectation-maximization.
VOWEL: A Local Online Learning Rule for Recurrent Networks of Probabilistic Spiking Winner-Take-All Circuits
TLDR
A variational online local training rule for WTA-SNNs that leverages only local pre- and post-synaptic information for visible circuits, and an additional common reward signal for hidden circuits is developed, based on probabilistic generalized linear neural models, control variates, and variational regularization.
Supervised Learning With First-to-Spike Decoding in Multilayer Spiking Neural Networks
TLDR
This work proposes a new supervised learning method that can train multilayer spiking neural networks to solve classification problems based on a rapid, first-to-spike decoding strategy, and highlights a novel encoding strategy that can transform image data into compact spatiotemporal patterns for subsequent network processing.
BSNN: Towards Faster and Better Conversion of Artificial Neural Networks to Spiking Neural Networks with Bistable Neurons
TLDR
This paper proposes a novel bistable spiking neural network (BSNN) that addresses the problem of spikes of inactivated neurons (SIN) caused by phase lead and phase lag, and demonstrates state-of-the-art ANN-to-SNN conversion on challenging datasets.
The Heidelberg spiking datasets for the systematic evaluation of spiking neural networks
TLDR
A general audio-to-spiking conversion procedure is introduced and two novel spike-based classification datasets are provided that show that leveraging spike timing information within these datasets is essential for good classification accuracy.
Learning to Time-Decode in Spiking Neural Networks Through the Information Bottleneck
TLDR
A novel end-to-end learning rule is introduced that optimizes a directed information bottleneck training criterion via surrogate gradients and demonstrates the applicability of the technique in experimental settings on various tasks, including real-life datasets.
Spiking Generative Adversarial Networks With a Neural Network Discriminator: Local Training, Bayesian Models, and Continual Meta-Learning
TLDR
A novel hybrid architecture comprising a conditional generator, implemented via an SNN, and a discriminator, implemented via a conventional artificial neural network (ANN), is introduced to better capture multi-modal spatio-temporal distributions.
Multi-Sample Online Learning for Spiking Neural Networks Based on Generalized Expectation Maximization
  • Hyeryung Jang, O. Simeone
  • Computer Science, Engineering
    ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2021
TLDR
This paper proposes to leverage multiple compartments that sample independent spiking signals while sharing synaptic weights to obtain more accurate statistical estimates of the log-likelihood training criterion, as well as of its gradient.
A Brief Review on Spiking Neural Network - A Biological Inspiration
TLDR
A brief introduction to SNNs is presented, covering their mathematical structure, applications, and implementation, and connecting neuroscience and machine learning to establish efficient high-level computing.

References

SHOWING 1-10 OF 43 REFERENCES
An Introduction to Spiking Neural Networks: Probabilistic Models, Learning Rules, and Applications.
TLDR
This paper adopts discrete-time probabilistic models for networked spiking neurons, and it derives supervised and unsupervised learning rules from first principles by using variational inference.
To spike or not to spike: A probabilistic spiking neuron model
  • N. Kasabov
  • Computer Science, Medicine
    Neural Networks
  • 2010
TLDR
A novel probabilistic spiking neuron model (pSNM) is proposed and ways of building pSNN for a wide range of applications including classification, string pattern recognition and associative memory are suggested.
Computing with Spiking Neuron Networks
TLDR
This chapter reviews the theory of the "spiking neuron" in Section 1, summarizes the models of neurons and synaptic plasticity most commonly in use in Section 2, and addresses the computational power of, and the problem of learning in, networks of spiking neurons.
Conversion of analog to spiking neural networks using sparse temporal coding
TLDR
This work presents an efficient temporal encoding scheme, where the analog activation of a neuron in the ANN is treated as the instantaneous firing rate given by the time-to-first-spike (TTFS) in the converted SNN.
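The summary above maps an analog ANN activation to a first-spike time in the converted SNN. A hedged sketch of one such time-to-first-spike (TTFS) latency code is below: larger activations (in [0, 1]) fire earlier. The function name and the simple linear mapping are illustrative assumptions, not necessarily the paper's exact scheme.

```python
import numpy as np

def ttfs_encode(activations, n_steps):
    """Map analog activations in [0, 1] to first-spike times:
    activation 1.0 fires at step 0, activation 0.0 fires at the
    last step (a simple linear latency code, one spike per neuron)."""
    a = np.clip(np.asarray(activations, dtype=float), 0.0, 1.0)
    times = np.rint((1.0 - a) * (n_steps - 1)).astype(int)
    spikes = np.zeros((n_steps, a.size), dtype=int)
    spikes[times, np.arange(a.size)] = 1   # one spike per neuron
    return times, spikes

times, spikes = ttfs_encode([1.0, 0.5, 0.0], n_steps=11)
# times -> [0, 5, 10]: stronger activations spike earlier
```

The appeal of this code, as the TLDR notes, is sparsity: each neuron emits exactly one spike per input, rather than a firing rate sustained over many time steps.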
Surrogate Gradient Learning in Spiking Neural Networks
TLDR
This article elucidates step-by-step the problems typically encountered when training spiking neural networks, and guides the reader through the key concepts of synaptic plasticity and data-driven learning in the spiking setting.
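The core trick summarized above is to keep a hard threshold in the forward pass but replace its zero-or-undefined derivative with a smooth surrogate in the backward pass. A minimal NumPy sketch follows; the fast-sigmoid surrogate and its scale parameter are one common (SuperSpike-style) choice, assumed here for illustration rather than taken from the article.

```python
import numpy as np

def spike_forward(u, threshold=1.0):
    """Forward pass: a hard threshold on the membrane potential."""
    return (u >= threshold).astype(float)

def spike_surrogate_grad(u, threshold=1.0, scale=10.0):
    """Backward pass: the Heaviside step has zero-or-undefined derivative,
    so backpropagation substitutes a smooth surrogate -- here the
    derivative of a fast sigmoid, peaked at the threshold."""
    return 1.0 / (1.0 + scale * np.abs(u - threshold)) ** 2

u = np.array([0.2, 0.99, 1.0, 1.8])
spikes = spike_forward(u)        # hard 0/1 spikes
grads = spike_surrogate_grad(u)  # smooth, largest near the threshold
```

In an autodiff framework this pair would typically be registered as a custom gradient for the spike nonlinearity, so the rest of the network trains with ordinary backpropagation.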
Stochastic variational learning in recurrent spiking networks
TLDR
A new learning rule is derived for recurrent spiking networks with hidden neurons, combining principles from variational learning and reinforcement learning, in the form of a local spike-timing-dependent plasticity rule modulated by global factors conveying information about "novelty," on statistically rigorous grounds.
Training Dynamic Exponential Family Models with Causal and Lateral Dependencies for Generalized Neuromorphic Computing
  • Hyeryung Jang, O. Simeone
  • Computer Science, Engineering
    ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2019
TLDR
A probabilistic model is introduced for a generalized set-up in which the synaptic time series can take values in an arbitrary alphabet and are characterized by both causal and instantaneous statistical dependencies.
Deep Spiking Networks
TLDR
It is shown that the spiking Multi-Layer Perceptron behaves identically, during both prediction and training, to a conventional deep network of rectified-linear units, in the limiting case where the network is run for a long time.
Training Deep Spiking Neural Networks Using Backpropagation
TLDR
A novel technique is introduced, which treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise; this enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks but works directly on spike signals and membrane potentials.
Learning First-to-Spike Policies for Neuromorphic Control Using Policy Gradients
TLDR
Experimental results demonstrate the capability of online trained SNNs as stochastic policies to gracefully trade energy consumption, as measured by the number of spikes, and control performance.