Deep Learning in Spiking Neural Networks

@article{Tavanaei2019DeepLI,
  title={Deep Learning in Spiking Neural Networks},
  author={Amirhossein Tavanaei and Masoud Ghodrati and Saeed Reza Kheradpisheh and Timoth{\'e}e Masquelier and A. Maida},
  journal={Neural networks : the official journal of the International Neural Network Society},
  year={2019},
  volume={111},
  pages={47-63}
}
  • Published 22 April 2018
  • Computer Science

Citations

Spiking Neurons with Differential Evolution Algorithm for Pattern Classification
TLDR
A state-of-the-art method, the differential evolution spiking neural network (DESNN), is proposed for pattern classification; experimental results show that the algorithm uses fewer neurons and is effective for pattern classification tasks.
Multiple-Timescale Spiking Recurrent Neural Networks
TLDR
For sequential and streaming tasks, this work demonstrates how a novel type of adaptive spiking recurrent neural network (SRNN) achieves state-of-the-art performance compared to other spiking neural networks and approaches or exceeds the performance of classical recurrent neural networks (RNNs) while exhibiting sparse activity.
A Tandem Learning Rule for Effective Training and Rapid Inference of Deep Spiking Neural Networks.
TLDR
The proposed tandem learning rule offers a novel solution to training efficient, low latency, and high-accuracy deep SNNs with low computing resources and demonstrates competitive pattern recognition and regression capabilities on both the conventional frame- and event-based vision datasets.
A Tandem Learning Rule for Efficient and Rapid Inference on Deep Spiking Neural Networks.
TLDR
The spike count is considered the discrete neural representation, and an ANN neuronal activation function is designed to effectively approximate the spike count of the coupled SNN, in a tandem learning framework consisting of an SNN and an artificial neural network that share weights.
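The tandem idea summarized above — a spiking forward pass whose discrete spike count is approximated by a differentiable ANN activation for the backward pass — can be sketched for a single neuron as follows. This is a minimal illustration of the general coupling, not the paper's implementation; all function names and parameters here are illustrative assumptions.

```python
import numpy as np

def if_spike_count(x, w, T=100, v_th=1.0):
    """Forward pass: simulate an integrate-and-fire neuron for T steps
    and return its output spike count (the discrete representation)."""
    v, count = 0.0, 0
    for _ in range(T):
        v += float(np.dot(w, x))  # constant input current each step
        if v >= v_th:
            count += 1
            v -= v_th             # soft reset after each spike
    return count

def ann_spike_count_surrogate(x, w, T=100, v_th=1.0):
    """Backward-pass surrogate: a clipped-linear ANN activation that
    approximates the coupled neuron's spike count."""
    return float(np.clip(T * np.dot(w, x) / v_th, 0.0, T))

x = np.array([0.02, 0.03])
w = np.array([0.5, 0.5])
count = if_spike_count(x, w)              # discrete spike count
approx = ann_spike_count_surrogate(x, w)  # differentiable estimate
```

Because the surrogate tracks the spike count closely in the sub-saturation regime, ANN-style backpropagation can supply usable gradients for the coupled spiking network.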
Deep Learning With Spiking Neurons: Opportunities and Challenges
TLDR
This review addresses the opportunities that deep spiking networks offer and investigates in detail the challenges associated with training SNNs in a way that makes them competitive with conventional deep learning, but simultaneously allows for efficient mapping to hardware.
Back-Propagation Learning in Deep Spike-By-Spike Networks
TLDR
A learning rule for feed-forward SbS networks is derived that approaches the benchmark results of ANNs without extensive parameter optimization and is envisioned to provide a new basis for research in neuroscience and for technical applications, especially when they become implemented on specialized computational hardware.
A Brief Review on Spiking Neural Network - A Biological Inspiration
TLDR
A brief introduction to SNNs is presented, covering their mathematical structure, applications, and implementation, and connecting neuroscience and machine learning to establish efficient high-level computing.
A Hybrid Learning Rule for Efficient and Rapid Inference with Spiking Neural Networks
TLDR
A novel learning rule is proposed based on a hybrid neural network with shared weights, wherein a rate-based SNN is used during forward propagation to determine precise spike counts and spike trains, and an equivalent ANN is used during error backpropagation to approximate the gradients for the coupled SNN.
...
...

References

SHOWING 1-10 OF 314 REFERENCES
Spiking Neural Networks
TLDR
A state-of-the-art review of the development of spiking neurons and SNNs is presented, providing insight into their evolution as the third generation of neural networks.
Deep Learning With Spiking Neurons: Opportunities and Challenges
TLDR
This review addresses the opportunities that deep spiking networks offer and investigates in detail the challenges associated with training SNNs in a way that makes them competitive with conventional deep learning, but simultaneously allows for efficient mapping to hardware.
Deep Spiking Networks
TLDR
It is shown that the spiking Multi-Layer Perceptron behaves identically, during both prediction and training, to a conventional deep network of rectified-linear units, in the limiting case where the network is run for a long time.
Learning to be efficient: algorithms for training low-latency, low-compute deep spiking neural networks
TLDR
The results suggest that SNNs can be optimized to dramatically decrease the latency as well as the computation requirements for Deep Neural Networks, making them particularly attractive for applications like robotics, where real-time restrictions to produce outputs and low energy budgets are common.
Spiking Deep Networks with LIF Neurons
TLDR
This work demonstrates that biologically plausible spiking LIF neurons can be integrated into deep networks that perform as well as other spiking models (e.g., integrate-and-fire), and provides new methods for training deep networks to run on neuromorphic hardware.
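The leaky integrate-and-fire (LIF) dynamics referenced above can be simulated in a few lines. This is a generic textbook discretization with illustrative parameters, not the specific model or constants used in the paper.

```python
def lif_simulate(current, dt=1.0, tau=20.0, v_th=1.0, v_reset=0.0):
    """Discretized leaky integrate-and-fire dynamics,
    dv/dt = (-v + I(t)) / tau, with spike-and-reset at v_th."""
    v, spikes = 0.0, []
    for t, i_t in enumerate(current):
        v += dt * (-v + i_t) / tau  # leaky integration step
        if v >= v_th:
            spikes.append(t)        # record the spike time
            v = v_reset             # reset after the spike
    return spikes

# A constant supra-threshold current yields a regular spike train.
spikes = lif_simulate([1.5] * 200)
```

The leak term `-v / tau` is what distinguishes LIF from the plain integrate-and-fire model: without sustained input, the membrane potential decays back toward rest instead of accumulating indefinitely.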
STDP-based spiking deep neural networks for object recognition
MT-spike: A multilayer time-based spiking neuromorphic architecture with temporal error backpropagation
TLDR
Simulation results well validate that the algorithmic power of deep multilayer learning can be seamlessly merged with the efficiency of time-based spiking neuromorphic architecture, demonstrating great potentials of “MT-Spike” in resource and power constrained embedded platforms.
STDP-based spiking deep convolutional neural networks for object recognition
TLDR
The results suggest that the combination of STDP with latency coding may be a key to understanding the way that the primate visual system learns, its remarkable processing speed and its low energy consumption.
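The STDP mechanism underlying the two entries above can be illustrated with the standard pair-based update: the sign of the weight change depends on the relative timing of pre- and postsynaptic spikes. This is the generic textbook rule with illustrative constants, not the exact variant used in either paper.

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP: potentiate when the presynaptic spike precedes
    the postsynaptic spike (causal pairing), depress when the order is
    reversed; the weight stays clipped to [w_min, w_max]."""
    dt = t_post - t_pre
    if dt >= 0:
        w += a_plus * math.exp(-dt / tau_plus)   # pre before post: LTP
    else:
        w -= a_minus * math.exp(dt / tau_minus)  # post before pre: LTD
    return min(max(w, w_min), w_max)

w_ltp = stdp_update(0.5, t_pre=10.0, t_post=15.0)  # weight increases
w_ltd = stdp_update(0.5, t_pre=15.0, t_post=10.0)  # weight decreases
```

Combined with latency coding, this causal rule strengthens exactly those synapses whose early spikes predict the postsynaptic response, which is the link to processing speed the TLDR above highlights.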
Conversion of artificial recurrent neural networks to spiking neural networks for low-power neuromorphic hardware
TLDR
Surprisingly, it is found that short synaptic delays are sufficient to implement the dynamic (temporal) aspect of the RNN in the question classification task and the discretization of the neural activities is beneficial to the train-and-constrain approach.
Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing
TLDR
The method for converting an ANN into an SNN enables low-latency classification with high accuracies already after the first output spike, and compared with previous SNN approaches it yields improved performance without increased training time.
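One common balancing scheme for such ANN-to-SNN conversions is data-based weight normalization: each layer's weights are rescaled by the maximum ANN activation recorded on training data, so that a unit firing threshold is never overdriven. The sketch below shows that general idea under illustrative assumptions; the paper's exact scheme (it also balances thresholds) may differ.

```python
import numpy as np

def normalize_weights(weights, max_activations):
    """Data-based weight normalization for ANN-to-SNN conversion.
    The chained factor prev_scale / scale preserves the network's
    input-output function up to an overall firing-rate scaling."""
    normalized, prev_scale = [], 1.0
    for w, lam in zip(weights, max_activations):
        scale = max(float(lam), 1e-9)  # guard against zero activations
        normalized.append(w * prev_scale / scale)
        prev_scale = scale
    return normalized

# Hypothetical two-layer example: recorded max activations 4.0 and 6.0.
w1, w2 = normalize_weights([np.array([[2.0]]), np.array([[3.0]])],
                           [4.0, 6.0])
```

Keeping every layer's drive at or below threshold is what allows the converted SNN to classify accurately from the very first output spikes, which is the low-latency property the TLDR above emphasizes.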
...
...