# Fast and Efficient Asynchronous Neural Computation with Adapting Spiking Neural Networks

@article{Zambrano2016FastAE,
  title={Fast and Efficient Asynchronous Neural Computation with Adapting Spiking Neural Networks},
  author={Davide Zambrano and S. Boht{\'e}},
  journal={ArXiv},
  year={2016},
  volume={abs/1609.02053}
}

#### 37 Citations

Sparse Computation in Adaptive Spiking Neural Networks

- Computer Science, Medicine
- Front. Neurosci.
- 2019

Adaptive spike-based coding, based on the firing-rate-limiting adaptation observed in biological spiking neurons, allows for dynamic control of neural coding precision and holds promise as a novel, sparsely active model for neural computation that naturally fits temporally continuous and asynchronous applications.

Optimization of Adaptive Spiking Neural Networks on GPU

- 2017

Adaptive Spiking Neural Networks (ASNNs) have been shown to outperform state-of-the-art Spiking Neural Networks. An Adaptive Spiking Neuron (ASN) can effectively be used as a drop-in replacement for a…

Gating Sensory Noise in a Spiking Subtractive LSTM

- Computer Science
- ICANN
- 2018

This work designs an analog Long Short-Term Memory (LSTM) cell whose neurons can be substituted with efficient spiking neurons, using subtractive gating instead of multiplicative gating; the resulting memory networks compute very efficiently, with low average firing rates comparable to those of biological neurons, while operating in continuous time.

Multiple-Timescale Spiking Recurrent Neural Networks

- 2020

The emergence of brain-inspired neuromorphic computing as a paradigm for edge AI is motivating the search for high-performance and efficient spiking neural networks to run on this hardware. However,…

Training a Spiking Neural Network with Equilibrium Propagation

- Computer Science
- AISTATS
- 2019

It is shown that with appropriate step-size annealing, the Equilibrium Propagation model can converge to the same fixed point as a real-valued neural network, and that with predictive coding, it can make this convergence much faster.

Deep Learning With Spiking Neurons: Opportunities and Challenges

- Medicine
- Front. Neurosci.
- 2018

This review addresses the opportunities that deep spiking networks offer and investigates in detail the challenges associated with training SNNs in a way that makes them competitive with conventional deep learning, but simultaneously allows for efficient mapping to hardware.

Self-Evolutionary Neuron Model for Fast-Response Spiking Neural Networks

- Computer Science
- 2019

Experimental results with spiking feedforward and spiking convolutional neural networks on both the MNIST handwritten digits and Fashion-MNIST classification tasks showed that, compared with plain SNNs, networks based on the proposed neuron models respond significantly faster to input signals.

Training a Network of Spiking Neurons with Equilibrium Propagation

- Computer Science
- 2018

It is shown that with appropriate step-size annealing, the Equilibrium Propagation model can converge to the same fixed point as a real-valued neural network, and that with predictive coding, it can make this convergence much faster.

Conversion of analog to spiking neural networks using sparse temporal coding

- Computer Science
- 2018 IEEE International Symposium on Circuits and Systems (ISCAS)
- 2018

This work presents an efficient temporal encoding scheme, where the analog activation of a neuron in the ANN is treated as the instantaneous firing rate given by the time-to-first-spike (TTFS) in the converted SNN.

Spiking Neural Networks Hardware Implementations and Challenges

- Computer Science
- ACM J. Emerg. Technol. Comput. Syst.
- 2019

This survey presents the state of the art in hardware implementations of spiking neural networks, reviews current trends in algorithm development from model selection to training mechanisms, and describes the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.

#### References

Showing 1–10 of 27 references

Networks of Spiking Neurons: The Third Generation of Neural Network Models

- Computer Science
- Electron. Colloquium Comput. Complex.
- 1996

It is shown that networks of spiking neurons are, with regard to the number of neurons needed, computationally more powerful than other neural network models based on McCulloch-Pitts neurons or sigmoidal gates.

Fractionally Predictive Spiking Neurons

- Computer Science, Mathematics
- NIPS
- 2010

It is shown that the actual neural spike-train itself can be considered as the fractional derivative, provided that the neural signal is approximated by a sum of power-law kernels.

Learning to be efficient: algorithms for training low-latency, low-compute deep spiking neural networks

- Computer Science
- SAC
- 2016

The results suggest that SNNs can be optimized to dramatically decrease the latency as well as the computation requirements for Deep Neural Networks, making them particularly attractive for applications like robotics, where real-time restrictions on producing outputs and low energy budgets are common.

Spike-Based Population Coding and Working Memory

- Computer Science, Medicine
- PLoS Comput. Biol.
- 2011

It is proposed that probability distributions are inferred spike-per-spike in recurrently connected networks of integrate-and-fire neurons, which can combine sensory cues optimally, track the state of a time-varying stimulus, and memorize accumulated evidence over periods much longer than the time constant of single neurons.

Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing

- Computer Science
- 2015 International Joint Conference on Neural Networks (IJCNN)
- 2015

The method for converting an ANN into an SNN enables low-latency classification with high accuracy already after the first output spike, and, compared with previous SNN approaches, it yields improved performance without increased training time.

Efficient Spike-Coding with Multiplicative Adaptation in a Spike Response Model

- Computer Science
- NIPS
- 2012

A multiplicative adaptive Spike Response Model, where the spike-triggered adaptation dynamics are scaled multiplicatively by the adaptation state at the time of spiking, achieves a high coding efficiency and maintains this efficiency over changes in the dynamic signal range of several orders of magnitude, without changing model parameters.

Streaming parallel GPU acceleration of large-scale filter-based spiking neural networks

- Computer Science, Medicine
- Network
- 2012

This work shows that for filter-based spiking neurons, like the Spike Response Model, the additive nature of membrane potential dynamics enables additional update parallelism, and shows that optimizing simulation algorithms and data structures to the GPU's architecture has a large pay-off.

Convolutional networks for fast, energy-efficient neuromorphic computing

- Computer Science, Medicine
- Proceedings of the National Academy of Sciences
- 2016

This approach allows the algorithmic power of deep learning to be merged with the efficiency of neuromorphic processors, bringing the promise of embedded, intelligent, brain-inspired computing one step closer.

LIF and Simplified SRM Neurons Encode Signals Into Spikes via a Form of Asynchronous Pulse Sigma–Delta Modulation

- Computer Science, Medicine
- IEEE Transactions on Neural Networks and Learning Systems
- 2017

It is shown how two spiking neuron models encode continuous-time signals into spikes using a special form of sigma–delta modulation (SDM), which will facilitate the design of spiking neurons and spiking neural networks, as well as cross-fertilization between the fields of neural coding and SDM.

A million spiking-neuron integrated circuit with a scalable communication network and interface

- Computer Science, Medicine
- Science
- 2014

Inspired by the brain's structure, an efficient, scalable, and flexible non–von Neumann architecture is developed that leverages contemporary silicon technology and is well suited to many applications that use complex neural networks in real time, for example, multiobject detection and classification.