Corpus ID: 599250

Fast and Efficient Asynchronous Neural Computation with Adapting Spiking Neural Networks

@article{Zambrano2016FastAE,
  title={Fast and Efficient Asynchronous Neural Computation with Adapting Spiking Neural Networks},
  author={Davide Zambrano and S. Boht{\'e}},
  journal={ArXiv},
  year={2016},
  volume={abs/1609.02053}
}
Abstract: Biological neurons communicate with a sparing exchange of pulses - spikes. [...] Additionally, in a streaming setting where frames are continuously classified, we show that the ASNN requires substantially fewer network updates as compared to the corresponding ANN.

Citations

Sparse Computation in Adaptive Spiking Neural Networks
Adaptive spike-based coding, based on the firing rate limiting adaptation phenomenon observed in biological spiking neurons, allows for the dynamic control of neural coding precision and holds promise as a novel and sparsely active model for neural computation that naturally fits temporally continuous and asynchronous applications.
Optimization of Adaptive Spiking Neural Networks on GPU
Adaptive Spiking Neural Networks (ASNNs) have been shown to outperform state-of-the-art Spiking Neural Networks. An Adaptive Spiking Neuron (ASN) can effectively be used as a drop-in replacement for a [...]
Gating Sensory Noise in a Spiking Subtractive LSTM
This work designs an analog Long Short-Term Memory (LSTM) cell whose neurons can be substituted with efficient spiking neurons, and where subtractive gating is used instead of multiplicative gating, resulting in memory networks that compute very efficiently, with low average firing rates comparable to those in biological neurons, while operating in continuous time.
Multiple-Timescale Spiking Recurrent Neural Networks
The emergence of brain-inspired neuromorphic computing as a paradigm for edge AI is motivating the search for high-performance and efficient spiking neural networks to run on this hardware. However, [...]
Training a Spiking Neural Network with Equilibrium Propagation
It is shown that with appropriate step-size annealing, the Equilibrium Propagation model can converge to the same fixed point as a real-valued neural network, and that with predictive coding it can make this convergence much faster.
Deep Learning With Spiking Neurons: Opportunities and Challenges
This review addresses the opportunities that deep spiking networks offer and investigates in detail the challenges associated with training SNNs in a way that makes them competitive with conventional deep learning, but simultaneously allows for efficient mapping to hardware.
Self-Evolutionary Neuron Model for Fast-Response Spiking Neural Networks
Experimental results with a spiking feedforward neural network and a spiking convolutional neural network on both the MNIST handwritten-digit and Fashion-MNIST classification tasks show that, compared with plain SNNs, networks based on the proposed neuron model significantly accelerate the response to the input signal.
Conversion of analog to spiking neural networks using sparse temporal coding
This work presents an efficient temporal encoding scheme, where the analog activation of a neuron in the ANN is treated as the instantaneous firing rate given by the time-to-first-spike (TTFS) in the converted SNN.
Spiking Neural Networks Hardware Implementations and Challenges
This survey presents the state of the art of hardware implementations of spiking neural networks and the current trends in algorithm elaboration from model selection to training mechanisms, and describes the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.

References

Showing 1-10 of 27 references
Networks of Spiking Neurons: The Third Generation of Neural Network Models
  • W. Maass
  • Computer Science
  • Electron. Colloquium Comput. Complex.
  • 1996
It is shown that, with regard to the number of neurons needed, networks of spiking neurons are computationally more powerful than neural network models based on McCulloch-Pitts neurons or sigmoidal gates.
Fractionally Predictive Spiking Neurons
It is shown that the actual neural spike-train itself can be considered as the fractional derivative, provided that the neural signal is approximated by a sum of power-law kernels.
Learning to be efficient: algorithms for training low-latency, low-compute deep spiking neural networks
The results suggest that SNNs can be optimized to dramatically decrease the latency as well as the computation requirements of deep neural networks, making them particularly attractive for applications like robotics, where real-time constraints on producing outputs and low energy budgets are common.
Spike-Based Population Coding and Working Memory
It is proposed that probability distributions are inferred spike-per-spike in recurrently connected networks of integrate-and-fire neurons, which can combine sensory cues optimally, track the state of a time-varying stimulus, and memorize accumulated evidence over periods much longer than the time constant of single neurons.
Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing
The method for converting an ANN into an SNN enables low-latency classification with high accuracies already after the first output spike, and compared with previous SNN approaches it yields improved performance without increased training time.
Efficient Spike-Coding with Multiplicative Adaptation in a Spike Response Model
A multiplicative adaptive Spike Response Model, where the spike-triggered adaptation dynamics are scaled multiplicatively by the adaptation state at the time of spiking, achieves a high coding efficiency and maintains this efficiency over changes in the dynamic signal range of several orders of magnitude, without changing model parameters.
Streaming parallel GPU acceleration of large-scale filter-based spiking neural networks
This work shows that for filter-based spiking neurons, like the Spike Response Model, the additive nature of membrane-potential dynamics enables additional update parallelism, and shows that optimizing simulation algorithms and data structures to the GPU's architecture has a large pay-off.
Convolutional networks for fast, energy-efficient neuromorphic computing
This approach allows the algorithmic power of deep learning to be merged with the efficiency of neuromorphic processors, bringing the promise of embedded, intelligent, brain-inspired computing one step closer.
LIF and Simplified SRM Neurons Encode Signals Into Spikes via a Form of Asynchronous Pulse Sigma–Delta Modulation
  • Young-Chul Yoon
  • Computer Science, Medicine
  • IEEE Transactions on Neural Networks and Learning Systems
  • 2017
It is shown how two spiking neuron models encode continuous-time signals into spikes using a special form of sigma-delta modulation (SDM), which will facilitate the design of spiking neurons and spiking neural networks as well as cross-fertilization between the fields of neural coding and SDM.
A million spiking-neuron integrated circuit with a scalable communication network and interface
Inspired by the brain's structure, an efficient, scalable, and flexible non-von Neumann architecture is developed that leverages contemporary silicon technology and is well suited to many applications that use complex neural networks in real time, for example, multi-object detection and classification.