Spiking Linear Dynamical Systems on Neuromorphic Hardware for Low-Power Brain-Machine Interfaces

@article{Clark2018SpikingLD,
  title={Spiking Linear Dynamical Systems on Neuromorphic Hardware for Low-Power Brain-Machine Interfaces},
  author={David G. Clark and Jesse A. Livezey and Edward F. Chang and Kristofer E. Bouchard},
  journal={ArXiv},
  year={2018},
  volume={abs/1805.08889}
}
Neuromorphic architectures achieve low-power operation by using many simple spiking neurons in lieu of traditional hardware. Here, we develop methods for precise linear computations in spiking neural networks and use these methods to map the evolution of a linear dynamical system (LDS) onto an existing neuromorphic chip: IBM's TrueNorth. We analytically characterize, and numerically validate, the discrepancy between the spiking LDS state sequence and that of its non-spiking counterpart. These… 
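
The core computation described above, advancing a linear dynamical system with spiking units and measuring the drift from its exact counterpart, can be illustrated with a minimal rate-coding sketch. This is not the paper's TrueNorth mapping: the population size N, the [-1, 1] encoding range, and the dynamics matrix A below are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's exact TrueNorth mapping): approximate the
# LDS update x_{t+1} = A x_t by rate-coding each state component with a
# population of stochastic spiking units, then compare the spiking state
# sequence to the exact (non-spiking) one.

rng = np.random.default_rng(0)

A = np.array([[0.95, -0.10],
              [0.10,  0.95]])   # stable, rotation-like dynamics (assumed)
x_exact = np.array([1.0, 0.0])
x_spike = x_exact.copy()
N = 1000                        # spiking neurons per state dimension (assumed)

for t in range(50):
    x_exact = A @ x_exact
    # Encode each component as a firing rate in [0, 1] (clipping keeps the
    # encoding valid if a component slightly exceeds the assumed [-1, 1] range),
    # draw N Bernoulli spikes, and decode the empirical rate.
    p = np.clip((x_spike + 1.0) / 2.0, 0.0, 1.0)   # map [-1, 1] -> [0, 1]
    rates = rng.binomial(N, p) / N                 # spike counts -> rates
    x_decoded = 2.0 * rates - 1.0                  # decode back to [-1, 1]
    x_spike = A @ x_decoded

print("exact  :", x_exact)
print("spiking:", x_spike)
print("error  :", np.linalg.norm(x_exact - x_spike))
```

In this sketch the per-step encoding noise shrinks roughly as 1/sqrt(N), so larger populations track the exact state sequence more closely; the paper characterizes the analogous spiking-versus-exact discrepancy analytically for its TrueNorth construction.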

References

Showing 1-10 of 25 references

Training Spiking Deep Networks for Neuromorphic Hardware

We describe a method to train spiking deep networks that can be run using leaky integrate-and-fire (LIF) neurons, achieving state-of-the-art results for spiking LIF networks on five datasets.
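
For context, the following is a minimal leaky integrate-and-fire (LIF) neuron step of the kind this reference trains; the time constant, threshold, and reset voltage are illustrative values, not the paper's settings.

```python
# Minimal Euler-step LIF neuron; tau, v_thresh, v_reset, and dt are
# illustrative assumptions, not parameters from the cited work.
def lif_step(v, input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Advance membrane voltage v by one step; return (new_v, spiked)."""
    v = v + dt / tau * (-v + input_current)   # leaky integration
    if v >= v_thresh:
        return v_reset, True                  # spike, then reset
    return v, False
```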

Design and validation of a real-time spiking-neural-network decoder for brain-machine interfaces.

These results demonstrate the tractability of SNN implementations of statistical signal processing algorithms on different monkeys and for several tasks, suggesting that a SNN decoder, implemented on a neuromorphic chip, may be a feasible computational platform for low-power fully-implanted prostheses.

Neuromorphic Kalman filter implementation in IBM’s TrueNorth

This work details the first instance of a Kalman filter implementation in IBM’s neuromorphic architecture, TrueNorth, for both parallel and serial spike trains, and the limits of the implementation are explored whilst varying the size of weight and threshold registers.
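
As a point of reference for what the spiking implementation approximates, here is the textbook Kalman predict/update step; all matrices are placeholders, and the cited work's contribution is realizing these equations within TrueNorth's quantized weight and threshold registers.

```python
import numpy as np

# Classical Kalman filter step (the equations the spiking implementation
# approximates). A, C, Q, R are placeholder system matrices.
def kalman_step(x, P, z, A, C, Q, R):
    # Predict.
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update with measurement z.
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)             # Kalman gain
    x_new = x_pred + K @ (z - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new
```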

Backpropagation for Energy-Efficient Neuromorphic Computing

This work treats spikes and discrete synapses as continuous probabilities, which allows training the network using standard backpropagation and naturally maps to neuromorphic hardware by sampling the probabilities to create one or more networks, which are merged using ensemble averaging.
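
A toy sketch of the train-with-probabilities, deploy-with-samples idea: during training the unit outputs a differentiable spike probability, and at deployment a binary spike is sampled from it. The single-unit setup, sigmoid rate function, and data below are illustrative assumptions.

```python
import numpy as np

# One unit: differentiable spike probability for training, sampled binary
# spike for deployment. Weights and input are made up for illustration.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
w = rng.normal(size=3)
x = np.array([0.5, -1.0, 2.0])

p = sigmoid(w @ x)          # continuous spike probability, used in backprop
grad_w = p * (1 - p) * x    # d p / d w, usable in standard gradient descent
spike = rng.random() < p    # deployment: sample a binary spike from p
```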

SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks

SuperSpike is derived, a nonlinear voltage-based three-factor learning rule capable of training multilayer networks of deterministic integrate-and-fire neurons to perform nonlinear computations on spatiotemporal spike patterns.
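
Schematically, a three-factor rule of this kind multiplies an error signal, a surrogate derivative of the postsynaptic voltage, and a presynaptic trace. The fast-sigmoid surrogate below follows the general shape used in SuperSpike, but the constants and traces are simplified placeholders.

```python
import numpy as np

# Schematic three-factor update: error * surrogate voltage derivative *
# presynaptic trace. Constants and the trace are illustrative placeholders.
def surrogate_deriv(v, v_thresh=1.0, beta=1.0):
    # Fast-sigmoid surrogate for the non-differentiable spike function.
    return 1.0 / (1.0 + beta * np.abs(v - v_thresh)) ** 2

def three_factor_update(w, error, v_post, pre_trace, lr=1e-3):
    return w + lr * error * surrogate_deriv(v_post) * pre_trace

w = three_factor_update(np.zeros(3), error=0.5, v_post=0.9,
                        pre_trace=np.ones(3))
```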

Convolutional networks for fast, energy-efficient neuromorphic computing

This approach allows the algorithmic power of deep learning to be merged with the efficiency of neuromorphic processors, bringing the promise of embedded, intelligent, brain-inspired computing one step closer.

A Survey of Neuromorphic Computing and Neural Networks in Hardware

An exhaustive review of the research conducted in neuromorphic computing since the inception of the term is provided to motivate further work by illuminating gaps in the field where new research is needed.

A million spiking-neuron integrated circuit with a scalable communication network and interface

Inspired by the brain’s structure, an efficient, scalable, and flexible non–von Neumann architecture is developed that leverages contemporary silicon technology and is well suited to many applications that use complex neural networks in real time, for example, multiobject detection and classification.

A recurrent neural network for closed-loop intracortical brain-machine interface decoders.

This work explores the ability of a simplified type of RNN with limited modifications to the internal weights, called an echo state network (ESN), to effectively and continuously decode monkey reaches during a standard center-out reach task using a cortical brain-machine interface (BMI) in a closed loop.
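
A minimal echo state network sketch in the spirit of this reference: a fixed random reservoir with only a linear readout trained, here by ridge regression on a toy two-dimensional target. The reservoir size, spectral radius, and regularizer are illustrative choices.

```python
import numpy as np

# Fixed random reservoir; only the linear readout W_out is trained.
rng = np.random.default_rng(2)
n_res, n_in = 200, 4
W_in = rng.normal(scale=0.1, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

def run_reservoir(U):
    """U: (T, n_in) inputs -> (T, n_res) reservoir states."""
    h = np.zeros(n_res)
    states = []
    for u in U:
        h = np.tanh(W @ h + W_in @ u)   # recurrent weights stay fixed
        states.append(h.copy())
    return np.array(states)

# Train the readout by ridge regression on a toy 2-D target (e.g. cursor).
U = rng.normal(size=(500, n_in))
Y = np.cumsum(U[:, :2], axis=0) * 0.01
H = run_reservoir(U)
W_out = np.linalg.solve(H.T @ H + 1e-3 * np.eye(n_res), H.T @ Y)
Y_hat = H @ W_out
```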

Supervised learning in spiking neural networks with FORCE training

The direct applicability of the FORCE method to spiking neural networks is demonstrated, showing that these networks can be trained to exhibit different dynamic behaviours.
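
The core of FORCE training is a recursive least-squares (RLS) update of the readout weights; the sketch below shows one such step for a generic rate vector r, leaving out the spiking network and feedback loop that the cited work adds.

```python
import numpy as np

# One RLS step of the kind FORCE applies online; r is a generic vector of
# network firing rates at one time step.
def force_step(w, P, r, target):
    """Update readout weights w and inverse correlation matrix P."""
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)           # gain vector
    err = w @ r - target              # readout error before the update
    w_new = w - err * k               # reduce the error along the gain
    P_new = P - np.outer(k, Pr)       # rank-one update of P
    return w_new, P_new

n = 100
rng = np.random.default_rng(3)
w, P = np.zeros(n), np.eye(n)
r = np.tanh(rng.normal(size=n))
w, P = force_step(w, P, r, target=0.5)
```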