# Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

```bibtex
@article{Maass2002RealTimeCW,
  title   = {Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations},
  author  = {Wolfgang Maass and Thomas Natschl{\"a}ger and Henry Markram},
  journal = {Neural Computation},
  year    = {2002},
  volume  = {14},
  pages   = {2531--2560}
}
```

A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based…
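The core idea — a fixed, task-independent recurrent circuit whose transient "perturbation" states carry information about the recent input stream — can be sketched with a simple rate-based reservoir. This is a minimal illustrative sketch (sizes, spectral radius, and the tanh update are assumptions of the sketch, not the paper's integrate-and-fire model):

```python
import numpy as np

rng = np.random.default_rng(0)

N_RES = 100            # reservoir ("liquid") size
N_IN = 1               # input dimension
SPECTRAL_RADIUS = 0.9  # keeps the dynamics in a fading-memory regime

# Random, task-independent recurrent weights -- never trained.
W = rng.normal(size=(N_RES, N_RES))
W *= SPECTRAL_RADIUS / max(abs(np.linalg.eigvals(W)))
W_in = rng.normal(size=(N_RES, N_IN))

def run_reservoir(u):
    """Drive the reservoir with an input sequence u (T x N_IN) and
    return the trajectory of transient states (T x N_RES)."""
    x = np.zeros(N_RES)
    states = []
    for u_t in u:
        # Each input perturbs the ongoing high-dimensional transient.
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)
```

A task-specific readout would then be trained on these transient states, while `W` stays fixed — the point of the framework is that no task-dependent construction of the circuit itself is needed.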

## 2,886 Citations

P-CRITICAL: a reservoir autoregulation plasticity rule for neuromorphic hardware

- Computer Science · Neuromorphic Computing and Engineering
- 2022

Proposes P-CRITICAL, a new local and unsupervised plasticity rule designed for automatic reservoir tuning that translates well to physical and digital neuromorphic processors.

Model for a flexible motor memory based on a self-active recurrent neural network.

- Computer Science · Human Movement Science
- 2013

Memory in linear recurrent neural networks in continuous time

- Computer Science · Neural Networks
- 2010

On the computational power of circuits of spiking neurons

- Computer Science, Biology · J. Comput. Syst. Sci.
- 2004

Minimal approach to neuro-inspired information processing

- Computer Science · Front. Comput. Neurosci.
- 2015

By reducing the neuro-inspired reservoir computing approach to its bare essentials, it is found that nonlinear transient responses of the simple dynamical system enable the processing of information with excellent performance and at unprecedented speed.

Learning Universal Computations with Spikes

- Computer Science, Biology · PLoS Comput. Biol.
- 2016

This work derives constraints under which classes of spiking neural networks lend themselves to substrates of powerful general purpose computing and shows how spiking networks can build models of external world systems and use the acquired knowledge to control them.

Connectivity, Dynamics, and Memory in Reservoir Computing with Binary and Analog Neurons

- Computer Science · Neural Computation
- 2010

Investigating the influence of network connectivity (parameterized by the neuron in-degree) on a family of network models that interpolates between analog and binary networks reveals that the phase transition between ordered and chaotic network behavior in binary circuits qualitatively differs from the one in analog circuits, leading to the decreased computational performance observed in densely connected binary circuits.

Fading memory and kernel properties of generic cortical microcircuit models

- Computer Science · Journal of Physiology-Paris
- 2004

On Computational Power and the Order-Chaos Phase Transition in Reservoir Computing

- Computer Science · NIPS
- 2008

Analyses based on, among other measures, the Lyapunov exponent reveal that the phase transition between ordered and chaotic network behavior in binary circuits qualitatively differs from the one in analog circuits, which explains the observed decreased computational performance of binary circuits with high node in-degree.
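The Lyapunov-exponent measure underlying such order-chaos analyses can be estimated empirically: run two copies of the same network from minutely different states, renormalize the separation at every step, and average the log growth rate. This is a rough sketch on a hypothetical random tanh rate network (not the binary/analog models of the cited paper); the gain parameter plays the role of the control parameter that sweeps the network across the phase transition:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 80
# Random coupling matrix with spectral radius near 1 (circular law).
W = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))

def lyapunov_estimate(gain, steps=200, eps=1e-8):
    """Crude largest-Lyapunov-exponent estimate for x <- tanh(gain * W x):
    average log growth of a tiny perturbation, renormalized each step."""
    x = rng.normal(size=N)
    d = rng.normal(size=N)
    d *= eps / np.linalg.norm(d)       # perturbation of fixed tiny norm
    total = 0.0
    for _ in range(steps):
        x_pert = np.tanh(gain * W @ (x + d))
        x = np.tanh(gain * W @ x)
        sep = np.linalg.norm(x_pert - x)
        total += np.log(sep / eps)     # log expansion/contraction this step
        d = (x_pert - x) * (eps / sep) # renormalize along the grown direction
    return total / steps
```

For small gain the estimate is negative (ordered, perturbations die out); for large gain it turns positive (chaotic, perturbations amplify) — the regime boundary that these citation results relate to computational performance.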

The unified Reservoir Computing concept and its digital hardware implementations

- Computer Science
- 2006

Presents three elegant solutions to the training problem of an RNN that preserve its powerful temporal processing capabilities: the randomly chosen network is used as a reservoir whose weights are not changed during the training phase.

## References

Showing 1-10 of 46 references

Neural Systems as Nonlinear Filters

- Biology · Neural Computation
- 2000

This article gives a complete mathematical characterization of all filters that can be approximated by feedforward neural networks with dynamic synapses, and provides for all nonlinear filters that are approximable by Volterra series a new complexity hierarchy related to the cost of implementing such filters in neural systems.

Perspectives of the high-dimensional dynamics of neural microcircuits from the point of view of low-dimensional readouts

- Computer Science · Complex.
- 2003

It is demonstrated that pairs of readout neurons can transform the complex trajectory of transient states of a large neural circuit into a simple and clearly structured two-dimensional trajectory.

Analog computation via neural networks

- Computer Science · The 2nd Israel Symposium on Theory and Computing Systems
- 1993

The authors pursue a particular approach to analog computation, based on dynamical systems of the type used in neural networks research, which exhibit at least some robustness with respect to noise and implementation errors.

Learning and Extracting Finite State Automata with Second-Order Recurrent Neural Networks

- Computer Science · Neural Computation
- 1992

It is shown that a recurrent, second-order neural network using a real-time, forward training algorithm readily learns to infer small regular grammars from positive and negative string training samples, and many of the neural net state machines are dynamically stable, that is, they correctly classify many long unseen strings.

What is a moment? Transient synchrony as a collective mechanism for spatiotemporal integration.

- Biology · Proceedings of the National Academy of Sciences of the United States of America
- 2001

The principles behind the operation of a network of simple integrate-and-fire neurons that contained output neurons selective for specific spatiotemporal patterns of inputs are presented and it is shown how the recognition is invariant to uniform time warp and uniform intensity change of the input events.

Gradient calculations for dynamic recurrent neural networks: a survey

- Computer Science · IEEE Trans. Neural Networks
- 1995

The author discusses advantages and disadvantages of temporally continuous neural networks in contrast to clocked ones and presents some "tricks of the trade" for training, using, and simulating continuous time and recurrent neural networks.

Temporal information transformed into a spatial code by a neural network with realistic properties

- Biology · Science
- 1995

It is demonstrated that known time-dependent neuronal properties enable a network to transform temporal information into a spatial code in a self-organizing manner, with no need to assume a spectrum of time delays or to custom-design the circuit.

On the Computational Power of Winner-Take-All

- Computer Science · Neural Computation
- 2000

The theoretical analysis shows that winner-take-all is a surprisingly powerful computational module in comparison with threshold gates (also referred to as McCulloch-Pitts neurons) and sigmoidal gates, and proves an optimal quadratic lower bound for computing winner-take-all in any feedforward circuit consisting of threshold gates.

Lower Bounds for the Computational Power of Networks of Spiking Neurons

- Computer Science · Neural Computation
- 1996

It is shown that simple operations on phase differences between spike-trains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a small network of spiking neurons.

Synchrony Generation in Recurrent Networks with Frequency-Dependent Synapses

- Biology · The Journal of Neuroscience
- 2000

It is found that the particular intensities of the external stimulus to specific neurons were crucial for evoking population bursts; burst generation therefore depends on the spectrum of basal discharge rates across the population rather than on the anatomical identity of the individual neurons, which was random.