Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

@article{Maass2002RealTimeCW,
  title={Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations},
  author={Wolfgang Maass and Thomas Natschl{\"a}ger and Henry Markram},
  journal={Neural Computation},
  year={2002},
  volume={14},
  pages={2531--2560}
}
Abstract

A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based…
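The core idea can be made concrete in a small simulation: a fixed, randomly connected recurrent circuit (the "liquid") is perturbed by the input stream, and only a memoryless linear readout is trained on the circuit's transient states. The sketch below is a minimal discrete-time caricature of this scheme, not the paper's spiking microcircuit; the tanh units, network size, spectral scaling, ridge regularization, and the delayed-copy task are all illustrative assumptions.
```python
# Minimal discrete-time sketch of the liquid-state-machine idea: a fixed
# random recurrent "liquid" is perturbed by the input stream, and only a
# linear readout is trained on the liquid's transient states.
import numpy as np

rng = np.random.default_rng(0)

N = 200                                      # liquid size (assumed)
W = rng.normal(0, 1, (N, N)) / np.sqrt(N)    # random recurrent weights
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # scale for fading memory
w_in = rng.normal(0, 1, N)                   # input weights

def run_liquid(u):
    """Collect the liquid's transient states for an input stream u."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + w_in * u_t)      # perturbation-driven update
        states.append(x.copy())
    return np.array(states)

# Illustrative real-time task: reconstruct a 2-step-delayed copy of the
# input from the current liquid state alone.
T = 2000
u = rng.uniform(-1, 1, T)
X = run_liquid(u)
y = np.roll(u, 2)                            # target y(t) = u(t-2)

lam = 1e-4                                   # ridge regularization (assumed)
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
print("readout correlation:", np.corrcoef(X[10:] @ w_out, y[10:])[0, 1])
```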
Citations

Principles of real-time computing with feedback applied to cortical microcircuit models
TLDR
A computational theory is presented that characterizes the gain in computational power achieved through feedback in dynamical systems with fading memory, and implies that many such systems acquire through feedback universal computational capabilities for analog computing with non-fading memory.
P-CRITICAL: A Reservoir Autoregulation Plasticity Rule for Neuromorphic Hardware
TLDR
This paper proposes a new local plasticity rule named P-CRITICAL designed for automatic reservoir tuning that translates well to Intel's Loihi research chip, a recent neuromorphic processor.
Memory in linear recurrent neural networks in continuous time
TLDR
This work develops an analytical model for calculating the memory function of continuous-time linear dynamical systems, which can be regarded as networks of linear leaky-integrator neurons, and investigates the memory properties of different types of reservoir.
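A discrete-time empirical counterpart of that memory function is easy to set up: drive a linear leaky-integrator network with white noise and measure, for each delay k, the squared correlation between u(t-k) and its best linear reconstruction from the network state. The sketch below replaces the paper's continuous-time analytical treatment with a simulation; the network size, leak constant, and spectral scaling are assumptions.
```python
# Empirical memory function m(k) of a linear leaky-integrator reservoir:
# squared correlation between the delayed input u(t-k) and its optimal
# linear readout from the network state.
import numpy as np

rng = np.random.default_rng(1)
N, T, a = 100, 20000, 0.5                    # units, steps, leak step (assumed)
W = rng.normal(0, 1, (N, N)) / np.sqrt(N)
W *= 0.95 / np.abs(np.linalg.eigvals(W)).max()  # keep dynamics stable
w_in = rng.normal(0, 1, N)

u = rng.normal(0, 1, T)                      # white-noise input
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = (1 - a) * x + a * (W @ x + w_in * u[t])  # linear leaky integration
    X[t] = x

def memory(k):
    """Squared correlation of u(t-k) with its optimal linear readout."""
    Xs, ys = X[k:], u[:T - k]                # pair state at t with u(t-k)
    w = np.linalg.lstsq(Xs, ys, rcond=None)[0]
    return np.corrcoef(Xs @ w, ys)[0, 1] ** 2

print({k: round(memory(k), 3) for k in (0, 1, 2, 5, 10, 20)})
```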
Model for a flexible motor memory based on a self-active recurrent neural network.
TLDR
The model involves the concept of "neural outsourcing", the permanent shifting of computational load from higher- to lower-level neural structures, which might help to explain why humans are able to execute learned skills fluently and flexibly without attending to the details of the movement.
On the computational power of circuits of spiking neurons
TLDR
This article begins a rigorous mathematical analysis of the real-time computing capabilities of a new generation of models for neural computation, liquid state machines, which can be implemented with, and in fact benefit from, diverse computational units.
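Since the circuits analyzed throughout this page are built from integrate-and-fire units, a minimal version of that unit model may help orientation. The sketch below simulates a single leaky integrate-and-fire neuron under assumed time constants and thresholds; it is a textbook simplification, not the specific neuron model of any paper listed here.
```python
# Minimal leaky integrate-and-fire neuron: the membrane potential leaks
# toward zero, integrates the input current, and emits a spike (then
# resets) whenever it crosses threshold. All constants are illustrative.
import numpy as np

def lif(I, dt=1e-3, tau=0.02, v_th=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron driven by current I(t)."""
    v, spikes = 0.0, []
    for t, i_t in enumerate(I):
        v += dt / tau * (-v + i_t)           # leaky integration
        if v >= v_th:                        # threshold crossing -> spike
            spikes.append(t * dt)
            v = v_reset                      # reset after the spike
    return spikes

rng = np.random.default_rng(5)
I = 1.5 + 0.5 * rng.normal(size=500)         # noisy suprathreshold drive
print("first spike times (s):", [round(t, 3) for t in lif(I)[:5]])
```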
Minimal approach to neuro-inspired information processing
TLDR
By reducing the neuro-inspired reservoir computing approach to its bare essentials, it is found that nonlinear transient responses of the simple dynamical system enable the processing of information with excellent performance and at unprecedented speed.
Learning Universal Computations with Spikes
TLDR
This work derives constraints under which classes of spiking neural networks lend themselves as substrates for powerful general-purpose computing, and shows how spiking networks can build models of external world systems and use the acquired knowledge to control them.
Connectivity, Dynamics, and Memory in Reservoir Computing with Binary and Analog Neurons
TLDR
Investigating the influence of network connectivity (parameterized by the neuron in-degree) on a family of network models that interpolates between analog and binary networks reveals that the phase transition between ordered and chaotic network behavior in binary circuits qualitatively differs from the one in analog circuits, which leads to the decreased computational performance observed in densely connected binary circuits.
Fading memory and kernel properties of generic cortical microcircuit models
TLDR
This article proposes to analyze circuits of spiking neurons in terms of their roles as analog fading memory and non-linear kernels, rather than as implementations of specific computational operations and algorithms.
On Computational Power and the Order-Chaos Phase Transition in Reservoir Computing
TLDR
Analyses based, among other measures, on the Lyapunov exponent reveal that the phase transition between ordered and chaotic network behavior in binary circuits qualitatively differs from the one in analog circuits, which explains the observed decrease in the computational performance of binary circuits with high node in-degree.
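The Lyapunov-style analysis behind the last two entries can be approximated numerically: run two copies of the same network from states that differ in a single unit and watch whether the difference grows (chaos) or fades (order). The sketch below does this for analog (tanh) versus binary (sign) units at a fixed in-degree; the network model, in-degree, and weight scale are illustrative assumptions, not the papers' exact setup.
```python
# Crude perturbation-based estimate of state separation: flip one unit in
# the initial state and measure how far the two trajectories end up apart.
import numpy as np

rng = np.random.default_rng(2)
N, K, T = 500, 32, 50                        # units, in-degree, steps (assumed)

# Sparse connectivity: each unit receives exactly K random inputs.
W = np.zeros((N, N))
for i in range(N):
    idx = rng.choice(N, K, replace=False)
    W[i, idx] = rng.normal(0, 1.0 / np.sqrt(K), K)

def step(x, binary):
    pre = W @ x
    return np.sign(pre) if binary else np.tanh(pre)

for binary in (False, True):
    x1 = rng.uniform(-1, 1, N)
    x2 = x1.copy()
    x2[0] = -x2[0]                           # single-unit perturbation
    for _ in range(T):
        x1, x2 = step(x1, binary), step(x2, binary)
    sep = np.linalg.norm(x1 - x2) / np.sqrt(N)
    print("binary" if binary else "analog",
          "separation after", T, "steps:", round(float(sep), 3))
```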

References

Showing 1–10 of 43 references
Neural Systems as Nonlinear Filters
TLDR
This article gives a complete mathematical characterization of all filters that can be approximated by feedforward neural networks with dynamic synapses, and provides, for all nonlinear filters approximable by Volterra series, a new complexity hierarchy related to the cost of implementing such filters in neural systems.
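For readers unfamiliar with Volterra series: a time-invariant fading-memory filter can be approximated by a polynomial in delayed input values. The sketch below implements a second-order truncation over a finite memory window; the window length and the random kernels h1, h2 are illustrative placeholders, not filters from the paper.
```python
# Truncated (second-order) Volterra series filter over a finite window:
# y(t) = sum_i h1[i] u(t-i) + sum_{i,j} h2[i,j] u(t-i) u(t-j).
import numpy as np

rng = np.random.default_rng(4)
M = 5                                        # memory window length (assumed)
h1 = rng.normal(0, 1, M)                     # first-order kernel
h2 = rng.normal(0, 1, (M, M))                # second-order kernel

def volterra(u, t):
    """Evaluate the truncated Volterra filter at time t (t >= M-1)."""
    window = u[t - M + 1 : t + 1][::-1]      # u(t), u(t-1), ..., u(t-M+1)
    return h1 @ window + window @ h2 @ window

u = rng.normal(0, 1, 100)
print([round(volterra(u, t), 3) for t in range(M, 10)])
```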
Perspectives of the high-dimensional dynamics of neural microcircuits from the point of view of low-dimensional readouts
TLDR
It is demonstrated that pairs of readout neurons can transform the complex trajectory of transient states of a large neural circuit into a simple and clearly structured two-dimensional trajectory.
Analog Computation via Neural Networks
TLDR
It is noted that these networks are not likely to solve NP-hard problems in polynomial time, as the equality P = NP in the model implies the almost complete collapse of the standard polynomial hierarchy.
Learning and Extracting Finite State Automata with Second-Order Recurrent Neural Networks
TLDR
It is shown that a recurrent, second-order neural network using a real-time, forward training algorithm readily learns to infer small regular grammars from positive and negative string training samples, and many of the neural net state machines are dynamically stable, that is, they correctly classify many long unseen strings.
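The second-order cell used in that line of work makes the next state depend on products of the current state and the current input symbol, so each symbol effectively selects its own state-transition matrix. The sketch below shows the forward pass only, with assumed sizes and random weights; training and the state-clustering step that extracts the automaton are omitted.
```python
# Second-order recurrent cell:
# s_j(t+1) = sigmoid( sum_{i,k} W[j,i,k] * s_i(t) * x_k(t) ),
# where x is a one-hot encoding of the current input symbol.
import numpy as np

rng = np.random.default_rng(3)
n_states, n_symbols = 4, 2
W = rng.normal(0, 1, (n_states, n_states, n_symbols))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(s, symbol):
    x = np.eye(n_symbols)[symbol]            # one-hot input symbol
    return sigmoid(np.einsum('jik,i,k->j', W, s, x))

s = np.full(n_states, 0.5)                   # initial state
for sym in [0, 1, 1, 0, 1]:                  # an example binary string
    s = step(s, sym)
print("final state:", np.round(s, 3))
# Clustering the visited states then yields the extracted finite automaton.
```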
What is a moment? Transient synchrony as a collective mechanism for spatiotemporal integration.
  • J. Hopfield, C. Brody
  • Computer Science, Medicine
  • Proceedings of the National Academy of Sciences of the United States of America
  • 2001
TLDR
The principles behind the operation of a network of simple integrate-and-fire neurons containing output neurons selective for specific spatiotemporal input patterns are presented, and it is shown how the recognition is invariant to uniform time warp and uniform intensity change of the input events.
Gradient calculations for dynamic recurrent neural networks: a survey
TLDR
The author discusses advantages and disadvantages of temporally continuous neural networks in contrast to clocked ones and presents some "tricks of the trade" for training, using, and simulating continuous time and recurrent neural networks.
Temporal information transformed into a spatial code by a neural network with realistic properties
TLDR
It is demonstrated that known time-dependent neuronal properties enable a network to transform temporal information into a spatial code in a self-organizing manner, with no need to assume a spectrum of time delays or to custom-design the circuit.
On the Computational Power of Winner-Take-All
  • W. Maass
  • Computer Science, Medicine
  • Neural Computation
  • 2000
TLDR
The theoretical analysis shows that winner-take-all is a surprisingly powerful computational module in comparison with threshold gates (also referred to as McCulloch-Pitts neurons) and sigmoidal gates, and proves an optimal quadratic lower bound for computing winner-take-all in any feedforward circuit consisting of threshold gates.
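As a point of reference, the winner-take-all primitive analyzed there is easy to state: output 1 exactly for the k largest of n real inputs. The direct implementation below is a stand-in for the threshold-gate circuits studied in the paper, not a circuit construction from it.
```python
# k-winner-take-all: mark the k largest of n real-valued inputs.
import numpy as np

def k_wta(x, k=1):
    """Return a binary vector with 1s at the positions of the k largest inputs."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    out[np.argsort(x)[-k:]] = 1.0            # indices of the k largest values
    return out

print(k_wta([0.2, 1.5, 0.7, 1.1]))           # -> [0. 1. 0. 0.]
print(k_wta([0.2, 1.5, 0.7, 1.1], k=2))      # -> [0. 1. 0. 1.]
```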
Lower Bounds for the Computational Power of Networks of Spiking Neurons
  • W. Maass
  • Mathematics, Computer Science
  • Neural Computation
  • 1996
TLDR
It is shown that simple operations on phase differences between spike-trains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a small network of spiking neurons.
Synchrony Generation in Recurrent Networks with Frequency-Dependent Synapses
TLDR
It is found that the particular intensities of the external stimulus delivered to specific neurons were crucial for evoking population bursts; burst generation therefore depends on the spectrum of basal discharge rates across the population and not on the anatomical identity of the neurons, which was random.
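The frequency-dependent synapses in that model can be sketched with standard depressing-synapse dynamics: each presynaptic spike consumes a fraction U of the available synaptic resources x, which then recover exponentially with time constant tau_rec, so efficacy falls off at high firing rates. The parameter values below are illustrative, not fitted to the paper.
```python
# Depressing (frequency-dependent) synapse: per-spike efficacy of a train.
import numpy as np

def depressing_synapse(spike_times, U=0.5, tau_rec=0.8):
    """Return the relative efficacy U*x of each spike in a train."""
    x, last_t, eff = 1.0, None, []
    for t in spike_times:
        if last_t is not None:               # resource recovery since last spike
            x = 1.0 - (1.0 - x) * np.exp(-(t - last_t) / tau_rec)
        eff.append(U * x)                    # resources used by this spike
        x -= U * x
        last_t = t
    return eff

# A regular 20 Hz train: efficacy declines toward a steady state.
print([round(e, 3) for e in depressing_synapse(np.arange(0, 0.5, 0.05))])
```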