# Information processing in echo state networks at the edge of chaos

@article{Boedecker2011InformationPI, title={Information processing in echo state networks at the edge of chaos}, author={J. Boedecker and Oliver Obst and J. Lizier and N. Mayer and M. Asada}, journal={Theory in Biosciences}, year={2011}, volume={131}, pages={205-213} }

We investigate information processing in randomly connected recurrent neural networks. It has been shown previously that the computational capabilities of these networks are maximized when the recurrent layer is close to the border between stable and unstable dynamics regimes, the so-called edge of chaos. The reasons for this maximized performance, however, are not completely understood. We adopt an information-theoretical framework and are for the first time able to quantify the…
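To make the "edge of chaos" notion concrete: in echo state networks, the random reservoir weight matrix is typically rescaled so that its spectral radius lies just below 1, placing the dynamics near the boundary between the ordered and chaotic regimes. The following minimal NumPy sketch (with hypothetical parameter choices; this is not the authors' code) illustrates that rescaling and the driven reservoir update:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n=100, density=0.1, spectral_radius=0.95):
    """Random sparse reservoir rescaled to a target spectral radius.

    A spectral radius just below 1.0 places the autonomous dynamics
    near the order/chaos boundary discussed in the paper.
    """
    W = rng.standard_normal((n, n))
    W *= rng.random((n, n)) < density          # sparsify the matrix
    rho = np.max(np.abs(np.linalg.eigvals(W)))  # current spectral radius
    return W * (spectral_radius / rho)

def run_reservoir(W, inputs, w_in_scale=0.5):
    """Drive the reservoir with a 1-D input sequence (tanh units)."""
    n = W.shape[0]
    w_in = w_in_scale * rng.standard_normal(n)  # random input weights
    x = np.zeros(n)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + w_in * u)           # leakless ESN update
        states.append(x.copy())
    return np.array(states)

W = make_reservoir()
states = run_reservoir(W, rng.standard_normal(200))
```

In a full ESN, a linear readout would then be trained on the collected `states`; sweeping `spectral_radius` across 1.0 is the usual way to probe performance on either side of the transition.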

#### 158 Citations

Memory Capacity of Input-Driven Echo State Networks at the Edge of Chaos

- Computer Science
- ICANN
- 2014

This paper takes a closer look at short-term memory capacity, introduced by Jaeger for echo state networks, and investigates the effect of reservoir sparsity in this context.

Neuroevolution on the edge of chaos

- Computer Science
- GECCO
- 2017

It is demonstrated that echo state networks with local connections combine the best of both worlds: the simplicity of random echo state networks and the performance of evolved networks. It is also shown that evolution tends to stay close to the ordered side of the edge of chaos.

Optimal Input Representation in Neural Systems at the Edge of Chaos

- Physics, Computer Science
- Biology
- 2021

It is concluded that operating near criticality can, besides its usually alleged virtues, also allow for flexible, robust, and efficient input representations.

Evaluation of Information-Theoretic Measures in Echo State Networks on the Edge of Stability

- Computer Science
- 2018 International Joint Conference on Neural Networks (IJCNN)
- 2018

This paper takes a closer look at information-theoretic measures of echo state networks, using the Kraskov-Stögbauer-Grassberger estimator with optimized parameters, and investigates how reservoir orthogonalization, shown previously to maximize memory capacity, affects prediction accuracy and the above-mentioned measures.

Optimal short-term memory before the edge of chaos in driven random recurrent networks

- Computer Science, Physics
- Physical Review E
- 2019

The ability of discrete-time nonlinear recurrent neural networks to store time-varying small input signals is investigated with mean-field theory, and it is shown that the network's contribution to these short-term memory measures peaks before the edge of chaos.

Optimal sequence memory in driven random networks

- Computer Science
- 2016

This work investigates the effect of a time-varying input on the onset of chaos and the resulting consequences for information processing, finding an exact condition that determines the transition from stable to chaotic dynamics, as well as the sequential memory capacity in closed form.

Recurrence-based information processing in gene regulatory networks.

- Computer Science, Medicine
- Chaos
- 2018

The computational capabilities of the transcriptional regulatory networks of five evolutionarily distant organisms are studied, suggesting that recurrent nonlinear dynamics is a key element for the processing of complex time-dependent information by cells.

Investigating echo state networks dynamics by means of recurrence analysis

- Computer Science, Physics
- IEEE Transactions on Neural Networks and Learning Systems
- 2018

This paper analyzes time series of neuron activations with recurrence plots (RPs) and recurrence quantification analysis (RQA), which make it possible to visualize and characterize high-dimensional dynamical systems.

Determination of the Edge of Criticality in Echo State Networks Through Fisher Information Maximization

- Physics, Computer Science
- IEEE Transactions on Neural Networks and Learning Systems
- 2018

This paper takes advantage of a recently developed nonparametric estimator of the Fisher information matrix and provides a method to determine the critical region of echo state networks (ESNs), a particular class of recurrent networks.

Recurrence-Based Information Processing in Gene Regulatory Networks

- Computer Science, Biology
- 2014

It is proposed that cells can perform such dynamical information processing via the reservoir computing paradigm, suggesting that recurrent dynamics is a key element for the processing of complex time-dependent information by cells.

#### References

Showing 1-10 of 56 references

Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks

- Computer Science, Medicine
- Neural Computation
- 2004

It is shown that only near the critical boundary can recurrent networks of threshold gates perform complex computations on time series, which strongly supports the conjecture that dynamical systems capable of complex computational tasks should operate near the edge of chaos.

Dynamical synapses causing self-organized criticality in neural networks

- Physics, Biology
- 2007

It is demonstrated analytically and numerically that assuming (biologically more realistic) dynamical synapses in a spiking neural network turns neuronal avalanches from an exceptional phenomenon into typical and robust self-organized critical behaviour, provided the total resources of neurotransmitter are sufficiently large.

SORN: A Self-Organizing Recurrent Neural Network

- Computer Science, Medicine
- Front. Comput. Neurosci.
- 2009

This work introduces SORN, a self-organizing recurrent network that combines three distinct forms of local plasticity to learn spatio-temporal patterns in its input while maintaining its dynamics in a healthy regime suitable for learning.

Edge of chaos and prediction of computational performance for neural circuit models

- Computer Science, Medicine
- Neural Networks
- 2007

This article finds that the edge of chaos predicts quite well those circuit parameter values that yield maximal computational performance, although it makes no prediction for other parameter values, and proposes a new method for predicting the computational performance of neural microcircuit models.

Connectivity, Dynamics, and Memory in Reservoir Computing with Binary and Analog Neurons

- Computer Science, Mathematics
- Neural Computation
- 2010

Investigating the influence of network connectivity (parameterized by the neuron in-degree) on a family of network models that interpolates between analog and binary networks reveals that the phase transition between ordered and chaotic behavior in binary circuits differs qualitatively from the one in analog circuits, which explains the decreased computational performance observed in densely connected binary circuits.

Initialization and self-organized optimization of recurrent neural network connectivity

- Computer Science, Medicine
- HFSP Journal
- 2009

A general network initialization method using permutation matrices is studied, and a new unsupervised learning rule based on intrinsic plasticity (IP) is derived to improve network performance in a self-organized way.

Computation at the edge of chaos: phase transitions and emergent computation

- Computer Science, Physics
- 1990

There is a fundamental connection between computation and phase transitions, especially second-order or "critical" transitions; some of the implications for our understanding of nature, if such a connection is borne out, are discussed.

Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

- Computer Science, Medicine
- Neural Computation
- 2002

A new computational model for real-time computing on time-varying input is presented that provides an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems combined with statistical learning theory, and can be implemented on generic evolved or found recurrent circuitry.

The Information Dynamics of Phase Transitions in Random Boolean Networks

- Computer Science
- ALIFE
- 2008

This work uses a recently published framework to characterize distributed computation in terms of its underlying information dynamics: information storage, information transfer, and information modification. It finds maximizations in information storage and coherent information transfer on either side of the critical point.

Information flow in local cortical networks is not democratic

- Computer Science
- BMC Neuroscience
- 2008

Surprisingly, the analysis revealed wide differences in the amount of information flowing into and out of different neurons in the network, indicating that information flow is not "democratically" distributed.