Information processing in echo state networks at the edge of chaos

@article{Boedecker2011InformationPI,
  title={Information processing in echo state networks at the edge of chaos},
  author={Joschka Boedecker and Oliver Obst and Joseph T. Lizier and N. Michael Mayer and Minoru Asada},
  journal={Theory in Biosciences},
  year={2011},
  volume={131},
  pages={205-213}
}
Abstract

We investigate information processing in randomly connected recurrent neural networks. It has been shown previously that the computational capabilities of these networks are maximized when the recurrent layer is close to the border between a stable and an unstable dynamics regime, the so-called edge of chaos. The reasons for this maximized performance, however, are not completely understood. We adopt an information-theoretical framework and are for the first time able to quantify the…
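The control parameter behind the stable/unstable distinction above is typically the scale of the recurrent weights. As a minimal illustrative sketch (our code and parameter choices, not the authors'), here is an echo state network in Python whose reservoir is rescaled to a target spectral radius; radii below 1 tend toward ordered dynamics, while values approaching and exceeding 1 move toward the edge of chaos:

```python
import numpy as np

def make_reservoir(n=100, spectral_radius=0.95, density=0.1, seed=0):
    """Random sparse recurrent matrix rescaled to a target spectral radius."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n, n)) * (rng.random((n, n)) < density)
    return W * (spectral_radius / max(abs(np.linalg.eigvals(W))))

def run_esn(W, u, w_in_scale=0.5, seed=1):
    """Drive tanh reservoir units with a 1-D input sequence u; return all states."""
    rng = np.random.default_rng(seed)
    w_in = rng.uniform(-w_in_scale, w_in_scale, size=W.shape[0])
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        x = np.tanh(W @ x + w_in * u_t)
        states.append(x.copy())
    return np.array(states)
```

The sketches accompanying later entries in this listing reuse make_reservoir and run_esn.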
Citations

Memory Capacity of Input-Driven Echo State Networks at the Edge of Chaos
TLDR
This paper takes a closer look at the short-term memory capacity introduced by Jaeger for echo state networks, and investigates the effect of reservoir sparsity in this context.
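Jaeger's short-term memory capacity sums, over delays k, the squared correlation between the delayed input u(t-k) and a linear readout trained to reconstruct it from the reservoir state. A sketch under that standard definition (the ridge parameter and sizes are our choices), reusing the helpers from the first sketch:

```python
import numpy as np
# Reuses make_reservoir / run_esn from the first sketch (our helpers).

def memory_capacity(W, T=4000, washout=200, max_delay=40, ridge=1e-6, seed=3):
    """Jaeger's MC = sum over delays k of r^2( u(t-k), y_k(t) ), where each
    y_k is a ridge-regression readout trained to reconstruct u(t-k)."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(-1, 1, T)
    X = run_esn(W, u)[washout:]          # states for t = washout .. T-1
    mc = 0.0
    for k in range(1, max_delay + 1):
        target = u[washout - k : T - k]  # u(t-k), aligned with the rows of X
        w = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ target)
        r = np.corrcoef(X @ w, target)[0, 1]
        mc += r ** 2
    return mc

print(memory_capacity(make_reservoir(spectral_radius=0.95)))
```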
Neuroevolution on the edge of chaos
TLDR
It is demonstrated that echo state networks with local connections combine the best of both worlds, the simplicity of random echo state networks and the performance of evolved networks, and it is shown that evolution tends to stay close to the ordered side of the edge of chaos.
Optimal Input Representation in Neural Systems at the Edge of Chaos
TLDR
It is concluded that operating near criticality can, besides the usually alleged virtues, also have the advantage of allowing for flexible, robust, and efficient input representations.
Evaluation of Information-Theoretic Measures in Echo State Networks on the Edge of Stability
  • M. Torda, I. Farkas
  • Computer Science
  • 2018 International Joint Conference on Neural Networks (IJCNN)
  • 2018
TLDR
This paper takes a closer look at information-theoretic measures of echo state networks, using the Kraskov-Stögbauer-Grassberger estimator with optimized parameters, and investigates the effect of reservoir orthogonalization, which has been shown earlier to maximize memory capacity, on prediction accuracy and the above-mentioned measures.
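The estimator referenced here is a k-nearest-neighbour mutual-information estimator; scikit-learn's mutual_info_regression implements a variant of it. A hedged sketch estimating how much information each reservoir unit carries about the previous input (a toy setup of ours, not the paper's experiment):

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

# Per-unit MI between reservoir state x_i(t) and the previous input u(t-1),
# estimated with scikit-learn's kNN (KSG-style) estimator. Illustrative only.
rng = np.random.default_rng(4)
u = rng.uniform(-1, 1, 2000)
X = run_esn(make_reservoir(spectral_radius=0.95), u)   # helpers from the first sketch
mi = mutual_info_regression(X[1:], u[:-1], n_neighbors=4, random_state=0)
print("mean per-unit MI (nats):", mi.mean())
```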
Optimal short-term memory before the edge of chaos in driven random recurrent networks
TLDR
The ability of discrete-time nonlinear recurrent neural networks to store time-varying small input signals is investigated with mean-field theory, and the network contribution to these short-term memory measures is shown to peak before the edge of chaos.
Optimal sequence memory in driven random networks
TLDR
This work investigates the effect of a time-varying input on the onset of chaos and the resulting consequences for information processing, finding an exact condition that determines the transition from stable to chaotic dynamics and the sequential memory capacity in closed form.
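Numerically, the stable-to-chaotic transition discussed in these entries is usually located by the sign of the largest Lyapunov exponent of the input-driven map. A rough perturbation-growth sketch (our parameter choices; it assumes the twin trajectories never coincide exactly), again reusing the helpers from the first sketch:

```python
import numpy as np

def largest_lyapunov(W, u, w_in, eps=1e-8, washout=200):
    """Largest Lyapunov exponent via a perturbed twin trajectory that is
    renormalized to separation eps after every step; > 0 indicates chaos.
    Assumes the separation never collapses to exactly zero."""
    x = np.zeros(W.shape[0])
    x_p = x.copy()
    x_p[0] += eps
    logs = []
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + w_in * u_t)
        x_p = np.tanh(W @ x_p + w_in * u_t)
        d = np.linalg.norm(x_p - x)
        if t >= washout:
            logs.append(np.log(d / eps))
        x_p = x + (x_p - x) * (eps / d)  # renormalize separation to eps
    return float(np.mean(logs))

rng = np.random.default_rng(5)
u = rng.uniform(-1, 1, 3000)
for rho in (0.8, 1.0, 1.3):          # sweep the spectral radius across the edge
    W = make_reservoir(spectral_radius=rho)
    w_in = rng.uniform(-0.5, 0.5, W.shape[0])
    print(rho, largest_lyapunov(W, u, w_in))
```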
Recurrence-based information processing in gene regulatory networks.
TLDR
The computational capabilities of the transcriptional regulatory networks of five evolutionarily distant organisms are studied, suggesting that recurrent nonlinear dynamics is a key element for the processing of complex time-dependent information by cells.
Investigating echo state networks dynamics by means of recurrence analysis
TLDR
This paper analyzes time series of neuron activations with recurrence plots (RPs) and recurrence quantification analysis (RQA), which make it possible to visualize and characterize high-dimensional dynamical systems.
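A recurrence plot simply thresholds the matrix of pairwise distances between trajectory points; RQA statistics are then computed from the structures in that binary matrix. A minimal numpy sketch (the threshold value is our choice):

```python
import numpy as np

def recurrence_plot(states, threshold=0.2):
    """R[i, j] = 1 where trajectory points i and j lie within `threshold`
    of each other (Euclidean distance); the raw material of RQA measures."""
    D = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    return (D < threshold).astype(int)

# e.g. on reservoir states from the first sketch, driven by a sine wave
R = recurrence_plot(run_esn(make_reservoir(), np.sin(0.2 * np.arange(300))))
```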
Determination of the Edge of Criticality in Echo State Networks Through Fisher Information Maximization
TLDR
This paper takes advantage of a recently developed nonparametric estimator of the Fisher information matrix and provides a method to determine the critical region of echo state networks (ESNs), a particular class of recurrent networks.
Recurrence-Based Information Processing in Gene Regulatory Networks
TLDR
It is proposed that cells can perform such dynamical information processing via the reservoir computing paradigm, suggesting that recurrent dynamics is a key element for the processing of complex time-dependent information by cells.

References

Showing 1-10 of 56 references
Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks
TLDR
It is shown that only near the critical boundary can recurrent networks of threshold gates perform complex computations on time series, which strongly supports conjectures that dynamical systems capable of complex computational tasks should operate near the edge of chaos.
Dynamical synapses causing self-organized criticality in neural networks
TLDR
It is demonstrated analytically and numerically that, by assuming (biologically more realistic) dynamical synapses in a spiking neural network, neuronal avalanches turn from an exceptional phenomenon into typical and robust self-organized critical behaviour, provided the total resources of neurotransmitter are sufficiently large.
SORN: A Self-Organizing Recurrent Neural Network
TLDR
This work introduces SORN, a self-organizing recurrent network that combines three distinct forms of local plasticity to learn spatio-temporal patterns in its input while maintaining its dynamics in a healthy regime suitable for learning.
Edge of chaos and prediction of computational performance for neural circuit models
TLDR
This article finds that the edge of chaos predicts quite well which values of circuit parameters yield maximal computational performance, while making no prediction for other parameter values, and it proposes a new method for predicting the computational performance of neural microcircuit models.
Connectivity, Dynamics, and Memory in Reservoir Computing with Binary and Analog Neurons
TLDR
Investigating the influence of network connectivity (parameterized by the neuron in-degree) on a family of network models that interpolates between analog and binary networks reveals that the phase transition between ordered and chaotic behavior in binary circuits qualitatively differs from the one in analog circuits, which explains the decreased computational performance observed in densely connected binary circuits.
Initialization and self-organized optimization of recurrent neural network connectivity
TLDR
A general network initialization method using permutation matrices is studied, and a new unsupervised learning rule based on intrinsic plasticity (IP) is derived to improve network performance in a self-organized way.
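A permutation matrix yields an orthogonal reservoir with exactly one nonzero entry per row, so every eigenvalue of the scaled matrix has the same magnitude and the distance to the instability boundary is set exactly. A minimal sketch of this initialization (the scale value is our choice):

```python
import numpy as np

def permutation_reservoir(n=100, scale=0.95, seed=6):
    """W = scale * P for a random permutation matrix P. P is orthogonal,
    so every eigenvalue of W has magnitude `scale`, which fixes the
    distance to instability exactly."""
    rng = np.random.default_rng(seed)
    W = np.zeros((n, n))
    W[np.arange(n), rng.permutation(n)] = scale
    return W

# can be driven with run_esn from the first sketch
states = run_esn(permutation_reservoir(), np.sin(0.2 * np.arange(500)))
```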
Computation at the edge of chaos: phase transitions and emergent computation
TLDR
There is a fundamental connection between computation and phase transitions, especially second-order or "critical" transitions; some of the implications for our understanding of nature, should such a connection be borne out, are discussed.
Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
TLDR
A new computational model for real-time computing on time-varying input is presented that provides an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
The Information Dynamics of Phase Transitions in Random Boolean Networks
TLDR
This work uses a recently published framework to characterize distributed computation in terms of its underlying information dynamics (information storage, information transfer, and information modification), and finds maximizations in information storage and coherent information transfer on either side of the critical point.
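Transfer entropy, one of the information-dynamics measures in that framework, is the information the source's current state provides about the target's next state beyond what the target's own past already provides. A naive plug-in estimator for binary sequences (our sketch; dedicated toolkits such as JIDT add bias correction and richer history handling):

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, k=1):
    """Plug-in TE(source -> target) in bits for binary sequences,
    with target history length k."""
    triples = Counter()
    for t in range(k - 1, len(target) - 1):
        past = tuple(target[t - k + 1 : t + 1])   # target's last k states
        triples[(target[t + 1], source[t], past)] += 1
    src_past, nxt_past, past_cnt = Counter(), Counter(), Counter()
    for (nxt, src, past), c in triples.items():
        src_past[(src, past)] += c
        nxt_past[(nxt, past)] += c
        past_cnt[past] += c
    n = sum(triples.values())
    te = 0.0
    for (nxt, src, past), c in triples.items():
        # p(next | src, past) / p(next | past), written with raw counts
        te += (c / n) * np.log2(c * past_cnt[past]
                                / (src_past[(src, past)] * nxt_past[(nxt, past)]))
    return te

rng = np.random.default_rng(7)
src = rng.integers(0, 2, 5000)
tgt = np.roll(src, 1)               # target copies the source with lag 1
print(transfer_entropy(src, tgt))   # close to 1 bit
print(transfer_entropy(tgt, src))   # close to 0 bits
```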
Information flow in local cortical networks is not democratic
TLDR
Surprisingly, the analysis revealed wide differences in the amount of information flowing into and out of different neurons in the network, indicating that information flow is not "democratically" distributed.