Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication

@article{jaeger2004harnessing,
  title={Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication},
  author={Herbert Jaeger and Harald Haas},
  journal={Science},
  volume={304},
  number={5667},
  pages={78--80},
  year={2004}
}
We present a method for learning nonlinear systems, echo state networks (ESNs). ESNs employ artificial recurrent neural networks in a way that has recently been proposed independently as a learning mechanism in biological brains. The learning method is computationally efficient and easy to use. On a benchmark task of predicting a chaotic time series, accuracy is improved by a factor of 2400 over previous techniques. The potential for engineering applications is illustrated by equalizing a… 
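The key idea behind the paper, that only a linear readout is trained on top of a fixed random recurrent "reservoir", can be illustrated with a minimal sketch. The reservoir size, input signal, and ridge-regression regularizer below are illustrative assumptions, not the paper's actual experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: predict a sine signal one step ahead.
N = 200          # reservoir size (illustrative choice)
T = 1000         # training steps
washout = 100    # discard initial transient states

# Fixed random input and reservoir weights; the spectral radius is
# scaled below 1 so the network has the "echo state" (fading memory)
# property.
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

u = np.sin(0.2 * np.arange(T + 1))      # input sequence
X = np.zeros((T, N))                    # collected reservoir states
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])    # reservoir state update
    X[t] = x

# Only the linear readout is trained (ridge regression), which is what
# makes ESN training cheap compared with backpropagation through time.
Y = u[1:T + 1]                          # one-step-ahead targets
A, b = X[washout:], Y[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-8 * np.eye(N), A.T @ b)

rmse = float(np.sqrt(np.mean((A @ W_out - b) ** 2)))
print(rmse)                             # small training error
```

The spectral-radius rescaling is the step that makes the recurrent dynamics a fading memory of the input, so the readout can combine many delayed, nonlinearly mixed versions of the signal.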
Reinforcement Learning in a Large Scale Photonic Network
Recurrent Neural Networks are nonlinear dynamical systems, and as such they show excellent performance in the prediction of chaotic trajectories or in the equalization of nonlinearly corrupted communication channels.
Physics-Informed Echo State Networks for Chaotic Systems Forecasting
The proposed framework shows the potential of using machine learning combined with prior physical knowledge to improve the time-accurate prediction of chaotic dynamical systems.
Long-term prediction of chaotic systems with recurrent neural networks
A scheme incorporating time-dependent but sparse data inputs into reservoir computing is articulated, and it is demonstrated that such rare "updates" of the actual state practically enable an arbitrarily long prediction horizon for a variety of chaotic systems.
Training Echo State Neural Network Using Harmony Search Algorithm
Simulation results show that HS-ESN is by far the fastest algorithm for training ESNs, whereas the HSRLS-ESN algorithm can effectively meet the output-precision requirements.
Synchronization of chaotic systems and their machine-learning models.
This work finds that a well-trained reservoir computer can synchronize with the chaotic system it has learned when the two are linked by a common signal, and shows that by sending just a scalar signal one can achieve synchronization in trained reservoir computers and a cascading synchronization among chaotic systems and their fitted reservoir computers.
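The underlying common-signal mechanism is not specific to reservoir computers. As a stand-in for the paper's reservoir-based scheme, the classic Pecora–Carroll drive–response construction (an illustrative assumption, not the paper's method) shows how a single scalar signal can synchronize two chaotic systems:

```python
import numpy as np

# Pecora–Carroll synchronization on the Lorenz system: a response copy
# of the (y, z) subsystem receives only the scalar signal x(t) from the
# drive system, yet converges to the drive's trajectory.
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
dt, steps = 0.005, 20000

x, y, z = 1.0, 1.0, 1.0        # drive system state
yr, zr = -5.0, 20.0            # response subsystem, different start

for _ in range(steps):
    # drive: full Lorenz system
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    # response: the (y, z) equations driven by the shared scalar x
    dyr = x * (rho - zr) - yr
    dzr = x * yr - beta * zr
    # simple Euler updates (both subsystems see the identical x)
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    yr, zr = yr + dt * dyr, zr + dt * dzr

sync_error = abs(y - yr) + abs(z - zr)
print(sync_error)              # decays toward zero
```

Because the error dynamics of the driven subsystem are contracting, the response forgets its initial condition and locks onto the drive, which is the same qualitative effect the paper obtains with a trained reservoir in place of an exact model copy.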
FPGA-Based Stochastic Echo State Networks for Time-Series Forecasting
This work shows a new approach to implementing RC systems with digital gates, using probabilistic computing concepts to reduce the hardware required for the arithmetic operations; the result is a highly functional system with low hardware resource usage.
Learning Ergodic Averages in Chaotic Systems
A physics-informed machine learning method predicts the time average of a chaotic attractor using a hybrid echo state network (hESN); the inclusion of a physical model significantly improves prediction accuracy, reducing the relative error from 48% to 1%.
State Noise Effects on the Stochastic Gradient Descent Optimization Method for Echo State Networks with Leaky Integrator Neurons
The research is an empirical study of the stochastic gradient descent method that leads to a better understanding of that optimization approach, its strengths, and its limitations, enabling more efficient application of ESNs to basic signal-processing and control tasks.
Harnessing Non-linearity by Sigmoid-wavelet Hybrid Echo State Networks (SWHESN)
To expand the internal spatial spectrum of the ESN, this method transforms the original ESN into a SWHESN (sigmoid-wavelet hybrid ESN), amplifying its memory capacity (MC) while retaining its nonlinear features by injecting tuned wavelet neurons (wavelons).


Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
A new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks; it rests on principles of high-dimensional dynamical systems combined with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
'Neural-gas' network for vector quantization and its application to time-series prediction
It is shown that the dynamics of the reference (weight) vectors during the input-driven adaptation procedure are determined by the gradient of an energy function whose shape can be modulated through a neighborhood-determining parameter, and resemble the dynamics of Brownian particles moving in a potential determined by the data point density.
A Learning Algorithm for Continually Running Fully Recurrent Neural Networks
The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal…
A new evolutionary system for evolving artificial neural networks
EPNet has been tested on a number of benchmark problems in machine learning and ANNs; the experimental results show that it can produce very compact ANNs with good generalization ability in comparison with other algorithms.
Neural networks that learn temporal sequences by selection.
A network architecture composed of three layers of neuronal clusters is shown to exhibit active recognition and learning of time sequences by selection: the network spontaneously produces prerepresentations that are selected according to their resonance with the input percepts.
Oscillation and chaos in physiological control systems.
First-order nonlinear differential-delay equations describing physiological control systems are studied; they display a broad diversity of dynamical behavior, including limit-cycle oscillations with a variety of waveforms and apparently aperiodic or "chaotic" solutions.
Dynamical Working Memory and Timed Responses: The Role of Reverberating Loops in the Olivo-Cerebellar System
It is proposed that the irregularity observed in the firing pattern of the IO neurons is not necessarily produced by noise but can instead be the result of a purely deterministic network effect that can serve as a dynamical working memory or as a neuronal clock with a characteristic timescale of about 100 milliseconds.
In this paper we describe the winning entry of the time-series prediction competition which was part of the International Workshop on Advanced Black-Box Techniques for Nonlinear Modeling, held at K.