Echo State Gaussian Process

  • Sotirios P. Chatzis, Y. Demiris
  • Published 1 September 2011
  • Computer Science
  • IEEE Transactions on Neural Networks
Echo state networks (ESNs) constitute a novel approach to recurrent neural network (RNN) training, with an RNN (the reservoir) being generated randomly, and only a readout being trained using a simple computationally efficient algorithm. ESNs have greatly facilitated the practical application of RNNs, outperforming classical approaches on a number of benchmark tasks. In this paper, we introduce a novel Bayesian approach toward ESNs, the echo state Gaussian process (ESGP). The ESGP combines the… 
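The pipeline the abstract describes (a fixed random reservoir plus a trained linear readout) can be sketched in a few lines of numpy; the sizes, the sine-prediction task, and the ridge parameter below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: one-step-ahead prediction of a sine wave.
T, n_res = 500, 100
u = np.sin(0.2 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

# Random reservoir, rescaled so its spectral radius is below 1
# (a common heuristic for obtaining the echo state property).
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=n_res)

# Drive the reservoir; its weights are never trained.
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * inputs[t])
    states[t] = x

# Only the linear readout is trained, here via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ targets)
pred = states @ W_out
mse = np.mean((pred - targets) ** 2)  # small training error
```

The readout training is a single linear solve, which is what makes ESNs so much cheaper than classical RNN training.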


Multilayered Echo State Machine: A Novel Architecture and Algorithm

The addition of multiple layers of reservoirs is shown to provide a more robust alternative to conventional RC networks, and the comparative merits of this approach are demonstrated in a number of applications.

Iterative temporal learning and prediction with the sparse online echo state gaussian process

  • Harold Soh, Y. Demiris
  • Computer Science
    The 2012 International Joint Conference on Neural Networks (IJCNN)
  • 2012
This work contributes the online echo state Gaussian process (OESGP), a novel Bayesian online method capable of iteratively learning complex temporal dynamics and producing predictive distributions (instead of point predictions), and characterises the benefits and drawbacks of the online methods considered.
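The key difference from a point-prediction readout is that a Gaussian process readout returns a full predictive distribution. A minimal sketch of exact GP regression with a squared-exponential kernel and fixed hyperparameters (all sizes and values are illustrative; this is not the OESGP's sparse online update):

```python
import numpy as np

def rbf(A, B, ell=1.0):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))            # stand-in for reservoir states
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=50)

noise = 0.05**2
K = rbf(X, X) + noise * np.eye(50)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

Xs = rng.normal(size=(5, 3))            # test states
Ks = rbf(Xs, X)
mean = Ks @ alpha                                 # predictive mean
v = np.linalg.solve(L, Ks.T)
var = rbf(Xs, Xs).diagonal() - (v**2).sum(0)      # predictive variance
```

Each test point gets a mean and a variance, so downstream decisions can weigh the model's uncertainty rather than trusting a point estimate.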

Subspace Echo State Network for Multivariate Time Series Prediction

A new approach to ESNs, termed FSDESN, is introduced; it combines the merits of ESNs and a fast subspace decomposition algorithm to provide a more precise alternative to conventional ESNs.

Modeling deterministic echo state network with loop reservoir

This paper proposes a simple deterministic ESN with a loop reservoir and proves that all linear ESNs with the simplest loop reservoir possess the same memory capacity, which converges arbitrarily close to the optimal value.

Nonlinear System Modeling With Random Matrices: Echo State Networks Revisited

It is shown that the state transition mapping is contractive with high probability when only the necessary condition is satisfied, which corroborates and thus analytically explains the observation that in practice one obtains echo states when the spectral radius of the reservoir weight matrix is smaller than 1.
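The observation can be reproduced empirically: scale a random reservoir matrix to spectral radius below 1, then drive two different initial states with the same input sequence; if the network has echo states, the trajectories forget their initial conditions and converge. A hypothetical numpy check (all sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, steps = 200, 500
W = rng.normal(size=(n, n))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius = 0.8
W_in = rng.uniform(-1, 1, size=n)
u = rng.normal(size=steps)

# Two different initial states, identical input drive.
x1, x2 = rng.normal(size=n), rng.normal(size=n)
for t in range(steps):
    x1 = np.tanh(W @ x1 + W_in * u[t])
    x2 = np.tanh(W @ x2 + W_in * u[t])

gap = np.linalg.norm(x1 - x2)  # small: initial conditions are forgotten
```

Spectral radius below 1 is not a sufficient condition in general, which is exactly why the paper's high-probability contraction analysis is of interest; in practice, as here, it almost always works.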

Predicting Multivariate Time Series Using Subspace Echo State Network

The core of the model is to use a fast subspace decomposition algorithm to extract a compact subspace from a redundant large-scale reservoir matrix, removing approximately collinear components, overcoming the ill-posed problem, and improving generalization performance.
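The subspace-extraction idea can be illustrated with a truncated SVD of a synthetic, rank-deficient state matrix; the sizes and the relative threshold below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)
T, n = 400, 120
# Synthetic "reservoir state" matrix of true rank 20, plus small noise,
# so most of its 120 columns are approximately collinear.
S = rng.normal(size=(T, 20)) @ rng.normal(size=(20, n))
S += 1e-3 * rng.normal(size=(T, n))

# Truncated SVD: keep the dominant subspace and discard near-collinear
# directions that would make the readout regression ill-posed.
U, s, Vt = np.linalg.svd(S, full_matrices=False)
k = int(np.sum(s > 1e-2 * s[0]))   # effective rank by relative threshold
S_k = U[:, :k] * s[:k]             # compact state representation
```

Regressing the readout on `S_k` instead of `S` operates on a well-conditioned basis of the same signal subspace.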

Design of sparse Bayesian echo state network for time series prediction

The proposed SBESN estimates the probability distribution of the outputs and trains the network through sparse Bayesian learning, in which an independent regularization prior is applied to each weight rather than one prior shared by all weights.
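The per-weight priors referred to here are the automatic relevance determination (ARD) priors of sparse Bayesian learning: each weight gets its own Gaussian prior precision, and precisions of irrelevant weights are driven to infinity. A minimal sketch using MacKay-style fixed-point updates on synthetic data (an illustration of the principle, not the SBESN training procedure itself):

```python
import numpy as np

rng = np.random.default_rng(4)
N, D = 200, 10
X = rng.normal(size=(N, D))
w_true = np.zeros(D)
w_true[:3] = [2.0, -1.0, 0.5]        # only 3 of 10 weights are relevant
y = X @ w_true + 0.1 * rng.normal(size=N)

alpha = np.ones(D)                   # one prior precision PER weight (ARD)
beta = 1.0 / 0.1**2                  # noise precision assumed known here
for _ in range(50):
    Sigma = np.linalg.inv(beta * X.T @ X + np.diag(alpha))
    mu = beta * Sigma @ X.T @ y
    gamma = 1.0 - alpha * np.diag(Sigma)   # "well-determinedness" of each weight
    alpha = gamma / (mu**2 + 1e-12)        # fixed-point re-estimation

# mu: posterior mean weights; irrelevant ones shrink toward exactly 0.
```

With a single shared prior, all weights would be shrunk uniformly; the independent priors are what produce sparsity.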

Spatio-Temporal Learning With the Online Finite and Infinite Echo-State Gaussian Processes

This work concerns adaptive systems that operate in environments where data arrive sequentially and are multivariate in nature, for example, sensory streams in robotic systems, with particular attention to problems that contain irrelevant dimensions.

Reservoir computing approaches to recurrent neural network training

Minimum Complexity Echo State Network

It is shown that a simple deterministically constructed cycle reservoir is comparable to the standard echo state network methodology, and that the (short-term) memory capacity of linear cyclic reservoirs can be made arbitrarily close to the proved optimal value.
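A cycle reservoir of the kind this paper studies is fully deterministic: a single shared weight on a ring of units, with no random generation at all. A small sketch (the ring size and weight value are chosen arbitrarily):

```python
import numpy as np

def cycle_reservoir(n, r):
    """Deterministic cycle reservoir: unit i feeds unit (i+1) mod n
    with one shared weight r, forming a single ring."""
    W = np.zeros((n, n))
    for i in range(n):
        W[(i + 1) % n, i] = r
    return W

W = cycle_reservoir(5, 0.8)
# The eigenvalues of a weighted n-cycle are r times the n-th roots of
# unity, so the spectral radius is exactly |r|.
rho = np.max(np.abs(np.linalg.eigvals(W)))
```

This makes the echo state property trivially controllable: choose |r| < 1 and the spectral radius is |r| by construction.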

An experimental unification of reservoir computing methods

An Augmented Echo State Network for Nonlinear Adaptive Filtering of Complex Noncircular Signals

A novel complex echo state network (ESN), utilizing full second-order statistical information in the complex domain, is introduced. This is achieved through the use of so-called augmented complex statistics.

Support Vector Echo-State Machine for Chaotic Time-Series Prediction

A novel chaotic time-series prediction method based on support vector machines (SVMs) and echo-state mechanisms is proposed; its generalization ability and robustness are obtained through a regularization operator and a robust loss function.

Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

A new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks, based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.

Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication

We present a method for learning nonlinear systems, echo state networks (ESNs). ESNs employ artificial recurrent neural networks in a way that has recently been proposed independently as a learning mechanism in biological brains.

Gaussian Process Dynamical Models for Human Motion

This work marginalizes out the model parameters in closed form by using Gaussian process priors for both the dynamical and the observation mappings, resulting in a nonparametric model for dynamical systems that accounts for uncertainty in the model.