The "echo state" approach to analysing and training recurrent neural networks
The report introduces a constructive learning algorithm for recurrent neural networks, which modifies only the weights to output units in order to achieve the learning task.
Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication
We present a method for learning nonlinear systems, echo state networks (ESNs). ESNs employ artificial recurrent neural networks in a way that has recently been proposed independently as a learning …
Reservoir computing approaches to recurrent neural network training
TLDR
This review systematically surveys both current ways of generating/adapting the reservoirs and training different types of readouts, and offers a natural conceptual classification of the techniques, which transcends boundaries of the current "brand names" of reservoir methods.
A tutorial on training recurrent neural networks, covering BPPT, RTRL, EKF and the "echo state network" approach
TLDR
This tutorial is a worked-out version of a 5-hour course originally held at AIS in September/October 2002, and contains a mathematically-oriented crash course on traditional training methods for recurrent neural networks, covering back-propagation through time, real-time recurrent learning (RTRL), and extended Kalman filtering approaches (EKF).
Adaptive Nonlinear System Identification with Echo State Networks
  • H. Jaeger
  • Computer Science, Mathematics
    NIPS
  • 2002
TLDR
An online adaptation scheme based on the RLS algorithm known from adaptive linear systems is described; as an example, a 10th-order NARMA system is adaptively identified.
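The idea summarized above, training only the linear readout of an echo state network online with recursive least squares (RLS), can be sketched as follows. This is a minimal illustration, not the paper's exact setup: reservoir size, scaling constants, the forgetting factor, and the clipping safeguard on the NARMA-10 series are all assumptions chosen for a small, stable demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny echo state network: fixed random reservoir, trainable linear readout.
n_res = 50
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius to 0.9
W_in = rng.uniform(-0.1, 0.1, (n_res, 1))

# NARMA-10 benchmark series (illustrative parameters; values clipped to [0, 1]
# as a practical safeguard against the benchmark's occasional divergence).
T = 500
u = rng.uniform(0, 0.5, T)
y = np.zeros(T)
for t in range(9, T - 1):
    y[t + 1] = min(1.0, 0.3 * y[t] + 0.05 * y[t] * y[t - 9:t + 1].sum()
                   + 1.5 * u[t - 9] * u[t] + 0.1)

# RLS adaptation of the readout weights w_out, one sample at a time.
lam = 0.999                      # forgetting factor
P = np.eye(n_res) / 1e-2         # inverse-correlation estimate, large init
w_out = np.zeros(n_res)
x = np.zeros(n_res)
errs = []
for t in range(T - 1):
    x = np.tanh(W @ x + W_in @ u[t:t + 1])   # reservoir state update
    e = y[t + 1] - w_out @ x                 # a-priori prediction error
    k = P @ x / (lam + x @ P @ x)            # RLS gain vector
    w_out += k * e
    P = (P - np.outer(k, x @ P)) / lam
    errs.append(e * e)

print(np.mean(errs[:50]) > np.mean(errs[-50:]))  # error shrinks as w_out adapts
```

Only `w_out` is trained; the reservoir matrices `W` and `W_in` stay fixed throughout, which is the defining trait of the echo state approach.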
Optimization and applications of echo state networks with leaky-integrator neurons
TLDR
Stability conditions are presented, a stochastic gradient descent method is introduced, and the usefulness of leaky-integrator ESNs is demonstrated for learning very slow dynamic systems and replaying the learnt system at different speeds.
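The leaky-integrator unit referred to above replaces the standard state update with a convex mix of the old state and the new activation, so a small leak rate slows the state dynamics. A minimal sketch (network size, weight scales, and the leak rates are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
W = rng.normal(0, 1.0 / np.sqrt(n), (n, n))    # fixed random reservoir
W_in = rng.uniform(-0.5, 0.5, (n, 1))

def step(x, u, leak):
    # Leaky-integrator update: x(t+1) = (1 - leak) x(t) + leak tanh(W x + W_in u)
    return (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u)

u = np.array([1.0])
x0 = np.zeros(n)
x_fast = step(x0, u, leak=1.0)   # leak = 1 recovers the ordinary sigmoid unit
x_slow = step(x0, u, leak=0.1)   # small leak: 10x smaller state change per step

ratio = np.linalg.norm(x_slow) / np.linalg.norm(x_fast)
print(ratio)  # ≈ 0.1
```

From a zero state the leaky unit moves exactly `leak` times as far as the ordinary unit in one step, which is the mechanism that lets such reservoirs track very slow target dynamics.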
Observable Operator Models for Discrete Stochastic Time Series
  • H. Jaeger
  • Mathematics, Medicine
    Neural Computation
  • 1 June 2000
TLDR
A novel, simple characterization of linearly dependent processes, called observable operator models, is provided, which leads to a constructive learning algorithm for the identification of linearly dependent processes.
Controlling Recurrent Neural Networks by Conceptors
TLDR
A mechanism of neurodynamical organization, called conceptors, is proposed, which unites nonlinear dynamics with basic principles of conceptual abstraction and logic, and helps explain how conceptual-level information processing emerges naturally and robustly in neural systems.
Re-visiting the echo state property
TLDR
Analytical examples are used to show that a widely used criterion for the ESP, the spectral radius of the weight matrix being smaller than unity, is not sufficient to satisfy the echo state property.
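The gap between the two classical criteria can be seen numerically: the widely used test checks the spectral radius ρ(W), while the known sufficient condition bounds the largest singular value σ_max(W). The matrix below (chosen here purely for illustration; it is not the paper's counterexample) passes the spectral-radius test yet allows a single tanh step to expand the distance between two nearby states:

```python
import numpy as np

# Popular criterion: rho(W) < 1.  Sufficient condition: sigma_max(W) < 1.
W = np.array([[0.0, 2.0],
              [0.0, 0.0]])
rho = max(abs(np.linalg.eigvals(W)))           # spectral radius = 0
sigma = np.linalg.svd(W, compute_uv=False)[0]  # largest singular value = 2

# One tanh step expands the distance between two close states:
a = np.array([0.0,  0.01])
b = np.array([0.0, -0.01])
d0 = np.linalg.norm(a - b)                                 # 0.02
d1 = np.linalg.norm(np.tanh(W @ a) - np.tanh(W @ b))       # ≈ 0.04
print(rho < 1, sigma > 1, d1 > d0)  # True True True
```

Scaling the reservoir so that ρ(W) < 1 therefore does not by itself certify the contraction that the echo state property requires; σ_max(W) < 1 does, at the cost of being conservative.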
Echo state network
  • H. Jaeger
  • Computer Science, Physics
    Scholarpedia
  • 6 September 2007