Reservoir computing approaches to recurrent neural network training

@article{Lukoeviius2009ReservoirCA,
  title={Reservoir computing approaches to recurrent neural network training},
  author={Mantas Luko{\v{s}}evi{\v{c}}ius and Herbert Jaeger},
  journal={Comput. Sci. Rev.},
  year={2009},
  volume={3},
  pages={127--149}
}
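The paradigm this paper surveys keeps the recurrent part of the network fixed and random (the "reservoir") and trains only a linear readout. A minimal NumPy sketch of that idea, where all sizes, the spectral-radius scaling of 0.9, and the sine toy task are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not taken from the paper
n_in, n_res, washout = 1, 100, 50

# Fixed random reservoir, rescaled to spectral radius 0.9 (echo state property)
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)[:, None]
X = run_reservoir(u[:-1])[washout:]   # reservoir states after washout
y = u[1:, 0][washout:]                # next-step targets

# Only the linear readout is trained, here by ridge regression
reg = 1e-8
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)
rmse = np.sqrt(np.mean((X @ W_out - y) ** 2))
print(rmse)
```

The recurrent weights `W` are never touched by training; the only learned parameters are the readout `W_out`, which is what makes the approach cheap compared with full RNN training.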


Reservoir Computing and Self-Organized Neural Hierarchies
TLDR
This thesis overviews existing, and investigates new, alternatives to the classical supervised training of RNNs and their hierarchies; it proposes and investigates the use of two different neural network models for the reservoirs, together with several unsupervised adaptation techniques, as well as deep hierarchies of such models trained unsupervised, layer-wise.
Recent Advances in Physical Reservoir Computing: A Review
Regular echo state networks: simple and accurate reservoir models to real-world applications
TLDR
The results revealed that some problems can benefit considerably from some level of organization in the reservoir, such as that provided by regular or small-world network models, and that the non-linear support vector machine classifier achieved the best predictive performance, although it was statistically comparable with the k-nearest neighbors classifier, which has much smaller time complexity.
Evolving reservoir weights in the frequency domain
TLDR
This work introduces an evolutionary method for adjusting the reservoir non-null weights, called EvoESN (Evolutionary ESN), which combines an evolutionary search in the Fourier space with supervised learning for the readout weights.
Simple Deterministically Constructed Recurrent Neural Networks
TLDR
It is shown that a very simple, deterministically constructed reservoir with a simple cycle topology gives performance comparable to that of the Echo State Network on a number of time series benchmarks, and it is argued that the memory capacity of such a model can be made arbitrarily close to the proven theoretical limit.
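The cycle topology described above can be written down in a few lines. In this sketch the values of `r` and `v` are illustrative, and the alternating input-sign pattern is a simplification (the cited work generates the signs deterministically from an aperiodic sequence):

```python
import numpy as np

# Illustrative parameter values; the cited paper derives suitable ranges.
n_res, r, v = 100, 0.9, 0.5

# All recurrent weights share the single value r, arranged in one ring.
W = np.zeros((n_res, n_res))
for i in range(n_res):
    W[(i + 1) % n_res, i] = r        # unit i feeds unit i + 1 around the cycle

# Input weights share one magnitude v; here with an alternating sign pattern.
signs = np.where(np.arange(n_res) % 2 == 0, 1.0, -1.0)
W_in = (v * signs)[:, None]

# The spectral radius of a pure cycle with weight r is exactly r.
print(np.max(np.abs(np.linalg.eigvals(W))))
```

Because the whole reservoir is specified by just two scalars, no random search over reservoir instances is needed, which is the point of the deterministic construction.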
Reservoir Computing Trends
TLDR
A brief introduction to basic concepts, methods, insights, current developments, and some applications of RC is given.
Multilayered Echo State Machine: A Novel Architecture and Algorithm
TLDR
The addition of multiple layers of reservoirs is shown to provide a more robust alternative to conventional RC networks, and the comparative merits of this approach are demonstrated in a number of applications.
Evolutionary strategy for simultaneous optimization of parameters, topology and reservoir weights in Echo State Networks
TLDR
This paper presents an original investigation of an evolutionary method for simultaneous optimization of parameters, topology and reservoir weights in Echo State Networks.
An approach to reservoir computing design and training
Recurrent Kernel Machines: Computing with Infinite Echo State Networks
TLDR
The concept of ESNs is extended to infinite-sized recurrent neural networks, which can be considered recursive kernels that subsequently can be used to create recursive support vector machines.

References

Showing 1–10 of 218 references
Overview of Reservoir Recipes
TLDR
This report motivates the new definition of the paradigm and surveys the reservoir generation/adaptation techniques, offering a natural conceptual classification which transcends boundaries of the current "brand-names" of reservoir methods.
Echo State Networks with Trained Feedbacks
TLDR
This report explores possible directions in which the theoretical findings could be applied to increase the computational power of Echo State Networks and proposes a modification of ESNs called Layered ESNs.
An experimental unification of reservoir computing methods
An overview of reservoir computing: theory, applications and implementations
TLDR
This tutorial will give an overview of current research on the theory, applications, and implementations of Reservoir Computing, which makes it possible to solve complex tasks using just linear post-processing techniques.
Pruning and Regularisation in Reservoir Computing: a First Insight
TLDR
This work proposes to study how pruning some connections from the reservoir to the readout can help to increase the generalisation ability, in much the same way as regularisation techniques do, and to improve the implementability of reservoirs in hardware.
Feed-forward echo state networks
  • M. Čerňanský, M. Makula
  • Proceedings of the 2005 IEEE International Joint Conference on Neural Networks, 2005
TLDR
This work proposes a modified ESN architecture in which the only "true" recurrent connections are backward connections from the output to the recurrent units, and the reservoir is built only from "forwardly" connected recurrent units.
Training Recurrent Networks by Evolino
TLDR
It is shown that Evolino-based LSTM can solve tasks that Echo State networks cannot, and that it achieves higher accuracy in certain continuous function generation tasks than conventional gradient-descent RNNs, including gradient-based LSTM.
Echo State Networks and Self-Prediction
TLDR
Preliminary results indicate that self-prediction may improve the performance of an ESN when performing signal mappings in the presence of additive noise.
Improving the Prediction Accuracy of Echo State Neural Networks by Anti-Oja's Learning
TLDR
This adaptation of Echo State neural networks was optimized by updating the weights of the dynamic reservoir with Anti-Oja's learning, which resulted in a prediction error substantially smaller than that achieved by the standard algorithm.
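Anti-Oja's learning is Oja's Hebbian rule applied with a negated learning rate, so correlated pre- and postsynaptic activity weakens a connection rather than strengthening it. A toy single-unit sketch, where the learning rate, the random input stream, and the tanh activation are illustrative assumptions rather than details from the cited paper:

```python
import numpy as np

rng = np.random.default_rng(1)
eta = 0.01                 # illustrative learning rate
w = rng.normal(size=5)     # incoming weights of one reservoir unit

for _ in range(200):
    x = rng.normal(size=5)           # presynaptic activations (random toy data)
    y = float(np.tanh(w @ x))        # postsynaptic activation
    # Oja's rule would be  w += eta * y * (x - y * w); Anti-Oja negates it,
    # so correlated activity weakens the connection instead of strengthening it.
    w -= eta * y * (x - y * w)

print(np.linalg.norm(w))
```

In the cited setting the same update would be applied across the reservoir's recurrent weights, with the aim of decorrelating unit activities.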