Reservoir Computing Trends
@article{Lukoeviius2012ReservoirCT,
  title   = {Reservoir Computing Trends},
  author  = {Mantas Luko{\vs}evi{\vc}ius and Herbert Jaeger and Benjamin Schrauwen},
  journal = {KI - K{\"u}nstliche Intelligenz},
  year    = {2012},
  volume  = {26},
  pages   = {365--371}
}
Reservoir Computing (RC) is a paradigm of understanding and training Recurrent Neural Networks (RNNs) based on treating the recurrent part (the reservoir) differently than the readouts from it. It started ten years ago and is currently a prolific research area, giving important insights into RNNs, practical machine learning tools, as well as enabling computation with non-conventional hardware. Here we give a brief introduction into basic concepts, methods, insights, current developments, and…
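To make that division of labor concrete, here is a minimal Echo State Network sketch in Python/NumPy: the reservoir weights are drawn randomly and left untrained, and only the linear readout is fit by ridge regression. The sizes, spectral radius, ridge constant, and toy task are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and constants -- assumptions, not values from the paper.
n_in, n_res = 1, 100
spectral_radius, ridge = 0.9, 1e-6

# The reservoir: fixed random weights, never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # echo state scaling

def run_reservoir(U):
    """Drive the reservoir with inputs U (T x n_in); return states (T x n_res)."""
    x = np.zeros(n_res)
    states = []
    for u in U:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(0, 60, 0.1)
U, Y = np.sin(t).reshape(-1, 1), np.sin(t + 0.1).reshape(-1, 1)

X, Y = run_reservoir(U)[50:], Y[50:]          # discard a washout period

# The readout: the only trained part, fit in closed form by ridge regression.
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
print("train MSE:", np.mean((X @ W_out - Y) ** 2))
```

Because everything recurrent stays fixed, training reduces to a single least-squares problem, which is the practical appeal of the paradigm.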
276 Citations
Frontiers in Reservoir Computing
- Computer Science, ESANN
- 2020
An overview of the RC research field is given, highlighting major frontiers in its development and introducing the papers contributed to the ESANN 2020 special session.
A Practical Guide to Applying Echo State Networks
- Computer Science, Neural Networks: Tricks of the Trade
- 2012
Practical techniques and recommendations for successfully applying Echo State Networks, as well as some more advanced application-specific modifications, are presented.
Recent Advances in Physical Reservoir Computing: A Review
- Computer Science, Neural Networks
- 2019
Product reservoir computing: Time-series computation with multiplicative neurons
- Computer Science, 2015 International Joint Conference on Neural Networks (IJCNN)
- 2015
This study introduces an RC architecture with a reservoir of product nodes for time-series computation and finds that the product RC shows many properties of a standard ESN, such as short-term memory and nonlinear capacity.
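As a rough illustration of what a multiplicative reservoir unit might look like, the sketch below contrasts a standard additive tanh unit with a log-linear product unit (each unit multiplies its positive inputs raised to the power of its weights). This formulation, and all constants, are assumptions for illustration, not necessarily the paper's exact update rule.

```python
import numpy as np

rng = np.random.default_rng(1)
n_res = 50
W = rng.uniform(-0.1, 0.1, (n_res, n_res))   # small weights keep states bounded
w_in = rng.uniform(-0.1, 0.1, n_res)

def additive_step(x, u):
    # Standard ESN unit: tanh of a weighted sum.
    return np.tanh(W @ x + w_in * u)

def product_step(x, u, eps=1e-9):
    # Assumed log-linear product unit (illustrative, not the paper's exact rule):
    #   x_i <- prod_j z_j ** W_ij = exp(sum_j W_ij * log z_j)
    z = np.concatenate(([u], x))
    logs = np.log(np.clip(z, eps, None))     # states stay positive; clip guards log
    W_all = np.column_stack((w_in, W))       # weights for [input, states]
    return np.exp(W_all @ logs)

# Drive both variants with a positive input sequence.
u_seq = 0.5 + 0.4 * np.sin(np.arange(200) * 0.1)
x_add, x_mul = np.zeros(n_res), np.ones(n_res)   # product states start positive
for u in u_seq:
    x_add, x_mul = additive_step(x_add, u), product_step(x_mul, u)
print(x_add[:3], x_mul[:3])
```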
A Primer on Reservoir Computing
- Computer Science
- 2016
This paper is a brief primer on concepts in reservoir computing (RC), a class of artificial neural network models that mimic neural microcircuits in the biological brain using an untrained reservoir of neurons and a trained readout function.
A Comparative Study of Reservoir Computing for Temporal Signal Processing
- Computer Science, ArXiv
- 2014
It is found that the role of the reservoir in the reservoir computing paradigm goes beyond providing a memory of past inputs; the delay line (DL) and the NARX network have higher memorization capability but fall short of the generalization power of the ESN.
Evolving Functionally Equivalent Reservoirs for RBN Reservoir Computing Systems
- Computer Science
- 2015
This paper investigates the dynamics, performance, and viability of Random Boolean Networks (RBNs) used for Reservoir Computing (RRC), and finds a one-to-many mapping between the readout layer in an already-trained RRC system and different RBN reservoirs.
Low-cost hardware implementation of Reservoir Computers
- Computer Science, 2014 24th International Workshop on Power and Timing Modeling, Optimization and Simulation (PATMOS)
- 2014
This work reviews the basic principles of stochastic logic and its application to the hardware implementation of Neural Networks and focuses on the implementation of the recently introduced Reservoir Computer architecture.
Exploring Physical Reservoir Computing using Random Boolean Networks.
- Geology
- 2016
Simulation of RBN RC systems can aid in deciding the optimal size of physical reservoirs, provided a bridge between the computational power of the physical reservoir and that of RBNs can be established.
Stochastic-Based Implementation of Reservoir Computers
- Computer Science, IWANN
- 2015
This work presents an efficient approach to implementing Reservoir Computing in hardware: probabilistic (stochastic) logic reduces the area required for the arithmetic operations in the neural network, while conventional binary logic implements the nonlinear activation function.
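For readers unfamiliar with stochastic logic, the core idea is that a value p in [0, 1] is encoded as a random bitstream whose bits are 1 with probability p; multiplying two values then reduces to a single AND gate on two independent streams. A minimal simulation of that encoding (stream length and names are illustrative; this shows the general technique, not the paper's specific circuit):

```python
import numpy as np

rng = np.random.default_rng(2)

def to_stream(p, n_bits=4096):
    """Encode p in [0, 1] as a stochastic bitstream: P(bit = 1) = p."""
    return rng.random(n_bits) < p

def from_stream(bits):
    """Decode a bitstream back to a value: the fraction of 1s."""
    return bits.mean()

a, b = 0.6, 0.3
# Multiplying two independent unipolar streams is just a bitwise AND,
# so a hardware multiplier shrinks to a single logic gate.
prod = from_stream(to_stream(a) & to_stream(b))
print(f"{a} * {b} = {a * b:.3f}, stochastic estimate ~ {prod:.3f}")
```

The trade-off is precision for area: accuracy grows only with stream length, which is why such designs pair stochastic arithmetic with conventional logic where precision matters.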
References
Showing 1-10 of 66 references
Reservoir computing approaches to recurrent neural network training
- Computer Science, Comput. Sci. Rev.
- 2009
Reservoir Computing and Self-Organized Neural Hierarchies
- Computer Science
- 2012
This thesis overviews existing alternatives to the classical supervised training of RNNs and their hierarchies and investigates new ones, proposing the use of two different neural network models for the reservoirs together with several unsupervised adaptation techniques, as well as unsupervisedly layer-wise trained deep hierarchies of such models.
Recurrent Kernel Machines: Computing with Infinite Echo State Networks
- Computer Science, Neural Computation
- 2012
The concept of ESNs is extended to infinite-sized recurrent neural networks, which can be considered recursive kernels that subsequently can be used to create recursive support vector machines.
Optoelectronic Reservoir Computing
- Computer Science, Scientific Reports
- 2012
This work reports an optoelectronic implementation of reservoir computing, based on a recently proposed architecture consisting of a single nonlinear node and a delay line, that is sufficiently fast for real-time information processing.
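The single-node-plus-delay-line idea has a compact software analogue: one nonlinear node whose output is fed back after a delay, with the delayed samples acting as "virtual nodes" that are read out like ESN states. A much-simplified discrete sketch, ignoring the analog dynamics of the real device (all constants illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 50                              # virtual nodes along the delay line
mask = rng.choice([-0.5, 0.5], N)   # fixed random input mask
eta, gamma = 0.8, 0.3               # feedback and input scaling (illustrative)

def delay_line_reservoir(u_seq):
    """Single tanh node with delayed feedback, discretized as a map:
    x(t) = tanh(eta * x(t - tau) + gamma * mask * u). The length-N delay
    line holds the 'virtual node' states that together form the reservoir."""
    buf = np.zeros(N)
    states = []
    for u in u_seq:
        for i in range(N):
            new = np.tanh(eta * buf[0] + gamma * mask[i] * u)
            buf = np.roll(buf, -1)
            buf[-1] = new
        states.append(buf.copy())
    return np.array(states)          # shape (T, N), used like ESN states

X = delay_line_reservoir(np.sin(np.arange(300) * 0.1))
print(X.shape)
```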
Learning long-term dependencies with gradient descent is difficult
- Computer Science, IEEE Trans. Neural Networks
- 1994
This work shows why gradient-based learning algorithms face an increasingly difficult problem as the duration of the dependencies to be captured increases, and exposes a trade-off between efficient learning by gradient descent and latching onto information for long periods.
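The difficulty can be seen numerically: the gradient reaching a state t steps back is a product of t Jacobians, so its norm typically shrinks (or blows up) exponentially with t. A small sketch illustrating this for a random tanh recurrence (sizes and scaling are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
W = rng.normal(0, 1.0 / np.sqrt(n), (n, n)) * 0.8   # contractive recurrent weights

x = rng.normal(size=n)
J = np.eye(n)                     # accumulated Jacobian d x_t / d x_0
for t in range(1, 51):
    x = np.tanh(W @ x)
    # Jacobian of one step: diag(tanh'(pre)) @ W, with tanh' = 1 - tanh^2
    J = np.diag(1.0 - x**2) @ W @ J
    if t % 10 == 0:
        print(f"t = {t:2d}, ||d x_t / d x_0|| ~ {np.linalg.norm(J):.2e}")
```

The printed norms decay by orders of magnitude, which is exactly why error signals from long-range dependencies vanish before they can shape the recurrent weights.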
Analyzing the weight dynamics of recurrent learning algorithms
- Computer Science, Neurocomputing
- 2005
A Learning Algorithm for Continually Running Fully Recurrent Neural Networks
- Computer Science, Neural Computation
- 1989
The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal…
New results on recurrent network training: unifying the algorithms and accelerating convergence
- Computer Science, IEEE Trans. Neural Networks Learn. Syst.
- 2000
An on-line version of the proposed algorithm, which is based on approximating the error gradient, has lower computational complexity in computing the weight update than the competing techniques for most typical problems and reaches the error minimum in a much smaller number of iterations.
Learning Recurrent Neural Networks with Hessian-Free Optimization
- Computer Science, ICML
- 2011
This work solves the long-outstanding problem of how to effectively train recurrent neural networks on complex and difficult sequence modeling problems which may contain long-term data dependencies and offers a new interpretation of the generalized Gauss-Newton matrix of Schraudolph which is used within the HF approach of Martens.