Reservoir Computing Trends

@article{Lukosevicius2012ReservoirCT,
  title={Reservoir Computing Trends},
  author={Mantas Luko{\vs}evi{\vc}ius and Herbert Jaeger and Benjamin Schrauwen},
  journal={KI - K{\"u}nstliche Intelligenz},
  year={2012},
  volume={26},
  pages={365--371}
}
Reservoir Computing (RC) is a paradigm for understanding and training Recurrent Neural Networks (RNNs) based on treating the recurrent part (the reservoir) differently from the readouts from it. It started ten years ago and is currently a prolific research area, giving important insights into RNNs, providing practical machine learning tools, and enabling computation with non-conventional hardware. Here we give a brief introduction to basic concepts, methods, insights, current developments, and…
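To make the core idea concrete, the following is a minimal echo state network sketch in Python/NumPy: the recurrent weights are generated randomly and left untrained, and only a linear readout is fitted by ridge regression. The reservoir size, spectral radius, washout length, and regularization strength are illustrative assumptions, not values from the paper.

```python
# Minimal echo state network (ESN): a fixed random reservoir whose
# recurrent weights are never trained; only the linear readout is
# fitted by ridge regression. Hyperparameter values are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 200, 1

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))      # fixed input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))        # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # set spectral radius to 0.9

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(0.2 * np.arange(1000))
X, y = run_reservoir(u[:-1]), u[1:]
X, y = X[100:], y[100:]                           # discard initial washout

reg = 1e-6                                        # ridge regularization strength
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```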
Frontiers in Reservoir Computing
TLDR
An overview of the RC research field is given, highlighting major frontiers in its development and finally introducing the contributed papers to the ESANN 2020 special session.
A Practical Guide to Applying Echo State Networks
TLDR
Practical techniques and recommendations for successfully applying Echo State Networks, as well as some more advanced, application-specific modifications, are presented.
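Among the guide's central recommendations are a leaky-integrator state update and discarding an initial washout period. A hedged sketch of the leaky update rule (the leaking-rate value here is an illustrative assumption):

```python
import numpy as np

def leaky_esn_step(x, u, W, W_in, alpha=0.3):
    """One leaky-integrator ESN update.

    alpha is the leaking rate in (0, 1]; alpha = 1 recovers the plain
    tanh update. The value 0.3 is an illustrative assumption.
    """
    x_new = np.tanh(W @ x + W_in @ u)
    return (1.0 - alpha) * x + alpha * x_new
```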
Recent Advances in Physical Reservoir Computing: A Review
Product reservoir computing: Time-series computation with multiplicative neurons
TLDR
This study introduces an RC architecture with a reservoir of product nodes for time series computation and finds that the product RC shows many properties of a standard ESN, such as short-term memory and nonlinear capacity.
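One plausible way to realize such multiplicative neurons, sketched below, is to compute each node's activation as the product of its (positive) inputs raised to the connection weights, which is a linear map in log space. This is an assumed formulation for illustration and may differ from the paper's exact equations.

```python
# Hypothetical product-node reservoir: each node multiplies its inputs
# raised to the connection weights, i.e. exp(W @ log x). An assumed
# formulation, not necessarily the paper's.
import numpy as np

rng = np.random.default_rng(1)
n_res, n_in = 50, 1
W = rng.uniform(-0.3, 0.3, (n_res, n_res))
W_in = rng.uniform(-0.3, 0.3, (n_res, n_in))

def product_step(x, u, eps=1e-6):
    """One product-node update, squashed back into (0, 1)."""
    z = np.clip(np.concatenate([x, u]), eps, 1.0)   # keep arguments positive
    log_act = np.hstack([W, W_in]) @ np.log(z)      # products become sums in log space
    return 1.0 / (1.0 + np.exp(-log_act))           # squash to (0, 1)

x = np.full(n_res, 0.5)
for u_t in np.abs(np.sin(0.2 * np.arange(100))):
    x = product_step(x, np.array([u_t]))
```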
A Primer on Reservoir Computing
TLDR
This paper is a brief primer on concepts in reservoir computing (RC), a class of artificial neural network models that mimic neural microcircuits in the biological brain using an untrained reservoir of neurons and a trained readout function.
A Comparative Study of Reservoir Computing for Temporal Signal Processing
TLDR
It is found that the role of the reservoir in the reservoir computing paradigm goes beyond providing a memory of past inputs; the delay line (DL) and the NARX network have higher memorization capability, but fall short of the generalization power of the ESN.
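The short-term memory this comparison probes is commonly quantified by Jaeger's memory capacity: train one ridge readout per delay k to reconstruct u(t − k) from the state, and sum the squared correlations. A sketch, assuming a state matrix X collected by driving a reservoir as in the ESN example above:

```python
import numpy as np

def memory_capacity(X, u, max_delay=40, reg=1e-6):
    """Estimate short-term memory capacity: for each delay k, fit a
    ridge readout to reconstruct u(t - k) from the state x(t), then
    sum the squared correlations between target and reconstruction."""
    n = X.shape[1]
    mc = 0.0
    for k in range(1, max_delay + 1):
        Xa, ya = X[k:], u[:-k]                  # align states with delayed input
        w = np.linalg.solve(Xa.T @ Xa + reg * np.eye(n), Xa.T @ ya)
        mc += np.corrcoef(Xa @ w, ya)[0, 1] ** 2
    return mc
```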
Evolving Functionally Equivalent Reservoirs for RBN Reservoir Computing Systems
TLDR
This paper investigates the dynamics, performance, and viability of Random Boolean Networks (RBNs) used for Reservoir Computing (RRC), and finds a one-to-many mapping between the readout layer of an already-trained RRC system and different RBN reservoirs.
Low-cost hardware implementation of Reservoir Computers
TLDR
This work reviews the basic principles of stochastic logic and its application to the hardware implementation of neural networks, focusing on the implementation of the recently introduced Reservoir Computer architecture.
Exploring Physical Reservoir Computing using Random Boolean Networks
TLDR
Simulation of RBN RC systems can aid in deciding the optimal size of a physical reservoir, provided a bridge between the computational power of the physical reservoir and that of RBNs can be established.
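To make the RBN-reservoir idea from the two entries above concrete, here is a hedged sketch: each node updates synchronously through a random truth table over K randomly chosen neighbors, the input bit perturbs one designated node, and the 0/1 state vectors serve as reservoir states for a linear readout. Network size, in-degree, and the single-node input coupling are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N, K = 100, 2                                   # nodes and in-degree (illustrative)
neighbors = rng.integers(0, N, (N, K))          # K random inputs per node
tables = rng.integers(0, 2, (N, 2 ** K))        # random truth table per node

def rbn_step(state, bit):
    """One synchronous RBN update; the input bit overwrites node 0."""
    idx = np.zeros(N, dtype=int)
    for j in range(K):                          # pack neighbor bits into a table index
        idx = (idx << 1) | state[neighbors[:, j]]
    new = tables[np.arange(N), idx]
    new[0] = bit                                # input perturbation
    return new

state = rng.integers(0, 2, N)
u = rng.integers(0, 2, 500)
states = []
for b in u:
    state = rbn_step(state, b)
    states.append(state.copy())
X = np.array(states, dtype=float)               # train a linear readout on X as usual
```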
Stochastic-Based Implementation of Reservoir Computers
TLDR
This work presents an efficient approach to implementing Reservoir Computing: probabilistic logic is employed to reduce the hardware area required for the arithmetic operations in neural networks, while conventional binary logic implements the nonlinear activation function.
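The area saving rests on stochastic logic's encoding of a number p in [0, 1] as the probability that a bit in a random stream is 1, so that multiplying two independently encoded numbers reduces to a bitwise AND of their streams. A minimal sketch of that principle:

```python
import numpy as np

rng = np.random.default_rng(3)

def to_stream(p, n_bits=10000):
    """Encode p in [0, 1] as a random bitstream with P(bit = 1) = p."""
    return rng.random(n_bits) < p

a, b = 0.6, 0.3
prod_stream = to_stream(a) & to_stream(b)       # multiplication = bitwise AND
print(prod_stream.mean())                       # ~0.18 = 0.6 * 0.3
```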
...

References

Showing 1–10 of 66 references
Reservoir computing approaches to recurrent neural network training
Reservoir Computing and Self-Organized Neural Hierarchies
TLDR
This thesis overviews existing alternatives to the classical supervised training of RNNs and their hierarchies and investigates new ones: it proposes and studies two different neural network models for the reservoirs together with several unsupervised adaptation techniques, as well as deep hierarchies of such models trained unsupervised, layer by layer.
An experimental unification of reservoir computing methods
Recurrent Kernel Machines: Computing with Infinite Echo State Networks
TLDR
The concept of ESNs is extended to infinite-sized recurrent neural networks, which can be considered recursive kernels that can subsequently be used to create recursive support vector machines.
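The recursive-kernel view can be illustrated without the paper's closed-form recursion: for a random ESN, the kernel between two input sequences is the expected inner product of the final reservoir states, which Monte Carlo over random reservoirs approximates. A sketch under assumed weight scalings (the paper instead derives analytic recursions in the infinite-width limit):

```python
import numpy as np

def recurrent_kernel_mc(u1, u2, n_res=500, n_samples=20, seed=0):
    """Monte Carlo estimate of the recurrent kernel k(u1, u2): the
    expected inner product of final reservoir states over random
    reservoirs, approximating the infinite-width limit."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_samples):
        W = rng.normal(0, 0.9 / np.sqrt(n_res), (n_res, n_res))  # radius ~0.9
        w_in = rng.normal(0, 0.5, n_res)
        x1 = x2 = np.zeros(n_res)
        for a, b in zip(u1, u2):
            x1 = np.tanh(W @ x1 + w_in * a)
            x2 = np.tanh(W @ x2 + w_in * b)
        total += (x1 @ x2) / n_res
    return total / n_samples
```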
Optoelectronic Reservoir Computing
TLDR
This work reports an optoelectronic implementation of reservoir computing, based on a recently proposed architecture consisting of a single nonlinear node and a delay line, that is sufficiently fast for real-time information processing.
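This architecture time-multiplexes one nonlinearity into many "virtual nodes": each input sample is held for one delay period, multiplied by a fixed mask, and fed through the node whose output circulates in the delay line. The discrete-time sketch below couples each virtual node only to itself one period earlier, a simplifying assumption; hardware implementations also couple adjacent virtual nodes through the node's finite response time.

```python
import numpy as np

rng = np.random.default_rng(4)
n_virtual = 50                                  # virtual nodes per delay period
mask = rng.choice([-0.1, 0.1], n_virtual)       # fixed input mask
feedback, scale = 0.8, 1.0                      # illustrative parameters

def delay_reservoir(u):
    """Single nonlinear node + delay line: each input sample is held
    for one delay period, masked per virtual node, and mixed with the
    same virtual node's value one period earlier."""
    delay = np.zeros(n_virtual)
    states = []
    for u_t in u:
        for i in range(n_virtual):
            delay[i] = np.tanh(feedback * delay[i] + scale * mask[i] * u_t)
        states.append(delay.copy())
    return np.array(states)                     # rows usable as reservoir states

X = delay_reservoir(np.sin(0.2 * np.arange(300)))
```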
Learning long-term dependencies with gradient descent is difficult
TLDR
This work shows why gradient-based learning algorithms face an increasingly difficult problem as the duration of the dependencies to be captured increases, and exposes a trade-off between efficient learning by gradient descent and latching onto information for long periods.
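The difficulty has a compact numerical signature: the backpropagated gradient through an RNN is a product of per-step Jacobians, whose norm typically shrinks (or blows up) exponentially with the number of steps. A small demonstration with a contractive tanh network:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50
W = rng.normal(0, 0.9 / np.sqrt(n), (n, n))     # spectral radius ~0.9
x = rng.normal(0, 1, n)

grad = np.eye(n)                                # accumulates d x_T / d x_0
for t in range(100):
    x = np.tanh(W @ x)
    J = (1 - x ** 2)[:, None] * W               # Jacobian of one tanh step
    grad = J @ grad
    if t % 20 == 19:
        print(f"step {t + 1}: ||dx_T/dx_0|| = {np.linalg.norm(grad):.2e}")
```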
Analyzing the weight dynamics of recurrent learning algorithms
A Learning Algorithm for Continually Running Fully Recurrent Neural Networks
The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks.
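Williams and Zipser's algorithm, real-time recurrent learning (RTRL), carries the exact sensitivity of every state unit to every recurrent weight forward in time, which makes it fully online at O(n^4) cost per step. A hedged sketch for a small tanh network on a next-step prediction task (sizes and learning rate are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
n, lr = 10, 0.01
W = rng.normal(0, 0.5 / np.sqrt(n), (n, n))
w_in = rng.normal(0, 0.5, n)
w_out = np.zeros(n)

x = np.zeros(n)
P = np.zeros((n, n, n))                         # P[i, j, k] = d x_i / d W_jk

u = np.sin(0.2 * np.arange(500))
for t in range(len(u) - 1):
    x_new = np.tanh(W @ x + w_in * u[t])
    d = 1 - x_new ** 2                          # tanh'(pre-activation)
    # Sensitivity recursion:
    # dx_i/dW_jk = d_i * (delta_ij * x_k + sum_l W_il * P[l, j, k])
    P_new = np.einsum("il,ljk->ijk", W, P)
    P_new[np.arange(n), np.arange(n), :] += x   # direct dependence of pre_j on W_jk
    P = d[:, None, None] * P_new
    x = x_new
    # Online updates (standard RTRL ignores that P is stale after the W step)
    err = (w_out @ x) - u[t + 1]
    w_out -= lr * err * x
    W -= lr * err * np.einsum("i,ijk->jk", w_out, P)
```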
New results on recurrent network training: unifying the algorithms and accelerating convergence
TLDR
An on-line version of the proposed algorithm, based on approximating the error gradient, has lower computational complexity per weight update than competing techniques for most typical problems and reaches the error minimum in far fewer iterations.
Learning Recurrent Neural Networks with Hessian-Free Optimization
TLDR
This work solves the long-outstanding problem of how to effectively train recurrent neural networks on complex sequence modeling problems that may contain long-term data dependencies, and offers a new interpretation of Schraudolph's generalized Gauss–Newton matrix as used within Martens' HF approach.
...