Corpus ID: 219636027

Reservoir Computing meets Recurrent Kernels and Structured Transforms

@article{Dong2020ReservoirCM,
  title={Reservoir Computing meets Recurrent Kernels and Structured Transforms},
  author={Jonathan Dong and Ruben Ohana and Mushegh Rafayelyan and Florent Krzakala},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.07310}
}
Reservoir Computing is a class of simple yet efficient Recurrent Neural Networks where internal weights are fixed at random and only a linear output layer is trained. In the large size limit, such random neural networks have a deep connection with kernel methods. Our contributions are threefold: a) We rigorously establish the recurrent kernel limit of Reservoir Computing and prove its convergence. b) We test our models on chaotic time series prediction, a classic but challenging benchmark in…
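As a concrete illustration of the setup the abstract describes, here is a minimal echo-state-network sketch: recurrent and input weights are drawn once and frozen, and only a linear readout is trained by ridge regression. All names, sizes, and scalings below are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 500, 1  # reservoir size, input dimension (illustrative)

# Fixed random weights: drawn once, never trained
W_res = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))
W_in = rng.normal(size=(N, d))

def run_reservoir(inputs):
    # Iterate x_{t+1} = tanh(W_res x_t + W_in u_t) and collect the states
    x = np.zeros(N)
    states = []
    for u in inputs:
        x = np.tanh(W_res @ x + W_in @ u)
        states.append(x)
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    # Ridge regression: the readout is the only trained component
    A = states.T @ states + ridge * np.eye(states.shape[1])
    return np.linalg.solve(A, states.T @ targets)

# Toy usage: one-step-ahead prediction of a sine wave
series = np.sin(0.1 * np.arange(1000))[:, None]
states = run_reservoir(series[:-1])
W_out = train_readout(states, series[1:])
predictions = states @ W_out
```

As the title suggests, the paper's Structured Reservoir Computing replaces the dense N×N recurrent multiplication with fast structured transforms, cutting the per-step cost from O(N²) to O(N log N); the sketch keeps the dense version for clarity.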
Citations

Population Codes Enable Learning from Few Examples By Shaping Inductive Bias
TLDR: This study considers biologically plausible readout of arbitrary stimulus-response maps from arbitrary population codes and develops an analytical theory that predicts the generalization error of the readout as a function of the number of examples, suggesting sample-efficient learning as a general normative coding principle.
Unsupervised Reservoir Computing for Solving Ordinary Differential Equations
There is a wave of interest in using unsupervised neural networks for solving differential equations. The existing methods are based on feed-forward networks, while recurrent neural network…

References

Showing 1–10 of 48 references
Recurrent Kernel Machines: Computing with Infinite Echo State Networks
TLDR: The concept of ESNs is extended to infinite-sized recurrent neural networks, which can be considered recursive kernels that can subsequently be used to create recursive support vector machines.
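The recursion such an infinite-width reservoir converges to can be written as a scalar update on the kernel value itself. Below is a sketch under stated assumptions: RBF random features, unit weight variances, and unit-norm states (so the variance hyperparameters of the actual models are omitted); the function name is illustrative.

```python
import numpy as np

def rbf_recurrent_kernel(u_a, u_b, k0=1.0):
    # Deterministic recursion that the kernel of an infinite
    # RBF-random-feature reservoir converges to: with unit-norm states,
    # the squared distance between the concatenated [state; input]
    # vectors is 2 - 2 k_t + |u_a[t] - u_b[t]|^2.
    k = k0  # kernel between the two initial states (assumed unit-norm)
    for t in range(len(u_a)):
        sq_dist = 2.0 - 2.0 * k + np.sum((u_a[t] - u_b[t]) ** 2)
        k = np.exp(-0.5 * sq_dist)
    return k
```

Note that each pair of input sequences costs O(T) regardless of reservoir size, which is the computational shortcut the recurrent-kernel limit offers over simulating a large reservoir explicitly.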
Recent Advances in Physical Reservoir Computing: A Review
TLDR: An overview of recent advances in physical reservoir computing, classified by the type of reservoir, aimed at expanding its practical applications and developing next-generation machine learning systems.
Forecasting of Spatio-temporal Chaotic Dynamics with Recurrent Neural Networks: a comparative study of Reservoir Computing and Backpropagation Algorithms
TLDR: This study confirms that RNNs present a potent computational framework for the forecasting of complex spatio-temporal dynamics.
An experimental unification of reservoir computing methods
TLDR: Three different uses of a recurrent neural network as a reservoir that is not trained but instead read out by a simple external classification layer are compared, and a new measure of the reservoir dynamics based on Lyapunov exponents is introduced.
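A Lyapunov-based measure of reservoir dynamics can be estimated numerically by tracking how fast two nearby trajectories separate. This is a generic Benettin-style sketch, not the paper's specific measure; the reservoir step function is supplied by the caller.

```python
import numpy as np

def largest_lyapunov(step, x0, n_steps=1000, eps=1e-8, seed=0):
    # Follow a trajectory and an eps-perturbed copy, renormalizing the
    # separation after every step; the average log growth rate estimates
    # the largest Lyapunov exponent of the map x_{t+1} = step(x_t).
    rng = np.random.default_rng(seed)
    x = x0.copy()
    d = rng.normal(size=x0.shape)
    d *= eps / np.linalg.norm(d)
    log_growth = 0.0
    for _ in range(n_steps):
        x_next = step(x)
        delta = step(x + d) - x_next
        dist = np.linalg.norm(delta)
        log_growth += np.log(dist / eps)
        x, d = x_next, delta * (eps / dist)
    return log_growth / n_steps
```

For a fixed reservoir, `step` could be `lambda x: np.tanh(W @ x)`; a negative estimate indicates contracting (echo-state) dynamics, while a positive one indicates chaos.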
Unitary Evolution Recurrent Neural Networks
TLDR: This work constructs an expressive unitary weight matrix by composing several structured matrices that act as building blocks with parameters to be learned, and demonstrates the potential of this architecture by achieving state-of-the-art results on several hard tasks involving very long-term dependencies.
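The composition of structured building blocks can be sketched directly: diagonal phase matrices, Householder reflections, a permutation, and the unitary FFT are each unitary, so their product is too. This follows the factorization reported for unitary evolution RNNs, but the random (rather than learned) parameters below are an assumption for illustration.

```python
import numpy as np

def make_unitary_op(n, seed=0):
    # Compose W = D3 R2 F^{-1} D2 Pi R1 F D1 from structured unitary
    # factors. Applying W costs O(n log n) per step instead of O(n^2)
    # for a dense matrix, and never materializes an n x n array.
    rng = np.random.default_rng(seed)
    d1, d2, d3 = (np.exp(1j * rng.uniform(-np.pi, np.pi, n)) for _ in range(3))
    v1, v2 = (rng.normal(size=n) + 1j * rng.normal(size=n) for _ in range(2))
    v1, v2 = v1 / np.linalg.norm(v1), v2 / np.linalg.norm(v2)
    perm = rng.permutation(n)

    def reflect(v, x):
        # Householder reflection R x = x - 2 v (v* x), unitary since |v| = 1
        return x - 2.0 * v * (np.conj(v) @ x)

    def apply(x):
        x = d1 * x
        x = np.fft.fft(x, norm="ortho")   # unitary FFT
        x = reflect(v1, x)
        x = x[perm]
        x = d2 * x
        x = np.fft.ifft(x, norm="ortho")  # unitary inverse FFT
        x = reflect(v2, x)
        return d3 * x

    return apply
```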
Reservoir Computing with Untrained Convolutional Neural Networks for Image Recognition
TLDR: This work uses an untrained convolutional neural network to transform raw image data into a set of smaller feature maps as a preprocessing step for reservoir computing, and demonstrates that the method achieves high classification accuracy on an image recognition task with far fewer trainable parameters.
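The preprocessing idea can be sketched as a single random convolutional layer: fixed random filters, a ReLU, and stride-based downsampling produce smaller feature maps for the reservoir. Filter count, kernel size, and stride below are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def random_conv_features(image, n_filters=8, k=3, stride=2, seed=0):
    # Untrained convolutional preprocessing: convolve a 2D image with
    # fixed random filters, apply ReLU, and downsample via the stride.
    # The flattened maps become the reservoir's input features.
    rng = np.random.default_rng(seed)
    filters = rng.normal(size=(n_filters, k, k))
    h, w = image.shape
    out_h, out_w = (h - k) // stride + 1, (w - k) // stride + 1
    maps = np.empty((n_filters, out_h, out_w))
    for f in range(n_filters):
        for i in range(out_h):
            for j in range(out_w):
                patch = image[i * stride:i * stride + k,
                              j * stride:j * stride + k]
                maps[f, i, j] = max(0.0, float(np.sum(patch * filters[f])))
    return maps.reshape(-1)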
Reservoir computing approaches to recurrent neural network training
TLDR: This review systematically surveys both current ways of generating/adapting reservoirs and of training different types of readouts, and offers a natural conceptual classification of the techniques that transcends the boundaries of the current "brand names" of reservoir methods.
Reinforcement learning with convolutional reservoir computing
TLDR: This study proposes a novel practical approach called reinforcement learning with a convolutional reservoir computing (RCRC) model, which uses a fixed random-weight CNN and a reservoir computing model to extract visual and time-series features, and can solve multiple reinforcement learning tasks with a completely identical feature extractor.
Recurrent Kernel Networks
TLDR: This work generalizes convolutional kernel networks to model gaps in sequences, resulting in a new type of recurrent neural network that can be trained end-to-end with backpropagation, or without supervision by using kernel approximation techniques.
Kernel-Based Approaches for Sequence Modeling: Connections to Neural Methods
TLDR: By considering dynamic gating of the memory cell, a model closely related to the long short-term memory (LSTM) recurrent neural network is derived, and its variants perform on par with or even better than traditional neural methods.
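The gated memory-cell update alluded to here has the generic LSTM-style form below. This is a simplified sketch of that recurrence, not the paper's exact kernel construction; the hidden-state dependence of the gates is omitted for brevity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_cell_step(c, x, Wf, bf, Wi, bi, Wc, bc):
    # Dynamically gated memory cell: the forget gate f and input gate i
    # interpolate between the previous cell state and a new candidate.
    f = sigmoid(Wf @ x + bf)        # forget gate
    i = sigmoid(Wi @ x + bi)        # input gate
    c_new = np.tanh(Wc @ x + bc)    # candidate memory
    return f * c + i * c_new        # gated cell update
```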