Corpus ID: 14113946

Reservoir Computing and Self-Organized Neural Hierarchies

@inproceedings{Lukoeviius2012ReservoirCA,
  title={Reservoir Computing and Self-Organized Neural Hierarchies},
  author={Mantas Luko{\vs}evi{\vc}ius},
  year={2012}
}
There is a growing understanding that machine learning architectures have to be much bigger and more complex to approach any intelligent behavior. There is also a growing understanding that purely supervised learning is inadequate to train such systems. A recent paradigm of artificial recurrent neural network (RNN) training under the umbrella name Reservoir Computing (RC) demonstrated that training big recurrent networks (the reservoirs) differently than the supervised readouts from them is often… 
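To make the division of labor concrete: in the echo state network (ESN) flavor of RC, the recurrent reservoir is generated randomly and left fixed, and only a linear readout is trained by supervised (ridge) regression. Below is a minimal sketch of that setup; the reservoir size, scalings, toy sine task, and ridge parameter are illustrative assumptions, not settings from the thesis.

```python
# Minimal ESN sketch: the reservoir is random and untrained; only the
# linear readout is fitted by supervised ridge regression.
import numpy as np

rng = np.random.default_rng(42)
n_in, n_res = 1, 200

# Random input and reservoir weights (fixed after initialization).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in + 1))   # +1 for a bias input
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))          # spectral radius 0.9

def run_reservoir(U):
    """Drive the reservoir with an input sequence U of shape (T, n_in)."""
    x, states = np.zeros(n_res), []
    for u in U:
        x = np.tanh(W_in @ np.concatenate(([1.0], u)) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy supervised task: predict the next value of a sine wave.
t = np.arange(1000)
U = np.sin(0.1 * t)[:, None]
Y = np.sin(0.1 * (t + 1))[:, None]

X = run_reservoir(U)
washout = 100                                      # discard the transient
X, Y = X[washout:], Y[washout:]

# Ridge-regression readout: the only supervised, trained part.
beta = 1e-6
W_out = Y.T @ X @ np.linalg.inv(X.T @ X + beta * np.eye(n_res))

pred = X @ W_out.T
print("train NRMSE:", np.sqrt(np.mean((pred - Y) ** 2)) / np.std(Y))
```

The readout solve is the only supervised step; everything upstream of it could be replaced by a differently generated, e.g. self-organized, reservoir without changing the training procedure.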
Citations

Pre-trainable Reservoir Computing with Recursive Neural Gas
TLDR
An accurate model of Recursive Neural Gas (RNG) is described, together with some extensions to the models presented in the literature; comparative results on three well-known and accepted datasets show that, under specific circumstances, RNG-based reservoirs can achieve better performance.
Self-organized Reservoirs and Their Hierarchies
TLDR
Unsupervised, greedily bottom-up-trained hierarchies of recurrent neural networks, including deep hierarchies, are shown to be capable of large performance improvements over single-layer setups.
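A rough structural sketch of the hierarchy idea: each layer's reservoir is driven by the previous layer's states, built bottom-up, with only a final readout trained supervised. Plain random reservoirs stand in here for the paper's self-organized ones (whose unsupervised training is elided), and all sizes are illustrative assumptions.

```python
# Hedged sketch of a reservoir hierarchy: layer k is driven by the
# states of layer k-1. Random reservoirs are placeholders for the
# paper's self-organized ones; sizes and scalings are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, rho=0.9):
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= rho / max(abs(np.linalg.eigvals(W)))      # set spectral radius
    return W_in, W

def run(W_in, W, U):
    x, states = np.zeros(W.shape[0]), []
    for u in U:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Two-layer hierarchy on a toy signal: layer 1's states feed layer 2.
U = np.sin(0.1 * np.arange(500))[:, None]
L1 = run(*make_reservoir(1, 100), U)
L2 = run(*make_reservoir(100, 100), L1)
```

A supervised linear readout, as in the single-layer sketch above, would then be fitted on the top layer's states (or on the concatenation of all layers' states).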
Evolutionary Echo State Network: evolving reservoirs in the Fourier space
TLDR
A new computational model of the ESN type is proposed that represents the reservoir weights in the Fourier space and fine-tunes these weights by applying genetic algorithms in the frequency domain, thus providing a dimensionality-reduction transformation of the initial method.
Reservoir Computing Trends
TLDR
A brief introduction to basic concepts, methods, insights, current developments, and some applications of RC is given.
Growing Echo-State Network With Multiple Subreservoirs
TLDR
Simulation results show that the proposed GESN has better prediction performance and faster learning speed than some ESNs with fixed sizes and topologies.
A deep learning based approach for analog hardware implementation of delayed feedback reservoir computing system
TLDR
This work presents an analog hardware implementation of the delayed feedback reservoir (DFR) computing model, which not only offers ease of hardware implementation but also enables the optimal performance contributed by the inherent delay and its rich intrinsic dynamics.
Performance optimization of echo state networks through principal neuron reinforcement
TLDR
A neuroplasticity-inspired algorithm is proposed to alter the strength of internal synapses within the reservoir, with the goal of optimizing the neuronal dynamics of the ESN for the specific problem to be solved.
Quantifying the Reservoir Quality using Dimensionality Reduction Techniques
TLDR
A correlation analysis between the input space and the feature map is presented, and a correlation between the Sammon energy and the model accuracy is shown; this can be useful for defining good reservoir topologies.
A Practical Guide to Applying Echo State Networks
TLDR
Practical techniques and recommendations for successfully applying Echo State Networks, as well as some more advanced application-specific modifications, are presented.
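For orientation, the guide's central global hyperparameters are the reservoir's spectral radius, the input scaling, and the leaking rate of the state update. A minimal sketch of the leaky-integrator update that these parameters control follows; all concrete values are illustrative placeholders, not the guide's prescriptions.

```python
# Leaky-integrator ESN state update governed by the main global
# hyperparameters; alpha (leaking rate) and the spectral radius of
# 0.9 below are illustrative placeholders.
import numpy as np

def scale_spectral_radius(W, rho=0.9):
    """Rescale recurrent weights W so their spectral radius equals rho."""
    return W * (rho / max(abs(np.linalg.eigvals(W))))

def leaky_esn_step(x, u, W, W_in, alpha=0.3):
    """x(n) = (1 - alpha) * x(n-1) + alpha * tanh(W_in @ [1; u(n)] + W @ x(n-1))."""
    x_tilde = np.tanh(W_in @ np.concatenate(([1.0], u)) + W @ x)
    return (1.0 - alpha) * x + alpha * x_tilde

# Toy usage: one update step with random weights (illustrative sizes).
rng = np.random.default_rng(1)
W = scale_spectral_radius(rng.uniform(-0.5, 0.5, (50, 50)))
W_in = rng.uniform(-0.5, 0.5, (50, 1 + 1))  # +1 for the bias in [1; u]
x_next = leaky_esn_step(np.zeros(50), np.array([0.7]), W, W_in)
```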
Robust forecasting using predictive generalized synchronization in reservoir computing
TLDR
A method based on predictive generalized synchronization (PGS) is analyzed that gives direction in designing and evaluating the architecture and hyperparameters of an RC. It also provides a metric for evaluating the RC, based on the reproduction of the input system's Lyapunov exponents, that demonstrates robustness in prediction.
…

References

Showing 1-10 of 234 references
On self-organizing reservoirs and their hierarchies
TLDR
This work demonstrates in a rigorous way the advantage of using self-organizing reservoirs over traditional random ones, and of using hierarchies of such over single reservoirs, on a synthetic handwriting-like temporal pattern recognition dataset.
Reservoir computing approaches to recurrent neural network training
Overview of Reservoir Recipes
TLDR
This report motivates the new definition of the paradigm and surveys reservoir generation/adaptation techniques, offering a natural conceptual classification that transcends the boundaries of the current "brand names" of reservoir methods.
Self-organized Reservoirs and Their Hierarchies
TLDR
Unsupervised, greedily bottom-up-trained hierarchies of recurrent neural networks, including deep hierarchies, are shown to be capable of large performance improvements over single-layer setups.
Memory in reservoirs for high dimensional input
TLDR
This paper investigates how the internal state of the network retains a fading memory of its input signal and provides empirical data expressing how memory in recurrent networks is distributed over the individual principal components of the input.
Echo State Networks with Trained Feedbacks
TLDR
This report explores possible directions in which the theoretical findings could be applied to increase the computational power of Echo State Networks and proposes a modification of ESNs called Layered ESNs.
An experimental unification of reservoir computing methods
Recurrent Kernel Machines: Computing with Infinite Echo State Networks
TLDR
The concept of ESNs is extended to infinite-sized recurrent neural networks, which can be considered recursive kernels that can subsequently be used to create recursive support vector machines.
Temporal-Kernel Recurrent Neural Networks
An overview of reservoir computing: theory, applications and implementations
TLDR
This tutorial will give an overview of current research on the theory, applications, and implementations of Reservoir Computing, which makes it possible to solve complex tasks using just linear post-processing techniques.
…