Corpus ID: 64980759

On self-organizing reservoirs and their hierarchies

@inproceedings{Lukosevicius2010OnSR,
  title={On self-organizing reservoirs and their hierarchies},
  author={Mantas Luko{\v{s}}evi{\v{c}}ius},
  year={2010}
}
Current advances in reservoir computing have demonstrated that fixed random recurrent networks with only the readouts trained often outperform fully trained recurrent neural networks. While full supervised training of such networks is problematic, intuitively there should also be something better than a random network. In this contribution we investigate a different approach that lies in between the two. We use reservoirs derived from recursive self-organizing maps that are trained in an…
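The baseline the abstract contrasts against, a fixed random recurrent network with only a trained linear readout, can be summarized in a few lines. The following minimal sketch is illustrative rather than taken from the paper; the network size, spectral-radius scaling, and ridge parameter are arbitrary assumptions:

# Minimal echo-state-style reservoir: fixed random weights, trained readout.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # fixed random input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # fixed random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(np.arange(0, 30, 0.1))[:, None]
X, y = run_reservoir(u[:-1]), u[1:]

# Only the readout is trained, here by closed-form ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print("train MSE:", np.mean((X @ W_out - y) ** 2))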

Citations

Reservoir Computing and Self-Organized Neural Hierarchies
TLDR
This thesis overviews existing alternatives to the classical supervised training of RNNs and their hierarchies, investigates new ones, and proposes the use of two different neural network models for the reservoirs together with several unsupervised adaptation techniques, as well as unsupervisedly layer-wise trained deep hierarchies of such models.
Initializing reservoirs with exhibitory and inhibitory signals using unsupervised learning techniques
TLDR
This paper studies the ESQN model initialization using Self-Organizing Maps, tests the model performance when initializing the reservoir with Hebbian rules, and presents an empirical comparison of these reservoir initializations on a range of time series benchmarks.
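For concreteness, one common form of such an unsupervised initialization (my assumption about the general recipe, not the ESQN paper's exact procedure) fits a small self-organizing map to the training inputs and uses its learned codebook vectors as the reservoir's input weight rows, so the input connections reflect the data distribution rather than pure noise:

# Sketch: 1-D SOM fitted to toy inputs; codebook reused as input weights.
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(size=(500, 3))          # toy training inputs, 3 features
n_units = 20                              # 1-D SOM with 20 units
codebook = rng.normal(size=(n_units, 3))

for epoch in range(10):
    lr = 0.5 * (1 - epoch / 10)                         # decaying learning rate
    radius = max(1.0, (n_units / 2) * (1 - epoch / 10))  # shrinking neighborhood
    for x in data:
        bmu = np.argmin(np.linalg.norm(codebook - x, axis=1))  # best matching unit
        d = np.abs(np.arange(n_units) - bmu)                   # grid distance to BMU
        h = np.exp(-(d ** 2) / (2 * radius ** 2))              # neighborhood function
        codebook += lr * h[:, None] * (x - codebook)           # SOM update

W_in = codebook   # assumed use: data-driven input weight matrix, shape (20, 3)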
Technical report on Hierarchical Reservoir Computing architectures
TLDR
An overview of and references to current approaches in hierarchical reservoir computing are given, several of which have been investigated on speech and handwriting recognition problems in the sister EU project ORGANIC (http://reservoir-computing.org/organic).
Deep self-organizing reservoir computing model for visual object recognition
TLDR
A deep self-organizing reservoir computing model for visual object recognition is proposed, built from a stack of well-trained reservoir layers; it approaches the state-of-the-art result of 1% among existing traditional machine learning approaches with non-CNN features.
Self-Organizing Maps and Scale-Invariant Maps in Echo State Networks
TLDR
The primary goal of this work is to improve the performance of ESNs using another method, the Scale-Invariant Map (SIM); the results show the aptitude of both SIM and SOM for setting the reservoir parameters.
Cluster-based Input Weight Initialization for Echo State Networks
TLDR
This work proposes an unsupervised initialization of the input connections using the K-means algorithm on the training data, and shows that for a large variety of datasets this initialization performs on par with or better than a randomly initialized ESN while needing significantly fewer reservoir neurons.
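A minimal sketch of this idea, under assumed details the summary leaves open (toy data, one cluster per reservoir neuron, scikit-learn's KMeans): cluster the training inputs and use the centroids as the rows of the input weight matrix:

# Sketch: cluster-based input weight initialization for an ESN.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
train_inputs = rng.normal(size=(1000, 4))   # toy data: 1000 samples, 4 features
n_reservoir = 50

km = KMeans(n_clusters=n_reservoir, n_init=10, random_state=0).fit(train_inputs)
W_in = km.cluster_centers_                  # shape (50, 4), replaces a random W_in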
Visualising Temporal Data Using Reservoir Computing
TLDR
An artificial neural network, a variant of echo state networks (ESNs), is created that is optimized for projecting multivariate time series data onto a low-dimensional manifold so that the structure in the time series can be identified by eye.
Factors important for good visualisation of time series
TLDR
This paper reviews work on a minimal-architecture echo state machine (Wang et al., 2011) in the context of visualisation, shows that it does not perform as well as the original, and discusses three factors that may affect the capability of the network: its structure, size, and sparsity.
Minimal Echo State Networks for Visualisation
TLDR
This paper investigates a minimal-architecture echo state machine in the context of visualisation, shows that it does not perform as well as the original, and examines three methods for regaining the power of the standard echo state machine.
...

References

Showing 1-10 of 22 references
Reservoir computing approaches to recurrent neural network training
SORN: A Self-Organizing Recurrent Neural Network
TLDR
This work introduces SORN, a self-organizing recurrent network that combines three distinct forms of local plasticity to learn spatio-temporal patterns in its input while maintaining its dynamics in a healthy regime suitable for learning.
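The three plasticity rules can be sketched compactly. The simplified form below is my own reduction, not the published model (SORN also includes inhibitory units and different constants): spike-timing-dependent plasticity (STDP), synaptic normalization, and intrinsic plasticity that adapts firing thresholds toward a target rate:

# Sketch: three local plasticity rules on binary excitatory units.
import numpy as np

rng = np.random.default_rng(3)
n = 50
W = rng.uniform(0.0, 0.1, (n, n))              # recurrent excitatory weights
np.fill_diagonal(W, 0.0)
T = rng.uniform(0.0, 0.5, n)                   # per-unit firing thresholds
x_prev = (rng.random(n) < 0.1).astype(float)   # initial binary activity
eta_stdp, eta_ip, target_rate = 0.001, 0.01, 0.1

for step in range(1000):
    drive = 0.1 * rng.random(n)                                   # weak external drive
    x = (W @ x_prev + drive - T > 0).astype(float)                # binary unit update
    W += eta_stdp * (np.outer(x, x_prev) - np.outer(x_prev, x))   # STDP
    np.fill_diagonal(W, 0.0)
    W = np.clip(W, 0.0, None)
    W /= np.maximum(W.sum(axis=1, keepdims=True), 1e-12)          # synaptic normalization
    T += eta_ip * (x - target_rate)                               # intrinsic plasticity
    x_prev = x

print("mean firing rate:", x_prev.mean())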
An experimental unification of reservoir computing methods
A Recurrent Self-Organizing Map for Temporal Sequence Processing
TLDR
An unsupervised recurrent neural network based on a self-organizing map is presented and applied to the difficult natural language processing problem of position-variant recognition, e.g. recognising a noun phrase regardless of its position within a sentence.
Recursive self-organizing maps
A learning algorithm for Recurrent Radial Basis Function Networks (M. Mak, Neural Processing Letters, 2006)
TLDR
A Recurrent Radial Basis Function network that can be applied to temporal pattern classification and prediction, and that can approximate a filter more accurately than continually running fully recurrent networks trained by the Real-Time Recurrent Learning algorithm.
Learning long-term dependencies with gradient descent is difficult
TLDR
This work shows why gradient-based learning algorithms face an increasingly difficult problem as the duration of the dependencies to be captured increases, and exposes a trade-off between efficient learning by gradient descent and latching on to information for long periods.
Dynamics and Topographic Organization of Recursive Self-Organizing Maps
TLDR
This work rigorously analyzes a generalization of the self-organizing map (SOM) for processing sequential data, the recursive SOM (RecSOM), as a nonautonomous dynamical system consisting of a set of fixed-input maps, and argues that contractive fixed-input maps are likely to produce Markovian organizations of receptive fields on the RecSOM map.
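The RecSOM activation treated as a fixed-input map in this analysis: each unit i holds an input codebook vector w_i and a context codebook vector c_i over the whole previous activation map, and responds with y_i(t) = exp(-alpha*||s(t) - w_i||^2 - beta*||y(t-1) - c_i||^2). A minimal sketch of this update, with the training rule omitted and all parameter values chosen only for illustration:

# Sketch: one RecSOM activation step driven by a toy input sequence.
import numpy as np

rng = np.random.default_rng(4)
n_units, dim, alpha, beta = 64, 2, 2.0, 0.7
w = rng.normal(size=(n_units, dim))              # input codebook vectors
c = rng.normal(size=(n_units, n_units)) * 0.1    # context codebook vectors

def recsom_step(s, y_prev):
    d_in = np.sum((w - s) ** 2, axis=1)          # input match per unit
    d_ctx = np.sum((c - y_prev) ** 2, axis=1)    # context match per unit
    return np.exp(-alpha * d_in - beta * d_ctx)  # new activation map

y = np.zeros(n_units)
for s in rng.normal(size=(10, dim)):             # feed a toy sequence
    y = recsom_step(s, y)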
Recursive self-organizing network models
...