Corpus ID: 181778004

Time Warping Invariant Echo State Networks

@inproceedings{Lukoeviius2006TimeWI,
  title={Time Warping Invariant Echo State Networks},
  author={Mantas Luko{\vs}evi{\vc}ius and Dan Popovici and Herbert Jaeger and Udo Siewert},
  year={2006}
}
Echo State Networks (ESNs) are a recent, simple, and powerful approach to training recurrent neural networks (RNNs). In this report we present a modification of ESNs, time warping invariant echo state networks (TWIESNs), that can effectively deal with time warping in dynamic pattern recognition. The standard approach to classifying time-warped input signals is to align them to candidate prototype patterns by a dynamic programming method and to use the alignment cost as a classification criterion. In…
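For orientation, here is a minimal sketch of the standard ESN setup that TWIESN builds on: a fixed, randomly generated reservoir is driven by the input, and only a linear readout is trained. The reservoir size, spectral radius, tanh nonlinearity, and ridge-regression readout below are common choices assumed for illustration, not the paper's exact configuration.

```python
import numpy as np

# Minimal echo state network sketch (illustrative assumptions throughout).
rng = np.random.default_rng(0)
n_in, n_res = 1, 100          # input and reservoir dimensions (assumed)
spectral_radius = 0.9         # common choice to encourage the echo state property

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # rescale reservoir weights

def run_reservoir(u_seq):
    """Drive the fixed reservoir with an input sequence and collect the states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Train only the linear readout, here by ridge regression."""
    X = states
    return np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ targets)

# Usage: fit a readout to reproduce a target signal from a driving input.
u = np.sin(np.linspace(0, 8 * np.pi, 400))
y = np.cos(np.linspace(0, 8 * np.pi, 400))
S = run_reservoir(u)
w_out = train_readout(S, y)
print("training MSE:", np.mean((S @ w_out - y) ** 2))
```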
Time Series Classification Using Time Warping Invariant Echo State Networks
TLDR
This paper applies time warping invariant Echo State Networks (ESNs) to time-series classification tasks using datasets from various studies in the UCR archive and investigates the influence of the ESN architecture and the network's spectral radius in view of general characteristics of the data.
Echo State Networks with Trained Feedbacks
TLDR
This report explores possible directions in which the theoretical findings could be applied to increase the computational power of Echo State Networks and proposes a modification of ESNs called Layered ESNs.
Time-Adaptive Recurrent Neural Networks
TLDR
It is demonstrated empirically that these models can effectively compensate for the time-non-uniformity of the data and demonstrate that they compare favorably to data resampling, classical RNN methods, and alternative RNN models proposed to deal with time irregularities on several real-world nonuniform-time datasets.
Echo state networks with double-reservoir for time-series prediction
In this paper, a novel model, named double-reservoir echo state networks (DR-ESN), is proposed. DR-ESN is constructed from two reservoirs connected in series; thus, the performance of…
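One plausible reading of "two reservoirs connected in series", sketched under assumptions (the first reservoir is driven by the input, the second by the first reservoir's state, and a readout would be trained on the second reservoir's states); the actual DR-ESN wiring may differ.

```python
import numpy as np

# Hypothetical sketch of two reservoirs in series; not the paper's exact DR-ESN.
rng = np.random.default_rng(1)

def make_reservoir(n_in, n_res, rho=0.9):
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= rho / max(abs(np.linalg.eigvals(W)))  # scale to the chosen spectral radius
    return W_in, W

W_in1, W1 = make_reservoir(n_in=1, n_res=50)
W_in2, W2 = make_reservoir(n_in=50, n_res=50)   # second reservoir driven by the first

def run_series(u_seq):
    x1 = np.zeros(50)
    x2 = np.zeros(50)
    states = []
    for u in u_seq:
        x1 = np.tanh(W_in1 @ np.atleast_1d(u) + W1 @ x1)  # first reservoir sees the input
        x2 = np.tanh(W_in2 @ x1 + W2 @ x2)                # second reservoir sees x1
        states.append(x2.copy())
    return np.array(states)  # a linear readout would be trained on these states
```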
A new criterion for echo state property of ESN with feedback
TLDR
A new criterion for the echo state property with output feedback is proposed for ESNs, which can increase the influence of previous inputs on the current state and slow the rate at which their effect vanishes.
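For context, "output feedback" means the reservoir update also receives the previous output. Below is a minimal sketch of that update rule, with assumed names and weight scales; it illustrates only the feedback term, not the criterion proposed in this paper.

```python
import numpy as np

# ESN state update with output feedback (illustrative sketch only).
rng = np.random.default_rng(2)
n_res = 50
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_fb = rng.uniform(-0.5, 0.5, (n_res, 1))   # feedback weights from the output

def step(x, u, y_prev):
    """One update: the previous output y_prev is fed back into the reservoir."""
    return np.tanh(W_in @ np.atleast_1d(u) + W @ x + W_fb @ np.atleast_1d(y_prev))
```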
Parameters relation of the Leaky integrator echo state network for time series
TLDR
This paper studies the relationship between reservoir size, reservoir sparsity, and modeling accuracy through simulation experiments, giving a qualitative description of the relationship and a quantitative one obtained by least-squares fitting.
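As a generic illustration of the fitting step, here is a least-squares fit of a simple power law to hypothetical accuracy-versus-size points; the numbers are placeholders for illustration only, not results from the paper.

```python
import numpy as np

# Hypothetical placeholder points: reservoir size vs. modeling error (NOT paper data).
sizes = np.array([50, 100, 200, 400, 800])
errors = np.array([0.30, 0.18, 0.11, 0.07, 0.05])

# Least-squares fit of a power law error ~ a * size^b via a linear fit in log space.
b, log_a = np.polyfit(np.log(sizes), np.log(errors), 1)
print(f"fitted: error ~ {np.exp(log_a):.3f} * size^{b:.2f}")
```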

References

SHOWING 1-6 OF 6 REFERENCES
Time Warping Invariant Neural Networks
TLDR
Analysis has shown that TWINN completely removes time warping and is able to handle difficult classification problems, and it has certain advantages over currently available sequential processing schemes.
Time-Warping Network: A Hybrid Framework for Speech Recognition
TLDR
A time-warping neuron is defined that extends the operation of the formal neuron of a back-propagation network by warping the input pattern to match it optimally to its weights.
Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
TLDR
A new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks, based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication
We present a method for learning nonlinear systems, echo state networks (ESNs). ESNs employ artificial recurrent neural networks in a way that has recently been proposed independently as a learning…
Minimum prediction residual principle applied to speech recognition
TLDR
A computer system is described in which isolated words, spoken by a designated talker, are recognized through calculation of a minimum prediction residual, obtained by optimally registering the reference LPC coefficients onto the input autocorrelation coefficients using a dynamic programming algorithm.
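The dynamic-programming alignment used here (and referred to in the abstract above as the standard approach) is essentially dynamic time warping. Below is a minimal sketch, assuming a plain Euclidean frame distance rather than the LPC prediction residual the paper actually minimizes.

```python
import numpy as np

def dtw_cost(a, b):
    """Alignment cost between sequences a and b via dynamic programming
    (classic dynamic time warping; Euclidean frame distance assumed)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            D[i, j] = d + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Usage: classify a test signal by the prototype with the lowest alignment cost.
prototypes = {"ramp": np.linspace(0, 1, 30), "flat": np.zeros(30)}
test = np.linspace(0, 1, 45)                 # time-warped (stretched) ramp
label = min(prototypes, key=lambda k: dtw_cost(test, prototypes[k]))
print(label)                                  # -> "ramp"
```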
A Tutorial on Hidden Markov Models and Selected Applications
This tutorial provides an overview of the basic theory of hidden Markov models, gives practical details on methods of implementing the theory, and describes selected applications of hidden Markov models to problems in speech recognition.