• Corpus ID: 196621796

Dynamical Systems as Temporal Feature Spaces

@article{Tio2019DynamicalSA,
  title={Dynamical Systems as Temporal Feature Spaces},
  author={Peter Tiňo},
  journal={J. Mach. Learn. Res.},
  year={2019},
  volume={21},
  pages={44:1-44:42}
}
  • P. Tiňo
  • Published 15 July 2019
  • Computer Science
  • J. Mach. Learn. Res.
Parameterized state space models in the form of recurrent networks are often used in machine learning to learn from data streams exhibiting temporal dependencies. To break the black box nature of such models it is important to understand the dynamical features of the input driving time series that are formed in the state space. We propose a framework for rigorous analysis of such state representations in vanishing memory state space models such as echo state networks (ESN). In particular, we… 
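
To make the setting concrete, a minimal echo state network state update can be sketched as follows. This is an illustrative NumPy sketch, not the paper's construction; the reservoir size, spectral radius, and input scaling are arbitrary placeholder values.

    import numpy as np

    rng = np.random.default_rng(0)
    N, rho, in_scale = 100, 0.9, 0.5            # placeholder reservoir size, spectral radius, input scaling
    W = rng.standard_normal((N, N))
    W *= rho / max(abs(np.linalg.eigvals(W)))   # rescale so the spectral radius is rho < 1
    w_in = in_scale * rng.standard_normal(N)    # input weights for a scalar input stream

    def run_reservoir(u):
        """Drive the reservoir with a scalar input sequence u; return the state trajectory."""
        x = np.zeros(N)
        states = []
        for u_t in u:
            x = np.tanh(W @ x + w_in * u_t)     # vanishing-memory state update
            states.append(x.copy())
        return np.array(states)

    X = run_reservoir(rng.standard_normal(200)) # states formed by a random input stream, shape (200, N)

The rows of X are the temporal feature vectors whose geometry such analyses study; in the ESN methodology only a linear readout on top of these states is trained.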

Understanding Recurrent Neural Networks Using Nonequilibrium Response Theory

  • S. H. Lim
  • Computer Science
    J. Mach. Learn. Res.
  • 2021
This work derives a Volterra type series representation for a class of continuous-time stochastic RNNs (SRNNs) driven by an input signal and shows that the SRNNs can be viewed as kernel machines operating on a reproducing kernel Hilbert space associated with the response feature.
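
For orientation, the generic Volterra series of a causal, time-invariant functional of an input signal u has the form below; the kernels h_n correspond to the response features mentioned in the summary, though the paper's stochastic setting is not reproduced here.

    y(t) = h_0 + \sum_{n=1}^{\infty} \int_0^{\infty} \!\cdots\! \int_0^{\infty}
           h_n(\tau_1,\dots,\tau_n)\, u(t-\tau_1)\cdots u(t-\tau_n)\, \mathrm{d}\tau_1 \cdots \mathrm{d}\tau_n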

Input-to-State Representation in Linear Reservoirs Dynamics

A novel analysis of the dynamics of recurrent neural networks is proposed that expresses the state evolution in terms of the controllability matrix and explains why a cyclic topology achieves favorable results.

Hybrid Backpropagation Parallel Reservoir Networks

A novel hybrid network is proposed that combines the random temporal features learned by reservoirs with the readout power of a deep neural network with batch normalization; it outperforms LSTMs and GRUs, including multi-layer "deep" versions of these networks, on two complex real-world multi-dimensional time series datasets.

Input representation in recurrent neural networks dynamics

A novel analysis of the dynamics of recurrent neural networks is proposed that allows one to express the state evolution using the controllability matrix, making it possible to compare different architectures and to explain why a cyclic topology achieves favourable results.
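
The analysis referred to in this entry (and in "Input-to-State Representation in Linear Reservoirs Dynamics" above) can be stated in one line: for a linear reservoir x_t = W x_{t-1} + w u_t started from the zero state, unrolling the recursion shows that the state is the controllability (Krylov) matrix applied to the recent input window, which is what allows different reservoir topologies, including the cyclic one, to be compared. In generic notation (not the papers' exact formulation):

    x_t = \sum_{k=0}^{t-1} W^{k} w\, u_{t-k}
        = \big[\, w \;\; W w \;\; W^{2} w \;\; \cdots \;\; W^{t-1} w \,\big]
          \begin{bmatrix} u_t \\ u_{t-1} \\ \vdots \\ u_1 \end{bmatrix}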

Stacked Residuals of Dynamic Layers for Time Series Anomaly Detection

An end-to-end differentiable neural network architecture performs anomaly detection in multivariate time series by applying a Sequential Probability Ratio Test to the prediction residuals; it outperforms both state-of-the-art robust statistical methods and deep neural network architectures on multiple anomaly detection benchmarks.
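
As a hedged sketch of the detection step only (not the paper's architecture), a Gaussian Sequential Probability Ratio Test on prediction residuals accumulates a log-likelihood ratio between a nominal and an inflated residual variance and raises an alarm when it crosses Wald's upper threshold; the variances and error rates below are placeholder values.

    import numpy as np

    def sprt_on_residuals(residuals, sigma0=1.0, sigma1=3.0, alpha=0.01, beta=0.01):
        """Wald SPRT on prediction residuals: H0 ~ N(0, sigma0^2) vs H1 ~ N(0, sigma1^2)."""
        upper = np.log((1 - beta) / alpha)   # crossing this accepts H1 (anomaly)
        lower = np.log(beta / (1 - alpha))   # crossing this accepts H0 (nominal)
        llr = 0.0
        for t, r in enumerate(residuals):
            # log-likelihood ratio increment of a zero-mean Gaussian residual
            llr += np.log(sigma0 / sigma1) + 0.5 * r**2 * (1.0 / sigma0**2 - 1.0 / sigma1**2)
            if llr >= upper:
                return t, "anomaly"
            if llr <= lower:
                llr = 0.0                    # accept H0 and restart the test
        return None, "no decision"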

Memory and forecasting capacities of nonlinear recurrent networks

Deep Randomized Neural Networks

This chapter surveys the major aspects of the design and analysis of Randomized Neural Networks, together with some of the key results on their approximation capabilities.

Predicting critical transitions in multiscale dynamical systems using reservoir computing.

This work presents a data-driven method to predict the future evolution of the state of a class of slow-fast nonlinear dynamical systems and shows that it is capable of predicting a critical transition event at least several numerical time steps in advance.

Euler State Networks

Experiments on synthetic tasks indicate the marked superiority of the proposed approach over standard RC models in tasks requiring long-term memorization skills, and results on real-world time series classification benchmarks show that EuSN can match (or even surpass) the accuracy of trainable Recurrent Neural Networks.

Parametric Validation of the Reservoir Computing-Based Machine Learning Algorithm Applied to Lorenz System Reconstructed Dynamics

A detailed parametric analysis is presented, where it is observed that the prediction capabilities of the reservoir computing approach strongly depend on the random initialization of both the input and the reservoir layers.

References

Showing 1-10 of 59 references

Memory traces in dynamical systems

The Fisher Memory Curve is introduced as a measure of the signal-to-noise ratio (SNR) embedded in the dynamical state relative to the input SNR, and the generality of the theory is illustrated by showing that memory in fluid systems can be sustained by transient nonnormal amplification due to convective instability or the onset of turbulence.
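
For reference, in the linear network studied there, x_t = W x_{t-1} + v s_t + z_t with i.i.d. Gaussian state noise, the Fisher Memory Curve measures the Fisher information the stationary state carries about a pulse injected k steps in the past; in a commonly quoted form (normalisations vary between papers) it reads

    J(k) = v^{\top} \left(W^{k}\right)^{\top} C^{-1} W^{k} v,
    \qquad
    C = \varepsilon \sum_{m=0}^{\infty} W^{m} \left(W^{m}\right)^{\top},

where v is the input weight vector and C the stationary noise covariance of the state.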

Echo State Property Linked to an Input: Exploring a Fundamental Characteristic of Recurrent Neural Networks

A new definition of the echo state property is presented that directly connects it to temporal or statistical properties of the driving input, and a fundamental 0-1 law is derived: if the input comes from an ergodic source, the network response has the echo state property with probability one or zero, independent of the given network.

Nonlinear System Modeling With Random Matrices: Echo State Networks Revisited

It is shown that the state transition mapping is contractive with high probability when only the necessary condition is satisfied, which corroborates and thus analytically explains the observation that in practice one obtains echo states when the spectral radius of the reservoir weight matrix is smaller than 1.

Minimum Complexity Echo State Network

It is shown that a simple deterministically constructed cycle reservoir is comparable to the standard echo state network methodology, and that the (short-term) memory capacity of linear cyclic reservoirs can be made arbitrarily close to the proved optimal value.
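
A sketch of the kind of deterministic construction meant here: the reservoir weight matrix is a single directed cycle with one shared weight, and the input weights share a single absolute value with fixed signs. The weight values and the alternating sign pattern below are placeholders; the paper generates the signs from a deterministic aperiodic sequence.

    import numpy as np

    def cycle_reservoir(n, r=0.9, v=0.5):
        """Simple cycle reservoir: one cycle weight r, input weights of magnitude v."""
        W = np.zeros((n, n))
        for i in range(n):
            W[(i + 1) % n, i] = r                            # unit i feeds unit i+1 around one cycle
        signs = np.where(np.arange(n) % 2 == 0, 1.0, -1.0)   # placeholder sign pattern
        return W, v * signs

    W, w_in = cycle_reservoir(100)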

Short-term memory in neuronal networks through dynamical compressed sensing

This work exploits techniques from the statistical physics of disordered systems to analytically compute the decay of memory traces in such networks as a function of network size, signal sparsity and integration time.

Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

A new computational model for real-time computing on time-varying input provides an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.

Asymptotic Fisher memory of randomized linear symmetric Echo State Networks

Information Processing Capacity of Dynamical Systems

The theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis to define the computational capacity of a dynamical system.
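
In that framework, the capacity with which a driven system reconstructs a target function z of the input history is the normalised reduction in mean squared error achieved by the best linear readout of the state; loosely, and with notation simplified from the paper,

    C[z] = 1 - \frac{\min_{W}\; \mathrm{MSE}\!\left[\hat{z}_{W}\right]}{\langle z^{2} \rangle},

and the total capacity, summed over an orthogonal family of targets, is bounded by the number of linearly independent state variables.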

Modeling reward functions for incomplete state representations via echo state networks

  • K. Bush, C. Anderson
  • Computer Science
    Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
  • 2005
This research demonstrates that the ESN architecture represents the Q-function of the MSD system given incomplete state information as well as current feed-forward neural networks do when given either perfect state information or a temporally-windowed, incomplete state vector.
...