Chaos on compact manifolds: Differentiable synchronizations beyond the Takens theorem.
@article{Grigoryeva2020ChaosOC,
  title   = {Chaos on compact manifolds: Differentiable synchronizations beyond the Takens theorem},
  author  = {Lyudmila Grigoryeva and Allen G. Hart and Juan-Pablo Ortega},
  journal = {Physical Review E},
  year    = {2021},
  volume  = {103},
  number  = {6},
  pages   = {062204}
}
This paper shows that a large class of fading-memory state-space systems driven by discrete-time observations of dynamical systems defined on compact manifolds always yields continuously differentiable synchronizations. This general result provides a powerful tool for the representation, reconstruction, and forecasting of chaotic attractors. It also improves previous statements in the literature on differentiable generalized synchronizations, whose existence was so far guaranteed for a…
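As a concrete illustration of the kind of synchronization the paper establishes, the following minimal sketch (all architecture and parameter choices are illustrative assumptions, not the paper's construction) drives a random contracting echo state network with a scalar observation of the Lorenz system and fits a ridge readout approximating the synchronization map from reservoir states back to the full attractor state:

```python
import numpy as np

rng = np.random.default_rng(0)

# Integrate the Lorenz system with a simple Euler scheme (illustrative only).
def lorenz_traj(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x = np.empty((n, 3))
    x[0] = (1.0, 1.0, 1.0)
    for t in range(n - 1):
        dx = np.array([
            sigma * (x[t, 1] - x[t, 0]),
            x[t, 0] * (rho - x[t, 2]) - x[t, 1],
            x[t, 0] * x[t, 1] - beta * x[t, 2],
        ])
        x[t + 1] = x[t] + dt * dx
    return x

n, dim = 6000, 300
traj = lorenz_traj(n)
obs = traj[:, 0]                                  # scalar observation: first coordinate

# Random contracting reservoir: r_{t+1} = tanh(A r_t + c * obs_t).
A = rng.normal(size=(dim, dim)) / np.sqrt(dim)
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))   # spectral radius below 1
c = rng.normal(size=dim)
r = np.zeros((n, dim))
for t in range(n - 1):
    r[t + 1] = np.tanh(A @ r[t] + c * obs[t])

# Ridge readout approximating the synchronization map: reservoir state -> full state.
washout = 500
R, X = r[washout:], traj[washout:]
W = np.linalg.solve(R.T @ R + 1e-6 * np.eye(dim), R.T @ X)
print("reconstruction RMSE:", np.sqrt(np.mean((R @ W - X) ** 2)))
```

A small reconstruction error here is the practical signature of a generalized synchronization: the reservoir state determines the driving state through a (here linearly approximated) map.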
7 Citations
Reservoir kernels and Volterra series
- Mathematics · ArXiv
- 2022
A universal kernel is constructed whose sections approximate any causal and time-invariant filter in the fading memory category with inputs and outputs in a finite-dimensional Euclidean space. This…
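As a hedged stand-in (not the universal kernel constructed in the cited paper), the sketch below defines a simple fading-memory kernel on finite input histories, with exponentially discounted dependence on the remote past, and uses it in kernel ridge regression; all sizes and constants are assumptions:

```python
import numpy as np

def fading_memory_kernel(u, v, lam=0.7, gamma=0.1):
    """Gaussian-type kernel on input histories u, v (most recent entry last),
    with exponentially discounted influence of the remote past. An illustrative
    fading-memory kernel, not the paper's construction."""
    m = min(len(u), len(v))
    w = lam ** np.arange(m)                        # weight decays into the past
    d = (np.asarray(u)[-m:][::-1] - np.asarray(v)[-m:][::-1]) ** 2
    return np.exp(-gamma * np.sum(w * d))

# Kernel ridge regression predicting the next value of a scalar signal
# from sliding windows of its past.
rng = np.random.default_rng(1)
s = np.sin(np.linspace(0, 20, 200)) + 0.05 * rng.normal(size=200)
m = 10
X = [s[i:i + m] for i in range(len(s) - m)]
y = s[m:]
K = np.array([[fading_memory_kernel(a, b) for b in X] for a in X])
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(X)), y)
print("train RMSE:", np.sqrt(np.mean((K @ alpha - y) ** 2)))
```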
Generalised Synchronisations, Embeddings, and Approximations for Continuous Time Reservoir Computers
- Mathematics, Computer Science · SSRN Electronic Journal
- 2022
It is shown that if the observations are perturbed by white noise, the GS is preserved up to a perturbation by an Ornstein-Uhlenbeck process.
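The Ornstein-Uhlenbeck process appearing in this statement is straightforward to simulate, which makes the claim numerically checkable; a minimal Euler-Maruyama sketch (all parameter values assumed):

```python
import numpy as np

# Euler-Maruyama simulation of an Ornstein-Uhlenbeck process
# dX_t = -theta * X_t dt + sigma * dW_t.
rng = np.random.default_rng(2)
theta, sigma, dt, n = 1.0, 0.3, 0.01, 10_000
x = np.zeros(n)
for t in range(n - 1):
    x[t + 1] = x[t] - theta * x[t] * dt + sigma * np.sqrt(dt) * rng.normal()

# The empirical standard deviation should approach sigma / sqrt(2 * theta).
print(x[n // 2:].std(), sigma / np.sqrt(2 * theta))
```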
Matrix cocycles in the learning of dynamical systems
- Computer Science
- 2022
A mathematical framework in which the information is in the form of an embedding provides the platform for two further investigations of the reconstructed system: its dynamical stability and the growth of error under iteration.
Learning theory for dynamical systems
- Computer Science
- 2022
This work presents a mathematical framework in which the dynamical information is represented in the form of an embedding, bridging the gap between the universally observed behavior of dynamics modelling and the spectral, differential, and ergodic properties intrinsic to the dynamics.
Dynamics and Information Import in Recurrent Neural Networks
- Computer Science · Frontiers in Computational Neuroscience
- 2022
A new type of resonance phenomenon, called “Import Resonance” (IR), is found, in which the information import shows a maximum, i.e., a peak-like dependence on the coupling strength between the RNN and its external input.
A Systematic Exploration of Reservoir Computing for Forecasting Complex Spatiotemporal Dynamics
- Computer Science · Neural Networks
- 2022
Dimension reduction in recurrent networks by canonicalization
- Computer Science, Mathematics · Journal of Geometric Mechanics
- 2021
The classical notion of canonical state-space realization is adapted in this paper to accommodate semi-infinite inputs, so that it can be used as a dimension reduction tool in the recurrent network setup.
References
Showing 1–10 of 50 references
Embedding Nonlinear Dynamical Systems: A Guide to Takens' Theorem
- Mathematics
- 2006
The embedding theorem forms a bridge between the theory of nonlinear dynamical systems and the analysis of experimental time series. This memorandum describes the theorem and gives a detailed account…
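The delay-coordinate map at the heart of Takens' theorem is simple to implement; a minimal sketch (window dimension and lag are illustrative choices):

```python
import numpy as np

def delay_embed(s, dim, tau):
    """Delay-coordinate map of a scalar series s: row t is
    (s_T, s_{T-tau}, ..., s_{T-(dim-1)*tau}) for T = t + (dim-1)*tau."""
    n = len(s) - (dim - 1) * tau
    return np.column_stack([s[(dim - 1 - k) * tau:(dim - 1 - k) * tau + n]
                            for k in range(dim)])

s = np.sin(np.linspace(0, 30, 1000))   # toy observable; in practice, a measured series
E = delay_embed(s, dim=3, tau=5)
print(E.shape)                         # (990, 3)
```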
Stability and memory-loss go hand-in-hand: three results in dynamics and computation
- Computer Science · Proceedings of the Royal Society A
- 2020
Surprisingly, memory loss, the ability of driven systems to forget their internal states, helps provide unambiguous answers to fundamental stability questions that have been open for decades.
Learn to Synchronize, Synchronize to Learn
- Computer Science · Chaos
- 2021
This work studies the properties behind learning dynamical systems with reservoir computing (RC) and proposes a new guiding principle based on Generalized Synchronization (GS) that grants its feasibility. It shows that the well-known Echo State Property (ESP) implies, and is implied by, GS, so that theoretical results derived from the ESP still hold when GS does.
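The Echo State Property can be probed numerically by driving two copies of the same reservoir, started from different initial states, with a common input and checking that their states converge; a minimal sketch with an assumed random tanh reservoir:

```python
import numpy as np

rng = np.random.default_rng(3)
dim, n = 200, 1000
A = rng.normal(size=(dim, dim)) / np.sqrt(dim)
A *= 0.8 / np.max(np.abs(np.linalg.eigvals(A)))   # contracting regime (assumed)
c = rng.normal(size=dim)
u = rng.normal(size=n)                            # common input sequence

r1 = rng.normal(size=dim)                         # two different initial states
r2 = rng.normal(size=dim)
for t in range(n):
    r1 = np.tanh(A @ r1 + c * u[t])
    r2 = np.tanh(A @ r2 + c * u[t])

# If the ESP holds, the state gap vanishes regardless of initial conditions.
print("final state gap:", np.linalg.norm(r1 - r2))
```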
Transfer learning for nonlinear dynamics and its application to fluid turbulence.
- Physics · Physical Review E
- 2020
A surprisingly small amount of learning is enough to infer the energy dissipation rate of Navier-Stokes turbulence because, thanks to the small-scale universality of turbulence, a large amount of the knowledge learned from turbulence data at lower Reynolds numbers can be transferred.
Invertible generalized synchronization: A putative mechanism for implicit learning in neural systems.
- Biology · Chaos
- 2020
A general and biologically feasible learning framework that utilizes invertible generalized synchronization (IGS) is proposed, supporting the notion that biological neural networks can learn the dynamic nature of their environment through the mechanism of IGS.
Echo State Networks trained by Tikhonov least squares are L2(μ) approximators of ergodic dynamical systems
- Mathematics · Physica D: Nonlinear Phenomena
- 2021
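The Tikhonov (ridge) least-squares step named in this title has a closed form, W = (RᵀR + λI)⁻¹RᵀY; a minimal sketch on synthetic data (dimensions and regularization strength are illustrative):

```python
import numpy as np

def tikhonov_readout(R, Y, lam=1e-6):
    """Tikhonov (ridge) least squares: argmin_W ||R W - Y||^2 + lam * ||W||^2,
    the standard closed-form readout training step for echo state networks."""
    return np.linalg.solve(R.T @ R + lam * np.eye(R.shape[1]), R.T @ Y)

# Toy check: recover a planted linear readout from noisy targets.
rng = np.random.default_rng(4)
R = rng.normal(size=(500, 50))
W_true = rng.normal(size=(50, 2))
Y = R @ W_true + 0.01 * rng.normal(size=(500, 2))
print(np.max(np.abs(tikhonov_readout(R, Y, lam=1e-3) - W_true)))  # small error
```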
Risk bounds for reservoir computing
- Computer Science · ArXiv
- 2019
Finite-sample upper bounds are derived, in the framework of statistical learning theory, for the generalization error committed by specific families of reservoir computing systems when processing discrete-time inputs under various hypotheses on their dependence structure.
The reservoir's perspective on generalized synchronization.
- Mathematics · Chaos
- 2019
The reservoir model reproduces different levels of consistency where there is no synchronization and extracts signatures of the maximal conditional Lyapunov exponent from the performance of variations of the reservoir topology.
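The maximal conditional Lyapunov exponent mentioned here can be estimated with a Benettin-style replica test: evolve the driven reservoir and an infinitesimally perturbed copy under the same drive and average the log growth rate of the gap, renormalizing at each step. A minimal sketch under assumed parameters (a negative value indicates generalized synchronization with the drive):

```python
import numpy as np

rng = np.random.default_rng(5)
dim, n, eps = 100, 5000, 1e-8
A = rng.normal(size=(dim, dim)) / np.sqrt(dim)
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))   # contracting regime (assumed)
c = rng.normal(size=dim)
u = rng.normal(size=n)                            # the common drive

r = rng.normal(size=dim)
d = rng.normal(size=dim)
d *= eps / np.linalg.norm(d)                      # infinitesimal perturbation
log_growth = 0.0
for t in range(n):
    r_new = np.tanh(A @ r + c * u[t])
    p_new = np.tanh(A @ (r + d) + c * u[t])
    d = p_new - r_new
    log_growth += np.log(np.linalg.norm(d) / eps)
    d *= eps / np.linalg.norm(d)                  # renormalize each step
    r = r_new

# Average log growth rate of the perturbation, conditioned on the drive.
print("max conditional Lyapunov exponent ~", log_growth / n)
```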