Learning dynamical systems from data: a simple cross-validation perspective

@article{Hamzi2020LearningDS,
  title={Learning dynamical systems from data: a simple cross-validation perspective},
  author={Boumediene Hamzi and Houman Owhadi},
  journal={ArXiv},
  year={2020},
  volume={abs/2111.13037}
}
Regressing the vector field of a dynamical system from a finite number of observed states is a natural way to learn surrogate models for such systems. We present variants of cross-validation (Kernel Flows [31], together with versions of it based on the Maximum Mean Discrepancy and on Lyapunov exponents) as simple approaches for learning the kernel used in these emulators.
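
To make the approach concrete, here is a minimal sketch (an illustration under stated assumptions, not the authors' implementation): it fits a kernel ridge regression emulator of the one-step map x_{n+1} = f(x_n) from a toy trajectory and selects the kernel bandwidth by minimizing the Kernel Flows criterion of [31], rho = 1 - ||Y_c||^2_{K_c} / ||Y||^2_K, where c indexes a random half of the samples. The logistic-map data, the Gaussian (RBF) kernel, the bandwidth grid, and the names rbf_kernel, kf_rho, and f_hat are all assumptions made for this example.

import numpy as np

def rbf_kernel(A, B, sigma):
    # Gaussian (RBF) kernel matrix between row-stacked point sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kf_rho(X, Y, sigma, rng, reg=1e-6):
    # Kernel Flows criterion rho = 1 - ||Y_c||^2_{K_c} / ||Y||^2_K, where c is
    # a random half of the samples: small rho means the half sample already
    # explains the full sample well under the current kernel.
    n = len(X)
    c = rng.choice(n, n // 2, replace=False)
    K = rbf_kernel(X, X, sigma) + reg * np.eye(n)
    Kc = rbf_kernel(X[c], X[c], sigma) + reg * np.eye(len(c))
    num = np.trace(Y[c].T @ np.linalg.solve(Kc, Y[c]))
    den = np.trace(Y.T @ np.linalg.solve(K, Y))
    return 1.0 - num / den

# Toy observed states: a logistic-map trajectory stands in for real data.
N = 200
traj = np.empty((N + 1, 1))
traj[0] = 0.31
for n in range(N):
    traj[n + 1] = 3.9 * traj[n] * (1.0 - traj[n])
X, Y = traj[:-1], traj[1:]  # training pairs (x_n, x_{n+1})

# Pick the bandwidth by averaging the criterion over random halves, then fit
# the kernel ridge emulator f_hat with the selected kernel.
rng = np.random.default_rng(0)
sigmas = np.logspace(-2, 1, 20)
rhos = [np.mean([kf_rho(X, Y, s, rng) for _ in range(10)]) for s in sigmas]
sigma_star = sigmas[int(np.argmin(rhos))]
alpha = np.linalg.solve(rbf_kernel(X, X, sigma_star) + 1e-6 * np.eye(len(X)), Y)
f_hat = lambda x: rbf_kernel(np.atleast_2d(x), X, sigma_star) @ alpha
print("selected bandwidth:", sigma_star, "| one-step prediction at 0.5:", f_hat(0.5))

In the paper's setting the criterion would typically be optimized over richer kernel parameterizations, and the Maximum Mean Discrepancy and Lyapunov-exponent variants replace rho with different comparison statistics; the grid search above is only the simplest instance of the same cross-validation idea.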

Citations

Learning dynamical systems from data: A simple cross-validation perspective, part I: Parametric kernel flows
Learning to Forecast Dynamical Systems from Streaming Data
TLDR
This paper proposes a streaming algorithm for KAF that only requires a single pass over the training data, which dramatically reduces the costs of training and prediction without sacrificing forecasting skill.
Gaussian processes meet NeuralODEs: A Bayesian framework for learning the dynamics of partially observed systems from scarce and noisy data
TLDR
The proposed GP-NODE method takes advantage of recent developments in differentiable programming to propagate gradient information through ordinary differential equation solvers and perform Bayesian inference with respect to unknown model parameters using Hamiltonian Monte Carlo sampling and Gaussian Process priors over the observed system states.
Neural Network Approximations of Compositional Functions With Applications to Dynamical Systems
TLDR
An algebraic framework and an approximation theory for compositional functions and their neural network approximations are developed, and several error upper bounds are proved for neural networks that approximate the solutions to problems in differential equations, optimization, and optimal control.
Data-driven modelling of nonlinear dynamics by polytope projections and memory
We present a numerical method to model dynamical systems from data. We use the recently introduced method Scalable Probabilistic Approximation (SPA) to project points from a Euclidean space to convex …
Data-driven modelling of nonlinear dynamics by barycentric coordinates and memory
We present a numerical method to model dynamical systems from data. We use the recently introduced method Scalable Probabilistic Approximation (SPA) to project points from a Euclidean space to convex …
Kernel Mode Decomposition and programmable/interpretable regression networks
TLDR
A framework for programmable and interpretable regression networks for pattern recognition is introduced, with mode decomposition addressed as a prototypical problem; the structure of some of these networks shares intriguing similarities with convolutional neural networks while being interpretable, programmable, and amenable to theoretical analysis.
Predicting the impact of treatments over time with uncertainty aware neural differential equations
TLDR
This work proposes Counterfactual ODE (CF-ODE), a novel method for predicting the impact of treatments continuously over time using Neural Ordinary Differential Equations equipped with uncertainty estimates, and demonstrates on several longitudinal data sets that CF-ODE provides more accurate predictions and more reliable uncertainty estimates than previously available methods.

References

SHOWING 1-10 OF 71 REFERENCES
Learning dynamical systems from data: A simple cross-validation perspective, part I: Parametric kernel flows
Kernel Methods for the Approximation of Nonlinear Systems
TLDR
A data-driven order reduction method for nonlinear control systems is presented, drawing on recent progress in machine learning and statistical dimensionality reduction; it leads to a closed, reduced-order dynamical system that captures the essential input-output characteristics of the original model.
Balanced reduction of nonlinear control systems in reproducing kernel Hilbert space
  • J. Bouvrie, B. Hamzi
  • Mathematics, Computer Science
    2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • 2010
TLDR
A novel data-driven order reduction method for nonlinear control systems is presented, drawing on recent progress in machine learning and statistical dimensionality reduction; it leads to a closed, reduced-order dynamical system that captures the essential input-output characteristics of the original model.
Attractor reconstruction by machine learning.
TLDR
A theoretical framework is presented that describes conditions under which reservoir computing can create an empirical model capable of skillful short-term forecasts and accurate long-term ergodic behavior; it is argued that the theory also applies to certain other machine learning methods for time series prediction.
Dynamical Modeling with Kernels for Nonlinear Time Series Prediction
TLDR
This work considers the problem of predicting nonlinear time series and proposes Kernel Dynamical Modeling, a new kernel-based method with a strong connection to the classic Kalman filter model, in which the kernel feature space serves as the hidden state space.
Nonlinear prediction of chaotic time series
Simple, low-cost and accurate data-driven geophysical forecasting with learned kernels
TLDR
The proposed approach is general, and the results support the viability of kernel methods (with learned kernels) for interpretable and computationally efficient geophysical forecasting for a large diversity of processes.
Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data.
TLDR
This work uses recent advances in the machine learning area known as "reservoir computing" to formulate a method for model-free estimation of the Lyapunov exponents of a chaotic process from data, based on forming a modified, autonomous reservoir.
Operator-theoretic framework for forecasting nonlinear time series with kernel analog techniques