# Learning dynamical systems from data: a simple cross-validation perspective

    @article{Hamzi2020LearningDS,
      title   = {Learning dynamical systems from data: a simple cross-validation perspective},
      author  = {Boumediene Hamzi and Houman Owhadi},
      journal = {ArXiv},
      year    = {2020},
      volume  = {abs/2111.13037}
    }

Regressing the vector field of a dynamical system from a finite number of observed states is a natural way to learn surrogate models for such systems. We present variants of cross-validation (Kernel Flows [31] and its variants based on Maximum Mean Discrepancy and Lyapunov exponents) as simple approaches for learning the kernel used in these emulators.
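To make the idea concrete, here is a minimal sketch (not the authors' actual Kernel Flows algorithm) of selecting a kernel by simple cross-validation for a one-step emulator: kernel ridge regression of the map x_{n+1} = f(x_n), with the RBF bandwidth chosen by hold-out error. The logistic-map data, bandwidth grid, and function names are illustrative assumptions, not from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel matrix between the rows of X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_predict(X_train, Y_train, X_test, gamma, reg=1e-8):
    # Kernel ridge regression of the one-step map x_{n+1} = f(x_n)
    K = rbf_kernel(X_train, X_train, gamma)
    alpha = np.linalg.solve(K + reg * np.eye(len(X_train)), Y_train)
    return rbf_kernel(X_test, X_train, gamma) @ alpha

# Observed trajectory of a simple dynamical system (logistic map as a stand-in)
x = np.empty(200)
x[0] = 0.4
for n in range(199):
    x[n + 1] = 3.8 * x[n] * (1 - x[n])
X, Y = x[:-1, None], x[1:, None]  # (state, next state) pairs

# Simple cross-validation over kernel bandwidths: hold out part of the data
idx = np.random.default_rng(0).permutation(len(X))
tr, va = idx[:100], idx[100:]
errs = {g: float(np.mean((fit_predict(X[tr], Y[tr], X[va], g) - Y[va]) ** 2))
        for g in [0.1, 1.0, 10.0, 100.0]}
best_gamma = min(errs, key=errs.get)
```

Kernel Flows and the MMD/Lyapunov-exponent variants discussed in the paper replace this naive hold-out error with more refined criteria, but the structure — learn the kernel by minimizing a validation discrepancy, then emulate with the learned kernel — is the same.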

## 14 Citations

Learning dynamical systems from data: A simple cross-validation perspective, part I: Parametric kernel flows

- Computer Science · Physica D: Nonlinear Phenomena
- 2021

Learning to Forecast Dynamical Systems from Streaming Data

- Computer Science · ArXiv
- 2021

This paper proposes a streaming algorithm for kernel analog forecasting (KAF) that requires only a single pass over the training data, dramatically reducing the cost of training and prediction without sacrificing forecasting skill.

Gaussian processes meet NeuralODEs: A Bayesian framework for learning the dynamics of partially observed systems from scarce and noisy data

- Computer Science · ArXiv
- 2021

The proposed GP-NODE method takes advantage of recent developments in differentiable programming to propagate gradient information through ordinary differential equation solvers and perform Bayesian inference with respect to unknown model parameters using Hamiltonian Monte Carlo sampling and Gaussian Process priors over the observed system states.

Neural Network Approximations of Compositional Functions With Applications to Dynamical Systems

- Computer Science · ArXiv
- 2020

An algebraic framework and an approximation theory for compositional functions and their neural network approximations are developed, and several upper bounds are proved on the error of neural networks that approximate the solutions to differential equations, optimization, and optimal control problems.

Data-driven modelling of nonlinear dynamics by polytope projections and memory

- Mathematics
- 2021

We present a numerical method to model dynamical systems from data. We use the recently introduced method Scalable Probabilistic Approximation (SPA) to project points from a Euclidean space to convex…

Data-driven modelling of nonlinear dynamics by barycentric coordinates and memory

- Mathematics
- 2021

We present a numerical method to model dynamical systems from data. We use the recently introduced method Scalable Probabilistic Approximation (SPA) to project points from a Euclidean space to convex…

Kernel Mode Decomposition and programmable/interpretable regression networks

- Computer Science · ArXiv
- 2019

This paper introduces a framework for programmable and interpretable regression networks for pattern recognition, addressing mode decomposition as a prototypical problem; the structure of some of these networks shares intriguing similarities with convolutional neural networks while being interpretable, programmable, and amenable to theoretical analysis.

Predicting the impact of treatments over time with uncertainty aware neural differential equations

- Psychology, Computer Science · AISTATS
- 2022

This work proposes Counterfactual ODE (CF-ODE), a novel method for predicting the impact of treatments continuously over time using neural ordinary differential equations equipped with uncertainty estimates. Across several longitudinal data sets, CF-ODE provides more accurate predictions and more reliable uncertainty estimates than previously available methods.

## References

Showing 1–10 of 71 references

Learning dynamical systems from data: A simple cross-validation perspective, part I: Parametric kernel flows

- Computer Science · Physica D: Nonlinear Phenomena
- 2021

Kernel Methods for the Approximation of Nonlinear Systems

- Computer Science, Mathematics · SIAM J. Control. Optim.
- 2017

A data-driven order reduction method for nonlinear control systems, drawing on recent progress in machine learning and statistical dimensionality reduction, which leads to a closed, reduced order dynamical system which captures the essential input-output characteristics of the original model.

Balanced reduction of nonlinear control systems in reproducing kernel Hilbert space

- Mathematics, Computer Science · 2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
- 2010

A novel data-driven order reduction method for nonlinear control systems, drawing on recent progress in machine learning and statistical dimensionality reduction, which leads to a closed, reduced order dynamical system which captures the essential input-output characteristics of the original model.

Attractor reconstruction by machine learning.

- Computer Science · Chaos
- 2018

A theoretical framework is presented that describes conditions under which reservoir computing can create an empirical model capable of skillful short-term forecasts and accurate long-term ergodic behavior and argues that the theory applies to certain other machine learning methods for time series prediction.

Dynamical Modeling with Kernels for Nonlinear Time Series Prediction

- Computer Science · NIPS
- 2003

This work considers the question of predicting nonlinear time series and proposes Kernel Dynamical Modeling, a new method based on kernels, which shows strong connection with the classic Kalman Filter model, with the kernel feature space as hidden state space.

Simple, low-cost and accurate data-driven geophysical forecasting with learned kernels

- Computer Science · Proceedings of the Royal Society A
- 2021

The proposed approach is general, and the results support the viability of kernel methods (with learned kernels) for interpretable and computationally efficient geophysical forecasting for a large diversity of processes.

Data-driven approximation of the Koopman generator: Model reduction, system identification, and control

- Computer Science · Physica D: Nonlinear Phenomena
- 2020

Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data.

- Computer Science, Geology · Chaos
- 2017

This work uses recent advances in the machine learning area known as "reservoir computing" to formulate a method for model-free estimation, from data, of the Lyapunov exponents of a chaotic process, based on a modified autonomous reservoir.

Operator-theoretic framework for forecasting nonlinear time series with kernel analog techniques

- Mathematics, Computer Science · Physica D: Nonlinear Phenomena
- 2020