Deep learning of dynamics and signal-noise decomposition with time-stepping constraints

@article{Rudy2019DeepLO,
  title={Deep learning of dynamics and signal-noise decomposition with time-stepping constraints},
  author={Samuel H. Rudy and J. Nathan Kutz and Steven L. Brunton},
  journal={J. Comput. Phys.},
  year={2019},
  volume={396},
  pages={483-506}
}
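As a rough illustration of the method named in the title, the sketch below trains a neural-network vector field through an explicit Runge-Kutta time-stepping constraint while a noise term is learned jointly, decomposing the measurements into signal and noise. All network sizes, hyperparameters, and variable names are illustrative assumptions, not the authors' implementation.

    import torch

    def rk4_step(f, x, dt):
        # Classical explicit fourth-order Runge-Kutta step for dx/dt = f(x).
        k1 = f(x)
        k2 = f(x + 0.5 * dt * k1)
        k3 = f(x + 0.5 * dt * k2)
        k4 = f(x + dt * k3)
        return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    m, d, dt = 200, 3, 0.01                        # samples, state dimension, time step (illustrative)
    y = torch.randn(m, d)                          # noisy measurements (placeholder data)
    f = torch.nn.Sequential(torch.nn.Linear(d, 64), torch.nn.Tanh(), torch.nn.Linear(64, d))
    noise = torch.zeros(m, d, requires_grad=True)  # learned noise, so x = y - noise is the signal
    opt = torch.optim.Adam(list(f.parameters()) + [noise], lr=1e-3)

    for _ in range(2000):
        opt.zero_grad()
        x = y - noise                              # signal-noise decomposition
        # Time-stepping constraint: de-noised states must be consistent with one RK4 step of f.
        loss = torch.mean((x[1:] - rk4_step(f, x[:-1], dt)) ** 2)
        loss = loss + 1e-3 * noise.pow(2).mean()   # mild penalty discouraging the trivial all-noise solution (illustrative)
        loss.backward()
        opt.step()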

Neural ODEs with Irregular and Noisy Data

TLDR
The proposed framework for learning a model of the vector field is highly effective under noisy measurements and can handle scenarios where the dependent variables are not available on the same temporal grid.

The Discovery of Dynamics via Linear Multistep Methods and Deep Learning: Error Estimation

TLDR
This work considers deep network-based LMMs for the discovery of dynamics and, using the approximation property of deep networks, indicates for certain families of LMMs that the ℓ2 grid error is bounded by the sum of O(h) and the network approximation error.
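For context on what is being analyzed: LMM-based discovery of dynamics typically trains a network vector field by penalizing the residual of a linear multistep formula on the observed trajectory. A minimal sketch using the two-step Adams-Bashforth scheme follows; the scheme choice, network size, and data are illustrative assumptions, not details taken from the paper.

    import torch

    d, h = 3, 0.01                   # state dimension and grid spacing (illustrative)
    x = torch.randn(100, d)          # observed states on a uniform grid (placeholder data)
    f = torch.nn.Sequential(torch.nn.Linear(d, 64), torch.nn.Tanh(), torch.nn.Linear(64, d))
    opt = torch.optim.Adam(f.parameters(), lr=1e-3)

    for _ in range(2000):
        opt.zero_grad()
        # Two-step Adams-Bashforth residual: x_{n+2} - x_{n+1} - h*(3/2 f(x_{n+1}) - 1/2 f(x_n))
        resid = x[2:] - x[1:-1] - h * (1.5 * f(x[1:-1]) - 0.5 * f(x[:-2]))
        loss = resid.pow(2).mean()   # squared multistep residual averaged over the grid
        loss.backward()
        opt.step()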

Learning Fine Scale Dynamics from Coarse Observations via Inner Recurrence

  • V. Churchill, D. Xiu
  • Computer Science
    Journal of Machine Learning for Modeling and Computing
  • 2022
TLDR
This paper presents a computational technique to learn the fine-scale dynamics from such coarsely observed data and employs inner recurrence of a DNN to recover the fine-scale evolution operator of the underlying system.

Active operator inference for learning low-dimensional dynamical-system models from noisy data

TLDR
This work builds on operator inference from scientific machine learning to infer low-dimensional models from high-dimensional state trajectories polluted with noise and shows that, under certain conditions, the inferred operators are unbiased estimators of the well-studied projection-based reduced operators from traditional model reduction.

Neural Dynamical Systems: Balancing Structure and Flexibility in Physical Prediction

We introduce Neural Dynamical Systems (NDS), a method of learning dynamical models in various gray-box settings which incorporates prior knowledge in the form of systems of ordinary differential equations.

LQResNet: A Deep Neural Network Architecture for Learning Dynamic Processes

TLDR
This work suggests combining operator inference with certain deep neural network approaches to infer the unknown nonlinear dynamics of a system and demonstrates that the proposed methodology accomplishes the desired tasks for dynamic processes encountered in neural dynamics and the glycolytic oscillator.

Discovery of nonlinear dynamical systems using a Runge–Kutta inspired dictionary-based sparse regression approach

TLDR
This work combines machine learning and dictionary-based learning with numerical analysis tools to discover differential equations from noisy and sparsely sampled measurement data of time-dependent processes, and extends the method to governing equations containing rational nonlinearities, which typically appear in biological networks.

Learning Low-Dimensional Quadratic-Embeddings of High-Fidelity Nonlinear Dynamics using Deep Learning

TLDR
This work leverages deep learning to identify low-dimensional quadratic embeddings for high-fidelity dynamical systems and embeds a Runge-Kutta method to avoid the time-derivative computations.
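A minimal sketch of an architecture in this spirit, assuming an autoencoder to a low-dimensional latent state with quadratic latent dynamics advanced by an embedded Runge-Kutta step, so no time derivatives of the data are required; all dimensions, layer sizes, and names are illustrative assumptions rather than the authors' settings.

    import torch

    d, r, dt = 128, 4, 0.01                      # full state dim, latent dim, time step (illustrative)
    enc = torch.nn.Sequential(torch.nn.Linear(d, 64), torch.nn.ELU(), torch.nn.Linear(64, r))
    dec = torch.nn.Sequential(torch.nn.Linear(r, 64), torch.nn.ELU(), torch.nn.Linear(64, d))
    A = torch.nn.Linear(r, r, bias=False)        # linear part of the latent dynamics
    H = torch.nn.Linear(r * r, r, bias=False)    # quadratic part acting on the flattened outer product z ⊗ z

    def latent_rhs(z):
        # Quadratic latent dynamics: dz/dt = A z + H (z ⊗ z)
        zz = (z.unsqueeze(-1) * z.unsqueeze(-2)).flatten(-2)
        return A(z) + H(zz)

    def rk4_step(g, z, dt):
        # Classical RK4 step, applied in the latent space instead of differentiating the data.
        k1 = g(z)
        k2 = g(z + 0.5 * dt * k1)
        k3 = g(z + 0.5 * dt * k2)
        k4 = g(z + dt * k3)
        return z + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    x = torch.randn(500, d)                      # snapshots of the high-fidelity state (placeholder data)
    params = list(enc.parameters()) + list(dec.parameters()) + list(A.parameters()) + list(H.parameters())
    opt = torch.optim.Adam(params, lr=1e-3)
    for _ in range(1000):
        opt.zero_grad()
        z = enc(x)
        loss = (dec(z) - x).pow(2).mean()                                       # reconstruction error
        loss = loss + (z[1:] - rk4_step(latent_rhs, z[:-1], dt)).pow(2).mean()  # latent time-stepping consistency
        loss.backward()
        opt.step()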

Deep subspace encoders for continuous-time state-space identification

TLDR
Using established properties of ODEs, it is proved that a Lipschitz continuous state derivative is a necessary condition for the existence of the encoder function, and the novel estimation method, called the subspace encoder approach, is presented.

Automatic differentiation to simultaneously identify nonlinear dynamics and extract noise probability distributions from data

TLDR
A variant of the SINDy algorithm is developed that integrates automatic differentiation and the recent time-stepping constraints motivated by Rudy et al.; it can learn a diversity of probability distributions for the measurement noise, including Gaussian, uniform, Gamma, and Rayleigh distributions.
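A rough sketch in the spirit of such a SINDy variant, assuming a small polynomial candidate library, L1-penalized coefficients, and a jointly learned noise term tied to a Runge-Kutta time-stepping constraint (the learned noise samples could then be inspected to characterize their distribution); names and hyperparameters are illustrative assumptions, not the published algorithm.

    import torch

    def library(x):
        # Candidate functions: constant, linear, and quadratic terms (a small SINDy-style dictionary).
        ones = torch.ones(x.shape[0], 1)
        quad = (x.unsqueeze(-1) * x.unsqueeze(-2)).flatten(1)
        return torch.cat([ones, x, quad], dim=1)

    def rk4_step(g, x, dt):
        # Classical explicit fourth-order Runge-Kutta step.
        k1 = g(x)
        k2 = g(x + 0.5 * dt * k1)
        k3 = g(x + 0.5 * dt * k2)
        k4 = g(x + dt * k3)
        return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    m, d, dt = 300, 3, 0.01
    y = torch.randn(m, d)                            # noisy measurements (placeholder data)
    p = library(torch.zeros(1, d)).shape[1]          # number of candidate functions
    Xi = torch.zeros(p, d, requires_grad=True)       # sparse coefficients of the dynamics
    noise = torch.zeros(m, d, requires_grad=True)    # learned noise; inspect its histogram after training
    opt = torch.optim.Adam([Xi, noise], lr=1e-3)

    for _ in range(2000):
        opt.zero_grad()
        x = y - noise
        rhs = lambda s: library(s) @ Xi              # dx/dt ≈ Θ(x) Ξ
        loss = (x[1:] - rk4_step(rhs, x[:-1], dt)).pow(2).mean()
        loss = loss + 1e-3 * Xi.abs().mean()         # L1 penalty promoting a sparse model
        loss.backward()
        opt.step()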
...

References


Extracting structured dynamical systems using sparse optimization with very few samples

TLDR
A random sampling method is proposed for learning structured dynamical systems from under-sampled and possibly noisy state-space measurements; a Bernstein-like inequality for partly dependent random variables yields theoretical guarantees on the recovery rate of the sparse coefficients and on the identification of the candidate functions for the corresponding problem.

Deep Hidden Physics Models: Deep Learning of Nonlinear Partial Differential Equations

  • M. Raissi
  • Computer Science
    J. Mach. Learn. Res.
  • 2018
TLDR
This work puts forth a deep learning approach for discovering nonlinear partial differential equations from scattered and potentially noisy observations in space and time by approximating the unknown solution as well as the nonlinear dynamics with two deep neural networks.

Deep learning for universal linear embeddings of nonlinear dynamics

TLDR
It is often advantageous to transform a strongly nonlinear system into a linear one in order to simplify its analysis for prediction and control, so the authors combine dynamical systems with deep learning to identify these hard-to-find transformations.

Extracting Sparse High-Dimensional Dynamics from Limited Data

TLDR
This work details three sampling strategies that lead to exact recovery of first-order dynamical systems when fewer samples than unknowns are given; the recovery is stable with respect to the sparse structure of the governing equations and robust to noise in the estimation of the velocity.

Hidden physics models: Machine learning of nonlinear partial differential equations

Sparse reduced-order modelling: sensor-based dynamics to full-state estimation

TLDR
A grey-box modelling strategy is successfully applied to the transient and post-transient laminar cylinder wake, and compares favourably with a proper orthogonal decomposition model.

Physics Informed Deep Learning (Part I): Data-driven Solutions of Nonlinear Partial Differential Equations

TLDR
This two-part treatise introduces physics-informed neural networks, neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations, and demonstrates how these networks can be used to infer solutions to partial differential equations and to obtain physics-informed surrogate models that are fully differentiable with respect to all input coordinates and free parameters.

Constrained sparse Galerkin regression

TLDR
The sparse identification of nonlinear dynamics (SINDy) is a recently proposed data-driven modelling framework that uses sparse regression techniques to identify nonlinear low-order models; this work extends it to enforce physical constraints in the regression, e.g. energy-preserving quadratic nonlinearities.

Data-Driven Discovery of Closure Models

TLDR
This work presents a framework of operator inference to extract the governing dynamics of closure from data in a compact, non-Markovian form and examines observability of the closure in terms of the resolved dynamics.

Dynamic mode decomposition - data-driven modeling of complex systems

TLDR
The first book to address the DMD algorithm, it presents a pedagogical and comprehensive approach to all aspects of DMD currently developed or under development, and blends theoretical development, example codes, and applications to showcase the theory and its many innovations and uses.
...