DeepMoD: Deep learning for model discovery in noisy data

@article{Both2021DeepMoDDL,
  title={DeepMoD: Deep learning for model discovery in noisy data},
  author={Gert-Jan Both and Subham Choudhury and Pierre Sens and Remy Kusters},
  journal={J. Comput. Phys.},
  year={2021},
  volume={428},
  pages={109985}
}


Bayesian Deep Learning for Partial Differential Equation Parameter Discovery with Sparse and Noisy Data
TLDR
This paper proposes to use Bayesian neural networks (BNN) in order to recover the full system states from measurement data, and uses Hamiltonian Monte-Carlo to sample the posterior distribution of a deep and dense BNN, showing that it is possible to accurately capture physics of varying complexity, without overfitting.
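For intuition only, here is a minimal Hamiltonian Monte-Carlo sketch in Python; it is not the paper's implementation, and it samples the posterior of a toy one-parameter Bayesian linear model rather than a deep and dense BNN. The data, priors, step size, and leapfrog length are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2.0 * x + noise; HMC samples the posterior over the slope w.
x = np.linspace(-1.0, 1.0, 20)
y = 2.0 * x + 0.5 * rng.normal(size=x.size)
sigma, tau = 0.5, 2.0  # assumed noise std and prior std

def neg_log_post(w):
    # Gaussian likelihood plus Gaussian prior, up to an additive constant.
    return 0.5 * np.sum((y - w * x) ** 2) / sigma**2 + 0.5 * w**2 / tau**2

def grad_neg_log_post(w):
    return -np.sum((y - w * x) * x) / sigma**2 + w / tau**2

def hmc_step(w, eps=0.05, n_leapfrog=20):
    p = rng.normal()                                        # fresh momentum
    w_new, p_new = w, p - 0.5 * eps * grad_neg_log_post(w)  # half momentum step
    for i in range(n_leapfrog):                             # leapfrog integration
        w_new = w_new + eps * p_new
        if i < n_leapfrog - 1:
            p_new = p_new - eps * grad_neg_log_post(w_new)
    p_new = p_new - 0.5 * eps * grad_neg_log_post(w_new)    # final half step
    h_old = neg_log_post(w) + 0.5 * p**2                    # Metropolis correction
    h_new = neg_log_post(w_new) + 0.5 * p_new**2
    return w_new if rng.random() < np.exp(h_old - h_new) else w

w, samples = 0.0, []
for _ in range(2000):
    w = hmc_step(w)
    samples.append(w)
print("posterior mean of the slope:", np.mean(samples[500:]))  # roughly 2.0
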
PDE-READ: Human-readable Partial Differential Equation Discovery using Deep Learning
TLDR
This work introduces a new approach for PDE discovery that uses two Rational Neural Networks and a principled sparse regression algorithm to identify the hidden dynamics that govern a system’s response.
Fully differentiable model discovery
TLDR
This paper starts by reinterpreting PINNs as multitask models and applying multitask learning with uncertainty-based weighting, shows that this leads to a natural framework for including Bayesian regression techniques, and builds a robust model discovery algorithm using sparse Bayesian learning (SBL).
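As a rough sketch of the uncertainty-based multitask weighting idea (the standard log-variance formulation, not necessarily the paper's exact loss), the snippet below combines two task losses with trainable weights; the loss values and parameter names are placeholders.

import torch

# Trainable log-variances, one per task (e.g. data fit and PDE residual).
log_vars = torch.zeros(2, requires_grad=True)

def multitask_loss(task_losses, log_vars):
    # Each task loss is weighted by exp(-log_var); the additive log_var term
    # keeps the learned weights from collapsing to zero.
    total = torch.tensor(0.0)
    for loss, log_var in zip(task_losses, log_vars):
        total = total + torch.exp(-log_var) * loss + log_var
    return total

# Illustrative call with dummy per-task losses.
mse_data, mse_residual = torch.tensor(0.8), torch.tensor(0.3)
print(multitask_loss([mse_data, mse_residual], log_vars))

In a full training loop, log_vars would simply be passed to the optimizer alongside the network parameters so that the task weighting is learned jointly.
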
Sparsistent Model Discovery
TLDR
It is shown that the adaptive Lasso has a better chance of satisfying the irrepresentability condition (IRC) than the standard Lasso, and it is proposed to integrate it within a deep learning model discovery framework with stability selection and error control.
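A common way to obtain adaptive-Lasso estimates with off-the-shelf tools is to rescale the library columns by initial coefficient estimates, as sketched below; the library matrix, target, and regularisation strength are dummy placeholders, not the paper's setup.

import numpy as np
from sklearn.linear_model import LinearRegression, Lasso

rng = np.random.default_rng(0)
Theta = rng.normal(size=(200, 10))                       # dummy candidate-term library
true_coef = np.zeros(10)
true_coef[[1, 4]] = [0.5, -1.0]
u_t = Theta @ true_coef + 0.05 * rng.normal(size=200)    # time-derivative target

# Step 1: an initial non-sparse estimate, e.g. ordinary least squares.
beta_init = LinearRegression(fit_intercept=False).fit(Theta, u_t).coef_

# Step 2: adaptive weights w_j = 1/|beta_init_j|, applied by rescaling the columns,
# so each candidate term gets its own penalty strength.
weights = 1.0 / (np.abs(beta_init) + 1e-8)
lasso = Lasso(alpha=0.01, fit_intercept=False).fit(Theta / weights, u_t)
adaptive_coef = lasso.coef_ / weights                    # map back to the original scale
print(np.round(adaptive_coef, 3))

The column rescaling is equivalent to giving each term the penalty weight 1/|beta_init_j|, which is what makes the estimator adaptive: weakly supported terms are penalised harder and driven to exactly zero.
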
Deep learning of physical laws from scarce data
TLDR
This work introduces a novel physics-informed deep learning framework to discover governing partial differential equations (PDEs) from scarce and noisy data for nonlinear spatiotemporal systems and shows the potential for closed-form model discovery in practical applications where large and accurate datasets are intractable to capture.
Learning Dynamics from Noisy Measurements using Deep Learning with a Runge-Kutta Constraint
TLDR
The proposed approach provides a promising methodology to learn dynamic models, where the first-principle understanding remains opaque, by learning a neural network that implicitly represents the data and an additional network that models the vector fields of the dependent variables.
Physics-informed learning of governing equations from scarce data
TLDR
The efficacy and robustness of this method are demonstrated, both numerically and experimentally, on discovering a variety of partial differential equation systems with different levels of data scarcity and noise accounting for different initial/boundary conditions.
Discovering PDEs from Multiple Experiments
TLDR
A randomised adaptive group Lasso sparsity estimator is introduced to promote grouped sparsity and is implemented in a deep learning based PDE discovery framework, creating a learning bias that encodes the a priori assumption that all experiments can be explained by the same underlying PDE terms with potentially different coefficients.
Robust discovery of partial differential equations in complex situations
TLDR
Results prove that the proposed R-DLGA is able to calculate derivatives accurately with the optimization of PINN and possesses surprising robustness to complex situations, including sparse data with high noise, high-order derivatives, and shock waves.
Automatic differentiation to simultaneously identify nonlinear dynamics and extract noise probability distributions from data
TLDR
A variant of the SINDy algorithm is developed that integrates automatic differentiation and recent time-stepping constraints motivated by Rudy et al.; it can learn a diversity of probability distributions for the measurement noise, including Gaussian, uniform, Gamma, and Rayleigh distributions.
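For reference, the baseline SINDy selection step (sequentially thresholded least squares) looks roughly like the sketch below on a toy damped-oscillator system; this is the standard algorithm, not the paper's automatic-differentiation and noise-modelling variant, and the system, library, and threshold are illustrative choices.

import numpy as np

# Simulate a damped oscillator dx/dt = A x and build a polynomial candidate
# library, then recover the dynamics with sequentially thresholded least squares.
dt, n = 0.01, 5000
A = np.array([[-0.1, 2.0], [-2.0, -0.1]])
X = np.zeros((n, 2))
X[0] = [2.0, 0.0]
for k in range(n - 1):                       # simple Euler integration
    X[k + 1] = X[k] + dt * X[k] @ A.T
dX = np.gradient(X, dt, axis=0)              # numerical time derivatives

x, y = X[:, 0], X[:, 1]
Theta = np.column_stack([np.ones(n), x, y, x**2, x * y, y**2])

def stlsq(Theta, dX, threshold=0.05, n_iter=10):
    Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(Xi) < threshold        # prune small coefficients
        Xi[small] = 0.0
        for j in range(dX.shape[1]):          # refit on the surviving terms
            big = ~small[:, j]
            Xi[big, j] = np.linalg.lstsq(Theta[:, big], dX[:, j], rcond=None)[0]
    return Xi

print(np.round(stlsq(Theta, dX), 3))          # rows: 1, x, y, x^2, x*y, y^2
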

References

Showing 1-10 of 16 references
Physics Informed Deep Learning (Part II): Data-driven Discovery of Nonlinear Partial Differential Equations
We introduce physics informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations.
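Schematically, data-driven discovery with a physics-informed network couples a data-fitting loss with a PDE residual whose unknown coefficients are trained jointly. The PyTorch sketch below is only an illustration of that coupling (a Burgers'-type residual with a single unknown coefficient, a small illustrative architecture, and placeholder data), not the authors' code.

import torch
import torch.nn as nn

# A small network u(t, x); the architecture is an illustrative choice.
net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
nu = nn.Parameter(torch.tensor(0.5))        # unknown PDE coefficient, trained jointly

def pde_residual(t, x):
    u = net(torch.stack([t, x], dim=1)).squeeze(-1)
    u_t, u_x = torch.autograd.grad(u, (t, x), grad_outputs=torch.ones_like(u),
                                   create_graph=True)
    u_xx = torch.autograd.grad(u_x, x, grad_outputs=torch.ones_like(u_x),
                               create_graph=True)[0]
    return u_t + u * u_x - nu * u_xx        # Burgers'-type residual (illustrative)

# Placeholder measurement points and observations (replace with real data).
t = torch.rand(256).requires_grad_(True)
x = (torch.rand(256) * 2 - 1).requires_grad_(True)
u_obs = (torch.sin(torch.pi * x) * torch.exp(-t)).detach()

opt = torch.optim.Adam(list(net.parameters()) + [nu], lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    u_pred = net(torch.stack([t, x], dim=1)).squeeze(-1)
    loss = ((u_pred - u_obs) ** 2).mean() + (pde_residual(t, x) ** 2).mean()
    loss.backward()
    opt.step()
print("coefficient after a few steps:", float(nu))  # converges with real data and longer training
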
Stability selection enables robust learning of partial differential equations from limited noisy data
TLDR
This work proposes a stability-based model selection scheme to determine the level of regularization required for reproducible recovery of the underlying PDE. In particular, it shows that combining stability selection with the iterative hard-thresholding algorithm from compressed sensing provides a fast, parameter-free, and robust computational framework for PDE inference that outperforms previous algorithmic approaches in recovery accuracy, the amount of data required, and robustness to noise.
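A generic stability-selection loop of this flavour (repeated subsampling plus a hard-thresholded least-squares fit, keeping terms whose selection frequency is high) can be sketched as below; the library matrix, subsample size, and thresholds are dummy placeholders rather than the paper's settings.

import numpy as np

rng = np.random.default_rng(2)
Theta = rng.normal(size=(300, 8))                  # dummy candidate-term library
coef = np.array([0.0, 1.0, 0.0, -0.7, 0.0, 0.0, 0.0, 0.0])
u_t = Theta @ coef + 0.1 * rng.normal(size=300)    # time-derivative target

def hard_threshold_fit(Theta, y, threshold=0.2, n_iter=5):
    beta = np.linalg.lstsq(Theta, y, rcond=None)[0]
    for _ in range(n_iter):
        keep = np.abs(beta) >= threshold           # iterative hard thresholding
        beta = np.zeros_like(beta)
        if keep.any():
            beta[keep] = np.linalg.lstsq(Theta[:, keep], y, rcond=None)[0]
    return beta

# Stability selection: refit on random subsamples and record how often each
# candidate term is selected.
n_resamples, counts = 100, np.zeros(8)
for _ in range(n_resamples):
    idx = rng.choice(300, size=150, replace=False)
    counts += hard_threshold_fit(Theta[idx], u_t[idx]) != 0
selection_prob = counts / n_resamples
print(np.round(selection_prob, 2))                 # stable terms have probability near 1
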
Data-driven discovery of partial differential equations
TLDR
The sparse regression method is computationally efficient, robust, and demonstrated to work on a variety of canonical problems spanning a number of scientific domains including Navier-Stokes, the quantum harmonic oscillator, and the diffusion equation.
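As a compact illustration of this style of sparse regression, the sketch below builds a finite-difference candidate library from synthetic diffusion-equation data and recovers u_t = 0.5 u_xx with thresholded least squares; the data, candidate terms, and threshold are illustrative, and the paper's own algorithm is more elaborate (e.g. its sequential thresholding and handling of noisy derivatives).

import numpy as np

# Synthetic data for u_t = 0.5 * u_xx (two Fourier modes so the library columns
# are not collinear), then a finite-difference library and a thresholded fit.
nx, nt, D = 128, 400, 0.5
x = np.linspace(0.0, 2.0 * np.pi, nx, endpoint=False)
t = np.linspace(0.0, 1.0, nt)
U = (np.exp(-D * t)[:, None] * np.sin(x)[None, :]
     + 0.5 * np.exp(-4.0 * D * t)[:, None] * np.sin(2.0 * x)[None, :])

dx, dt = x[1] - x[0], t[1] - t[0]
U_t = np.gradient(U, dt, axis=0)
U_x = np.gradient(U, dx, axis=1)
U_xx = np.gradient(U_x, dx, axis=1)

# Candidate library: [1, u, u_x, u_xx, u*u_x]
Theta = np.column_stack([np.ones(U.size), U.ravel(), U_x.ravel(),
                         U_xx.ravel(), (U * U_x).ravel()])
y = U_t.ravel()

coef = np.linalg.lstsq(Theta, y, rcond=None)[0]
coef[np.abs(coef) < 0.05] = 0.0                    # prune small terms
keep = coef != 0
coef[keep] = np.linalg.lstsq(Theta[:, keep], y, rcond=None)[0]   # refit survivors
print(dict(zip(["1", "u", "u_x", "u_xx", "u*u_x"], np.round(coef, 3))))
# Only the u_xx term should survive, with a coefficient near 0.5.
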
Deep learning and process understanding for data-driven Earth system science
TLDR
It is argued that contextual cues should be used as part of deep learning to gain further process understanding of Earth system science problems, improving the predictive ability of seasonal forecasting and modelling of long-range spatial connections across multiple timescales.
A neural network approach for the blind deconvolution of turbulent flows
TLDR
The proposed blind deconvolution network performs exceptionally well in the a priori testing of two-dimensional Kraichnan, three-dimensional Kolmogorov and compressible stratified turbulence test cases, and shows promise in forming the backbone of a physics-augmented data-driven closure for the Navier–Stokes equations.
Inferring Biological Networks by Sparse Identification of Nonlinear Dynamics
TLDR
This method, implicit-SINDy, succeeds in inferring three canonical biological models: 1) Michaelis-Menten enzyme kinetics; 2) the regulatory network for competence in bacteria; and 3) the metabolic network for yeast glycolysis.
InferenceMAP: mapping of single-molecule dynamics with Bayesian inference
TLDR
InferenceMAP is an interactive software tool that uses a powerful Bayesian technique to extract the parameters that describe the motion of individual molecules from single-molecule trajectories and presents relevant applications inside lipid rafts, glycine receptors, and HIV assembly platforms.
Theory-Guided Machine Learning in Materials Science
TLDR
This work outlines potential pitfalls involved in using machine learning without robust protocols and shows how proceeding without the guidance of domain knowledge can lead to both quantitatively and qualitatively incorrect predictive models.
Image Super-Resolution Via Sparse Representation
TLDR
This paper presents a new approach to single-image superresolution, based upon sparse signal representation, which generates high-resolution images that are competitive or even superior in quality to images produced by other similar SR methods.