Corpus ID: 237635249

Discovering PDEs from Multiple Experiments

Georges Tod, Gert-Jan Both, Remy Kusters
Automated model discovery of partial differential equations (PDEs) usually considers a single experiment or dataset to infer the underlying governing equations. In practice, experiments have inherent natural variability in parameters, initial conditions, and boundary conditions that cannot simply be averaged out. We introduce a randomised adaptive group Lasso sparsity estimator to promote grouped sparsity and implement it in a deep learning based PDE discovery framework. It allows one to create a learning…
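The grouped sparsity the abstract refers to can be illustrated with the proximal (group soft-thresholding) operator of the ordinary group Lasso. This is only a minimal numpy sketch, not the paper's randomised adaptive estimator: the adaptive weighting and randomisation are omitted, and the coefficient matrix is invented for the example.

```python
import numpy as np

def group_soft_threshold(W, lam):
    """Proximal operator of the group-Lasso penalty lam * sum_g ||W[g]||.

    W is (n_terms, n_experiments): row g collects the coefficient of
    library term g across all experiments, so an entire row is shrunk
    to zero together -- terms are kept or discarded jointly.
    """
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
    return W * scale

# Toy coefficient matrix: 3 candidate PDE terms across 4 experiments.
W = np.array([[1.00, 1.10, 0.90, 1.05],   # genuine term, varies per experiment
              [0.05, -0.04, 0.03, 0.02],  # spurious term, small everywhere
              [0.80, 0.70, 0.90, 0.85]])
W_sparse = group_soft_threshold(W, lam=0.2)
print(W_sparse[1])  # the spurious row is zeroed as a group
```

Because the penalty acts on whole rows, a term is eliminated across all experiments at once rather than in some experiments and not others.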

Figures from this paper: not included in this extract.

References

Sparsistent Model Discovery
It is shown that the adaptive Lasso is more likely to satisfy the irrepresentability condition (IRC) than the Lasso, and it is proposed to integrate it within a deep learning model discovery framework with stability selection and error control.
Data-Driven Identification of Parametric Partial Differential Equations
It is shown that group sequentially thresholded ridge regression outperforms group LASSO in identifying the fewest terms in the PDE along with their parametric dependency, and the method is demonstrated on four canonical models with and without the introduction of noise.
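Plain sequentially thresholded ridge regression (without the grouping used in that paper) alternates a ridge solve with hard-thresholding of small coefficients. The sketch below is an illustrative implementation on an invented toy library, not the paper's code; `tol` and `lam` are example values.

```python
import numpy as np

def stridge(Theta, ut, lam=1e-5, tol=0.1, iters=10):
    """Sequentially thresholded ridge regression: solve a ridge problem,
    zero out coefficients below tol, and re-solve on the remaining terms."""
    n = Theta.shape[1]
    w = np.linalg.solve(Theta.T @ Theta + lam * np.eye(n), Theta.T @ ut)
    for _ in range(iters):
        small = np.abs(w) < tol
        w[small] = 0.0
        big = ~small
        if not big.any():
            break
        A = Theta[:, big]  # ridge re-fit restricted to the surviving terms
        w[big] = np.linalg.solve(A.T @ A + lam * np.eye(big.sum()), A.T @ ut)
    return w

# Toy example: u_t depends on only two of four candidate library terms.
rng = np.random.default_rng(1)
Theta = rng.normal(size=(200, 4))
ut = Theta @ np.array([1.5, 0.0, -0.8, 0.0])
w = stridge(Theta, ut)
```

The re-fit step is what distinguishes thresholded ridge from a one-shot penalty: surviving coefficients are re-estimated without shrinkage bias from the discarded terms.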
Learning partial differential equations via data discovery and sparse optimization
  • H. Schaeffer
  • Mathematics, Medicine
  • Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
  • 2017
This work develops a learning algorithm that identifies the terms in the underlying partial differential equation and approximates their coefficients using only data; it uses sparse optimization to perform feature selection and parameter estimation.
Data-driven discovery of partial differential equations
The sparse regression method is computationally efficient, robust, and demonstrated to work on a variety of canonical problems spanning a number of scientific domains including Navier-Stokes, the quantum harmonic oscillator, and the diffusion equation.
Stability selection enables robust learning of partial differential equations from limited noisy data
This work proposes a stability-based model selection scheme to determine the level of regularization required for reproducible recovery of the underlying PDE. In particular, the combination of stability selection with the iterative hard-thresholding algorithm from compressed sensing provides a fast, parameter-free, and robust computational framework for PDE inference that outperforms previous algorithmic approaches in recovery accuracy, the amount of data required, and robustness to noise.
Deep learning of physical laws from scarce data
This work introduces a novel physics-informed deep learning framework to discover governing partial differential equations (PDEs) from scarce and noisy data for nonlinear spatiotemporal systems and shows the potential for closed-form model discovery in practical applications where large and accurate datasets are intractable to capture.
DeepMoD: Deep learning for model discovery in noisy data
We introduce DeepMoD, a Deep learning based Model Discovery algorithm. DeepMoD discovers the partial differential equation underlying a spatio-temporal data set using sparse regression on a library…
Discovering governing equations from data by sparse identification of nonlinear dynamical systems
This work develops a novel framework to discover governing equations underlying a dynamical system simply from data measurements, leveraging advances in sparsity techniques and machine learning and using sparse regression to determine the fewest terms in the dynamic governing equations required to accurately represent the data.
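The sparse-regression step in such frameworks operates on a library of candidate terms built from the measured state. A minimal sketch of a polynomial library (the variable names and degree are illustrative, not taken from the paper) is:

```python
import numpy as np

def poly_library(X, names=("x", "y")):
    """Candidate library Theta(X): constant, linear, and quadratic monomials
    of the state variables, one column per candidate term."""
    m, n = X.shape
    cols, labels = [np.ones(m)], ["1"]
    for i in range(n):                      # linear terms
        cols.append(X[:, i])
        labels.append(names[i])
    for i in range(n):                      # quadratic terms (i <= j)
        for j in range(i, n):
            cols.append(X[:, i] * X[:, j])
            labels.append(names[i] + names[j])
    return np.column_stack(cols), labels

X = np.random.default_rng(0).normal(size=(100, 2))
Theta, labels = poly_library(X)
print(labels)  # ['1', 'x', 'y', 'xx', 'xy', 'yy']
```

Regressing the measured time derivatives onto such a library with a sparsity-promoting solver then selects the few active terms of the governing equations.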
Discovery of Physics From Data: Universal Laws and Discrepancies
It is shown that measurement noise and complex secondary physical mechanisms, like unsteady fluid drag forces, can obscure the underlying law of gravitation, leading to an erroneous model.
Implicit Neural Representations with Periodic Activation Functions
This work proposes to leverage periodic activation functions for implicit neural representations and demonstrates that these networks, dubbed sinusoidal representation networks or Sirens, are ideally suited for representing complex natural signals and their derivatives.
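A single sine-activated layer with the initialisation proposed in that paper can be sketched in plain numpy. The layer sizes and the default frequency `omega0=30` follow the SIREN paper; everything else here (coordinate grid, number of layers) is an invented example.

```python
import numpy as np

rng = np.random.default_rng(0)

def siren_layer(x, out_dim, omega0=30.0, first=False):
    """One sine-activated layer with SIREN initialisation: first-layer
    weights uniform in [-1/fan_in, 1/fan_in], hidden-layer weights
    uniform in [-sqrt(6/fan_in)/omega0, sqrt(6/fan_in)/omega0]."""
    in_dim = x.shape[-1]
    bound = 1.0 / in_dim if first else np.sqrt(6.0 / in_dim) / omega0
    W = rng.uniform(-bound, bound, size=(in_dim, out_dim))
    b = rng.uniform(-bound, bound, size=out_dim)
    return np.sin(omega0 * (x @ W + b))

# Map an 8x8 grid of 2-D coordinates through two SIREN layers.
coords = np.stack(np.meshgrid(np.linspace(-1, 1, 8),
                              np.linspace(-1, 1, 8)), -1).reshape(-1, 2)
h = siren_layer(coords, 16, first=True)
out = siren_layer(h, 1)
```

Because the activation is a sine, all intermediate outputs stay in [-1, 1] and the network's derivatives are themselves Sirens, which is what makes these representations attractive when derivatives of the signal are needed, as in PDE discovery.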