Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations

@article{Raissi2018NumericalGP,
  title={Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations},
  author={Maziar Raissi and Paris Perdikaris and George Em Karniadakis},
  journal={SIAM J. Sci. Comput.},
  year={2018},
  volume={40}
}
We introduce the concept of numerical Gaussian processes, which we define as Gaussian processes with covariance functions resulting from temporal discretization of time-dependent partial differential equations. Numerical Gaussian processes, by construction, are designed to deal with cases where (a) all we observe are noisy data on black-box initial conditions, and (b) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent partial… 
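To make the construction concrete, below is a minimal sketch for the 1D heat equation u_t = u_xx under backward Euler, where the time-stepping operator is represented by a finite-difference matrix A = I - dt*D2 (the paper instead applies the discretized operator to the kernel analytically); the grid, kernel, and data here are illustrative assumptions, not the paper's setup.

    import numpy as np

    # GP prior on the solution u^n at the current time level.
    x = np.linspace(0.0, 1.0, 50)
    ell = 0.1
    K = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ell**2)

    # Backward Euler for u_t = u_xx: u^{n-1} = (I - dt*D2) u^n, with D2 a
    # finite-difference Laplacian (boundary conditions omitted for brevity).
    dt = 1e-3
    dx = x[1] - x[0]
    D2 = (np.diag(np.ones(49), 1) - 2.0*np.eye(50) + np.diag(np.ones(49), -1)) / dx**2
    A = np.eye(50) - dt * D2

    # The discretization makes the two time levels jointly Gaussian:
    K_nn = K                 # cov(u^n, u^n)
    K_pn = A @ K             # cov(u^{n-1}, u^n)
    K_pp = A @ K @ A.T       # cov(u^{n-1}, u^{n-1})

    # Condition u^n on noisy observations y of the previous level u^{n-1}.
    y = np.sin(np.pi * x) + 1e-2 * np.random.randn(50)
    R = 1e-4 * np.eye(50)
    mean_n = K_pn.T @ np.linalg.solve(K_pp + R, y)
    cov_n = K_nn - K_pn.T @ np.linalg.solve(K_pp + R, K_pn)

Because the map between time levels is linear, conditioning on noisy data at the previous level propagates both the solution and its uncertainty forward in time, which is the mechanism points (a) and (b) above refer to.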
Machine learning of linear differential equations using Gaussian processes
Bayesian Numerical Methods for Nonlinear Partial Differential Equations
TLDR
Proof-of-concept experimental results demonstrate that meaningful probabilistic uncertainty quantification for the unknown solution of the PDE can be performed, while controlling the number of times the right-hand side, initial, and boundary conditions are evaluated.
Numerical Gaussian process Kalman filtering
Physics-Information-Aided Kriging: Constructing Covariance Functions using Stochastic Simulation Models
TLDR
This work proposes a new Gaussian process regression (GPR) method: physics information aided Kriging (PhIK), and proves that the physical constraints in the form of a deterministic linear operator are guaranteed in the resulting prediction.
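A minimal sketch of the PhIK idea, with a toy simulator standing in for the stochastic simulation model (all names and values are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 100)

    # Ensemble of M realizations from a (toy) stochastic simulation model.
    M = 200
    ensemble = rng.normal(1.0, 0.3, (M, 1)) * np.sin(np.pi * x)   # (M, 100)

    # Empirical ensemble mean and covariance replace the parametric GP kernel.
    mu = ensemble.mean(axis=0)
    C = np.cov(ensemble, rowvar=False)

    # Kriging update at a few noisy observation locations.
    obs = np.array([10, 50, 90])
    y = np.sin(np.pi * x[obs]) + 1e-2 * rng.standard_normal(obs.size)
    Coo = C[np.ix_(obs, obs)] + 1e-6 * np.eye(obs.size)
    Cxo = C[:, obs]
    post_mean = mu + Cxo @ np.linalg.solve(Coo, y - mu[obs])
    post_cov = C - Cxo @ np.linalg.solve(Coo, Cxo.T)

The guarantee mentioned above follows because the Kriging posterior mean is an affine combination of ensemble statistics, so a deterministic linear constraint satisfied by every realization is inherited by the prediction.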
Data-driven discovery of partial differential equation models with latent variables
TLDR
It is shown that local polynomial interpolation combined with sparse regression can handle data on spatiotemporal grids that are representative of typical experimental measurement techniques such as particle image velocimetry, but it is found that the reconstructed model is sensitive to measurement noise and trace this sensitivity to the presence of high-order spatial and/or temporal derivatives.
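The sparse-regression step can be sketched as sequentially thresholded least squares, a common estimator in this literature (the paper's exact choice may differ); the library Theta of candidate PDE terms is assumed to be built upstream from locally-smoothed derivative estimates:

    import numpy as np

    def stlsq(Theta, dudt, lam=0.1, iters=10):
        # Theta: (n_samples, n_terms) library of candidate PDE terms,
        # dudt:  (n_samples,) time-derivative estimates from the data.
        xi = np.linalg.lstsq(Theta, dudt, rcond=None)[0]
        for _ in range(iters):
            small = np.abs(xi) < lam       # prune weak coefficients
            xi[small] = 0.0
            big = ~small
            if big.any():                  # refit on the surviving terms
                xi[big] = np.linalg.lstsq(Theta[:, big], dudt, rcond=None)[0]
        return xi

The noise sensitivity noted in the summary enters through the high-order derivative columns of Theta, which amplify measurement error before the regression ever runs.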
Coarse-grained and Emergent Distributed Parameter Systems from Data
TLDR
This work explores the derivation of distributed parameter system evolution laws (and in particular, partial differential operators and associated partial differential equations, PDEs) from spatiotemporal data through the use of manifold learning techniques in conjunction with neural network learning algorithms.
Physics-Informed Kriging: A Physics-Informed Gaussian Process Regression Method for Data-Model Convergence
TLDR
The efficiency and accuracy of PhIK are demonstrated for reconstructing a partially known modified Branin function and learning a conservative tracer distribution from sparse concentration measurements and an active learning algorithm that guides the selection of additional observation locations.
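A minimal sketch of a max-posterior-variance acquisition rule, reusing post_cov from the PhIK sketch above (the paper's exact criterion is an assumption here):

    import numpy as np

    def next_observation(post_cov, candidates):
        # Greedy rule: observe where the posterior variance is largest.
        var = np.diag(post_cov)[candidates]
        return candidates[int(np.argmax(var))]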
Machine Learning of Space-Fractional Differential Equations
TLDR
This work provides a user-friendly and feasible way to perform fractional derivatives of kernels, via a unified set of d-dimensional Fourier integral formulas amenable to generalized Gauss-Laguerre quadrature.
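NumPy exposes the alpha = 0 case of this quadrature directly, which is enough for a quick sanity check on a known integral (generalized weights for alpha != 0 are available via scipy.special.roots_genlaguerre):

    import numpy as np

    # Gauss-Laguerre: int_0^inf e^{-x} f(x) dx ~= sum_i w_i f(x_i)
    nodes, weights = np.polynomial.laguerre.laggauss(40)

    # Known value: int_0^inf e^{-x} cos(x) dx = 1/2
    print(np.sum(weights * np.cos(nodes)))  # ~0.5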
Generative Stochastic Modeling of Strongly Nonlinear Flows with Non-Gaussian Statistics
TLDR
A data-driven framework to model stationary chaotic dynamical systems through nonlinear transformations and a set of decoupled stochastic differential equations (SDEs) is proposed, suggesting that this class of models provide an efficient hypothesis space for learning strongly nonlinear flows from small amounts of data.
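The latent, decoupled SDE component of such a model can be simulated with Euler-Maruyama, shown here for independent Ornstein-Uhlenbeck processes (the learned nonlinear transformation that maps these Gaussian latents to non-Gaussian observables is omitted and would be fit to data):

    import numpy as np

    rng = np.random.default_rng(1)
    d, steps, dt = 3, 10_000, 1e-2
    theta, sigma = 1.0, 0.5               # OU drift and diffusion

    z = np.zeros((steps, d))
    for n in range(steps - 1):
        dW = np.sqrt(dt) * rng.standard_normal(d)
        z[n + 1] = z[n] - theta * z[n] * dt + sigma * dW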
Stochastic Processes Under Linear Differential Constraints: Application to Gaussian Process Regression for the 3-Dimensional Free-Space Wave Equation
Let P be a linear differential operator over D ⊂ R and U = (U_x)_{x∈D} a second-order stochastic process. In the first part of this article, we prove a new necessary and sufficient condition for all the…
…

References

Showing 1-10 of 50 references
Machine learning of linear differential equations using Gaussian processes
Inferring solutions of differential equations using noisy multi-fidelity data
Statistical analysis of differential equations: introducing probability measures on numerical solutions
TLDR
It is shown that a wide variety of existing solvers can be randomised, inducing a probability measure over the solutions of ordinary and partial differential equation models, and the formal means to incorporate this uncertainty in a statistical model and its subsequent analysis are provided.
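A sketch of the randomisation idea for forward Euler: each step is perturbed by Gaussian noise scaled like the local truncation error, so repeated runs induce a distribution over numerical solutions (the h^{p+1/2} scaling for a method of order p is used here; all other details are illustrative):

    import numpy as np

    def randomised_euler(f, y0, t0, t1, h, scale=1.0, rng=None):
        rng = rng or np.random.default_rng()
        t = t0
        y = np.atleast_1d(np.asarray(y0, dtype=float))
        while t < t1:
            # Deterministic Euler step plus O(h^{3/2}) Gaussian perturbation.
            y = y + h * f(t, y) + scale * h**1.5 * rng.standard_normal(y.shape)
            t += h
        return y

    # Repeated runs give samples from the induced measure over solutions.
    samples = [randomised_euler(lambda t, y: -y, 1.0, 0.0, 1.0, 0.01)
               for _ in range(100)]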
Kalman filtering and smoothing solutions to temporal Gaussian process regression models
TLDR
This paper shows how temporal Gaussian process regression models in machine learning can be reformulated as linear-Gaussian state space models, which can be solved exactly with classical Kalman filtering theory, and produces an efficient non-parametric learning algorithm.
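A sketch of the reformulation for a Matern-3/2 kernel, whose exact state-space form is a 2-dimensional linear SDE; filtering then costs O(n) rather than the O(n^3) of direct GP regression (hyperparameter values below are illustrative):

    import numpy as np
    from scipy.linalg import expm

    # Matern-3/2 GP as a 2D linear-Gaussian state-space model.
    ell, sig2, noise = 0.5, 1.0, 0.01
    lam = np.sqrt(3.0) / ell
    F = np.array([[0.0, 1.0], [-lam**2, -2.0*lam]])
    Pinf = np.diag([sig2, lam**2 * sig2])       # stationary covariance
    H = np.array([[1.0, 0.0]])

    t = np.sort(np.random.rand(200))
    y = np.sin(2.0*np.pi*t) + 0.1*np.random.randn(200)

    m, P, prev, means = np.zeros(2), Pinf.copy(), 0.0, []
    for tk, yk in zip(t, y):
        Ak = expm(F * (tk - prev))              # predict
        Qk = Pinf - Ak @ Pinf @ Ak.T
        m, P = Ak @ m, Ak @ P @ Ak.T + Qk
        S = (H @ P @ H.T)[0, 0] + noise         # update
        K = (P @ H.T)[:, 0] / S
        m = m + K * (yk - (H @ m)[0])
        P = P - np.outer(K, H @ P)
        means.append(m[0])
        prev = tk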
Sparse Gaussian Processes using Pseudo-inputs
TLDR
It is shown that this new Gaussian process (GP) regression model can match full GP performance with small M, i.e. very sparse solutions, and it significantly outperforms other approaches in this regime.
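A minimal FITC-style sketch with M = 15 pseudo-inputs summarizing N = 500 points; the pseudo-input locations are fixed on a grid here, whereas the paper optimizes them jointly with the hyperparameters by marginal likelihood:

    import numpy as np

    def rbf(a, b, ell=0.3):
        return np.exp(-0.5 * np.subtract.outer(a, b)**2 / ell**2)

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, 500)
    y = np.sin(2.0*np.pi*X) + 0.1*rng.standard_normal(500)
    Z = np.linspace(0.0, 1.0, 15)               # pseudo-inputs (fixed here)
    sn2 = 0.01                                  # noise variance

    Kuu = rbf(Z, Z) + 1e-8*np.eye(15)
    Kuf = rbf(Z, X)
    Qff = np.sum(Kuf * np.linalg.solve(Kuu, Kuf), axis=0)  # diag of Kfu Kuu^-1 Kuf
    Lam = 1.0 - Qff + sn2                       # FITC: diag(Kff - Qff) + noise

    Sig = np.linalg.inv(Kuu + (Kuf / Lam) @ Kuf.T)
    Xs = np.linspace(0.0, 1.0, 100)
    mean = rbf(Xs, Z) @ Sig @ (Kuf @ (y / Lam))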
Probabilistic ODE Solvers with Runge-Kutta Means
TLDR
A family of probabilistic numerical methods that instead return a Gauss-Markov process defining a probability distribution over the ODE solution, such that posterior means match the outputs of the Runge-Kutta family exactly, thus inheriting their proven good properties.
Parametric Gaussian process regression for big data
M. Raissi · Computational Mechanics · 2019
This work introduces the concept of parametric Gaussian processes (PGP), which is built upon the seemingly self-contradictory idea of making Gaussian processes parametric. The resulting framework is…
Probabilistic numerics and uncertainty in computations
TLDR
It is shown that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance.
Gaussian Processes for Machine Learning
TLDR
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
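The book's Algorithm 2.1 (GP regression via a Cholesky factorization) is short enough to quote as code; the RBF kernel and noise level below are illustrative choices:

    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    def rbf(a, b, ell=0.2):
        return np.exp(-0.5 * np.subtract.outer(a, b)**2 / ell**2)

    def gp_predict(X, y, Xs, sn2=0.01):
        K = rbf(X, X) + sn2 * np.eye(len(X))
        cl = cho_factor(K, lower=True)          # K = L L^T
        alpha = cho_solve(cl, y)                # alpha = K^{-1} y
        Ks = rbf(X, Xs)
        mean = Ks.T @ alpha                     # predictive mean
        cov = rbf(Xs, Xs) - Ks.T @ cho_solve(cl, Ks)  # predictive covariance
        return mean, cov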
Brittleness of Bayesian Inference Under Finite Information in a Continuous World
TLDR
It is observed that learning and robustness are antagonistic properties, and optimal lower and upper bounds on posterior values obtained from Bayesian models that exactly capture an arbitrarily large number of finite-dimensional marginals of the data-generating distribution are derived.
…