# Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations

@article{Raissi2018NumericalGP, title={Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations}, author={Maziar Raissi and Paris Perdikaris and George Em Karniadakis}, journal={SIAM J. Sci. Comput.}, year={2018}, volume={40} }

We introduce the concept of numerical Gaussian processes, which we define as Gaussian processes with covariance functions resulting from temporal discretization of time-dependent partial differential equations. Numerical Gaussian processes, by construction, are designed to deal with cases where (a) all we observe are noisy data on black-box initial conditions, and (b) we are interested in quantifying the uncertainty associated with such noisy data in our solutions to time-dependent partial…
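The construction can be illustrated on a 1-D heat equation u_t = ν·u_xx. A GP prior with an RBF kernel is placed on the solution at the previous time step, and the time discretization is pushed through the prior, so the cross-covariance between time steps involves kernel derivatives. The sketch below is illustrative only (the paper works with backward-Euler and Runge–Kutta schemes; here a single forward-Euler step is used for simplicity, and all parameter values are assumptions):

```python
import numpy as np

# Toy numerical-GP step for the heat equation u_t = nu * u_xx.
# One forward-Euler step u^n = u^{n-1} + nu*dt*u_xx^{n-1} is pushed
# through an RBF-kernel GP prior on u^{n-1}, so Cov[u^n, u^{n-1}]
# involves closed-form kernel derivatives.
nu, dt, ell = 1.0, 0.01, 0.2

def k(x, xp):                        # RBF kernel k(x, x')
    r = x[:, None] - xp[None, :]
    return np.exp(-r**2 / (2 * ell**2))

def d2_k(x, xp):                     # d^2/dx^2 k(x, x'), closed form
    r = x[:, None] - xp[None, :]
    return (r**2 / ell**4 - 1 / ell**2) * np.exp(-r**2 / (2 * ell**2))

x = np.linspace(0, 1, 21)            # data on the initial condition
y = np.sin(np.pi * x)

xs = np.linspace(0.2, 0.8, 13)       # interior prediction points at t = dt
K = k(x, x) + 1e-8 * np.eye(len(x))           # Cov[u^{n-1}, u^{n-1}]
K_cross = k(xs, x) + nu * dt * d2_k(xs, x)    # Cov[u^n, u^{n-1}]
u_next = K_cross @ np.linalg.solve(K, y)      # posterior mean of u^n
```

For this initial condition the exact one-step Euler solution is (1 − ν·dt·π²)·sin(πx), which the posterior mean recovers closely in the interior.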


## 169 Citations

Machine learning of linear differential equations using Gaussian processes

- Mathematics, Computer Science · J. Comput. Phys.
- 2017

Bayesian Numerical Methods for Nonlinear Partial Differential Equations

- Mathematics · Stat. Comput.
- 2021

Proof-of-concept experimental results demonstrate that meaningful probabilistic uncertainty quantification for the unknown solution of the PDE can be performed, while controlling the number of times the right-hand side, initial, and boundary conditions are evaluated.

Numerical Gaussian process Kalman filtering

- Computer Science · IFAC-PapersOnLine
- 2020

Physics-Information-Aided Kriging: Constructing Covariance Functions using Stochastic Simulation Models

- Computer Science
- 2018

This work proposes a new Gaussian process regression (GPR) method, physics-information-aided Kriging (PhIK), and proves that physical constraints in the form of a deterministic linear operator are preserved in the resulting prediction.
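The PhIK idea of building the GP prior from simulation output can be sketched in a few lines: the prior mean and covariance are estimated empirically from an ensemble of stochastic-simulation realizations, and the result is conditioned on sparse observations. The setup below is a hypothetical illustration (the random-coefficient "simulations" and all names are assumptions, not the paper's code):

```python
import numpy as np

# PhIK-style sketch: empirical mean/covariance from an ensemble of
# stochastic-simulation realizations serve as the GP prior, then we
# condition on sparse measurements. Ensemble model is illustrative.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)

# Hypothetical simulation ensemble: random-coefficient sine fields.
ens = np.array([rng.normal(1.0, 0.3) * np.sin(np.pi * x)
                + rng.normal(0.0, 0.2) * np.sin(2 * np.pi * x)
                for _ in range(200)])
mu = ens.mean(axis=0)                    # empirical prior mean
C = np.cov(ens, rowvar=False)            # empirical prior covariance

obs_idx = [5, 25, 45]                    # sparse observation locations
y = 1.2 * np.sin(np.pi * x[obs_idx])     # measurements of one realization

Coo = C[np.ix_(obs_idx, obs_idx)] + 1e-6 * np.eye(len(obs_idx))
Cso = C[:, obs_idx]
post = mu + Cso @ np.linalg.solve(Coo, y - mu[obs_idx])  # GP update
```

Because the covariance comes from the simulator, the posterior inherits the simulator's structure rather than a hand-picked kernel's.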

Data-driven discovery of partial differential equation models with latent variables

- Mathematics · Physical Review E
- 2019

It is shown that local polynomial interpolation combined with sparse regression can handle data on spatiotemporal grids that are representative of typical experimental measurement techniques such as particle image velocimetry, but the reconstructed model is found to be sensitive to measurement noise, a sensitivity traced to the presence of high-order spatial and/or temporal derivatives.

Coarse-grained and Emergent Distributed Parameter Systems from Data

- Mathematics, Computer Science · 2021 American Control Conference (ACC)
- 2021

This work explores the derivation of distributed parameter system evolution laws (and in particular, partial differential operators and associated partial differential equations, PDEs) from spatiotemporal data through the use of manifold learning techniques in conjunction with neural network learning algorithms.

Physics-Informed Kriging: A Physics-Informed Gaussian Process Regression Method for Data-Model Convergence

- Computer Science · arXiv
- 2018

The efficiency and accuracy of PhIK are demonstrated by reconstructing a partially known modified Branin function and learning a conservative tracer distribution from sparse concentration measurements, and an active learning algorithm is presented that guides the selection of additional observation locations.

Machine Learning of Space-Fractional Differential Equations

- Computer Science, Mathematics · SIAM J. Sci. Comput.
- 2019

This work provides a user-friendly and feasible way to perform fractional derivatives of kernels, via a unified set of d-dimensional Fourier integral formulas amenable to generalized Gauss-Laguerre quadrature.

Generative Stochastic Modeling of Strongly Nonlinear Flows with Non-Gaussian Statistics

- Computer Science · SIAM/ASA J. Uncertain. Quantification
- 2022

A data-driven framework is proposed to model stationary chaotic dynamical systems through nonlinear transformations and a set of decoupled stochastic differential equations (SDEs), suggesting that this class of models provides an efficient hypothesis space for learning strongly nonlinear flows from small amounts of data.

Stochastic Processes Under Linear Differential Constraints : Application to Gaussian Process Regression for the 3 Dimensional Free Space Wave Equation

- Mathematics
- 2021

Let P be a linear differential operator over D ⊂ ℝ and U = (U_x)_{x∈D} a second-order stochastic process. In the first part of this article, we prove a new necessary and sufficient condition for all the…

## References

Showing 1–10 of 50 references

Machine learning of linear differential equations using Gaussian processes

- Mathematics, Computer Science · J. Comput. Phys.
- 2017

Inferring solutions of differential equations using noisy multi-fidelity data

- Computer Science · J. Comput. Phys.
- 2017

Statistical analysis of differential equations: introducing probability measures on numerical solutions

- Mathematics · Stat. Comput.
- 2017

It is shown that a wide variety of existing solvers can be randomised, inducing a probability measure over the solutions of ordinary and partial differential equation models, and the formal means to incorporate this uncertainty in a statistical model and its subsequent analysis are provided.

Kalman filtering and smoothing solutions to temporal Gaussian process regression models

- Computer Science · 2010 IEEE International Workshop on Machine Learning for Signal Processing
- 2010

This paper shows how temporal Gaussian process regression models in machine learning can be reformulated as linear-Gaussian state space models, which can be solved exactly with classical Kalman filtering theory, and produces an efficient non-parametric learning algorithm.
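The state-space equivalence described above is exact for Markovian kernels. As a minimal sketch (parameter values are assumptions), an Ornstein–Uhlenbeck GP, i.e. the Matérn-1/2 kernel σ²·exp(−|t−t′|/ℓ), can be written as a scalar linear-Gaussian state-space model and filtered; the Kalman-filtered mean at the final time point matches the batch GP posterior mean there:

```python
import numpy as np

# Matern-1/2 (OU) GP regression vs. its exact Kalman-filter form.
sig2, ell, R = 1.0, 0.5, 0.1          # kernel variance, lengthscale, noise
t = np.linspace(0, 4, 25)
y = np.sin(1.5 * t)

# Batch GP regression with the OU kernel, O(N^3).
K = sig2 * np.exp(-np.abs(t[:, None] - t[None, :]) / ell)
alpha = np.linalg.solve(K + R * np.eye(len(t)), y)
gp_mean_last = K[-1] @ alpha          # posterior mean at the final time

# Equivalent Kalman filter on the OU state-space model, O(N).
m, P = 0.0, sig2                      # stationary prior
for k in range(len(t)):
    if k > 0:                         # predict across the gap t[k]-t[k-1]
        A = np.exp(-(t[k] - t[k-1]) / ell)
        m, P = A * m, A * A * P + sig2 * (1 - A * A)
    gain = P / (P + R)                # update with observation y[k]
    m = m + gain * (y[k] - m)
    P = (1 - gain) * P

print(abs(m - gp_mean_last))          # difference ~ machine precision
```

At the last time point filtering and smoothing coincide, which is why the two means agree exactly; interior points would additionally need a Rauch–Tung–Striebel smoothing pass.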

Sparse Gaussian Processes using Pseudo-inputs

- Computer Science · NIPS
- 2005

It is shown that this new Gaussian process (GP) regression model can match full GP performance with small M, i.e. very sparse solutions, and it significantly outperforms other approaches in this regime.

Probabilistic ODE Solvers with Runge-Kutta Means

- Computer Science · NIPS
- 2014

A family of probabilistic numerical methods that instead return a Gauss-Markov process defining a probability distribution over the ODE solution, such that posterior means match the outputs of the Runge-Kutta family exactly, thus inheriting their proven good properties.

Parametric Gaussian process regression for big data

- Computer Science · Computational Mechanics
- 2019

This work introduces the concept of parametric Gaussian processes (PGP), which is built upon the seemingly self-contradictory idea of making Gaussian processes parametric. The resulting framework is…

Probabilistic numerics and uncertainty in computations

- Computer Science · Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
- 2015

It is shown that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance.

Gaussian Processes for Machine Learning

- Computer Science · Adaptive Computation and Machine Learning
- 2009

The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.

Brittleness of Bayesian Inference Under Finite Information in a Continuous World

- Mathematics, Computer Science
- 2013

It is observed that learning and robustness are antagonistic properties, and optimal lower and upper bounds on posterior values obtained from Bayesian models that exactly capture an arbitrarily large number of finite-dimensional marginals of the data-generating distribution are derived.