Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
Abstract: We introduce physics-informed neural networks – neural networks that are trained to solve supervised learning tasks while respecting any given laws of physics described by general nonlinear …
Physics Informed Deep Learning (Part I): Data-driven Solutions of Nonlinear Partial Differential Equations
TLDR: This two-part treatise introduces physics-informed neural networks – neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations. It demonstrates how these networks can be used to infer solutions to partial differential equations and to obtain physics-informed surrogate models that are fully differentiable with respect to all input coordinates and free parameters.
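To make the idea above concrete, here is a hypothetical minimal sketch of the composite "data + physics" loss these papers describe, for the toy ODE du/dt = -u with u(0) = 1. A real PINN evaluates a neural network and differentiates it with automatic differentiation; in this sketch a plain candidate function and a central finite difference stand in for both, so only the loss structure is the paper's.

```python
import numpy as np

def pinn_loss(u, t, u0=1.0, eps=1e-5):
    """Composite PINN-style loss: data misfit at the observed point
    plus the mean squared residual of the ODE du/dt + u = 0."""
    du_dt = (u(t + eps) - u(t - eps)) / (2 * eps)  # stand-in for autodiff
    residual = du_dt + u(t)                        # ODE residual
    physics_loss = np.mean(residual ** 2)          # enforced at collocation points t
    data_loss = (u(0.0) - u0) ** 2                 # enforced at the data point
    return data_loss + physics_loss

ts = np.linspace(0.0, 1.0, 50)                     # collocation points
exact = lambda s: np.exp(-s)                       # satisfies the ODE: loss ~ 0
wrong = lambda s: 1.0 - s                          # violates it: loss is large
loss_exact, loss_wrong = pinn_loss(exact, ts), pinn_loss(wrong, ts)
```

Training a PINN amounts to minimizing this loss over network parameters, so functions that satisfy both the data and the differential equation are driven toward zero loss.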
Physics Informed Deep Learning (Part II): Data-driven Discovery of Nonlinear Partial Differential Equations
We introduce physics informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial …
Physics-Constrained Deep Learning for High-dimensional Surrogate Modeling and Uncertainty Quantification without Labeled Data
TLDR: This paper provides a methodology that incorporates the governing equations of the physical model into the loss/likelihood function, formulating the matching of the model predictive density to the reference conditional density as a minimization of the reverse Kullback-Leibler (KL) divergence.
Nonlinear information fusion algorithms for data-efficient multi-fidelity modelling
TLDR: Puts forth a probabilistic framework based on Gaussian process regression and nonlinear autoregressive schemes that is capable of learning complex nonlinear, space-dependent cross-correlations between models of variable fidelity, and can effectively safeguard against low-fidelity models that provide wrong trends.
Machine learning of linear differential equations using Gaussian processes
TLDR: Gaussian process priors are modified according to the particular form of such linear operators and are employed to infer parameters of the linear equations from scarce and possibly noisy observations, leading to model discovery from just a handful of noisy measurements.
Understanding and mitigating gradient pathologies in physics-informed neural networks
TLDR: This work reviews recent advances in scientific machine learning, focusing on the effectiveness of physics-informed neural networks in predicting outcomes of physical systems and discovering hidden physics from noisy data, and proposes a novel neural network architecture that is more resilient to gradient pathologies.
Multistep Neural Networks for Data-driven Discovery of Nonlinear Dynamical Systems
The process of transforming observed data into predictive mathematical models of the physical world has always been paramount in science and engineering. Although data is currently being collected at …
Numerical Gaussian Processes for Time-Dependent and Nonlinear Partial Differential Equations
TLDR: The method circumvents the need for spatial discretization of the differential operators by proper placement of Gaussian process priors, and is an attempt to construct structured, data-efficient learning machines that are explicitly informed by the underlying physics that possibly generated the observed data.
Adversarial Uncertainty Quantification in Physics-Informed Neural Networks
TLDR: A deep learning framework for quantifying and propagating uncertainty in systems governed by non-linear differential equations: physics-informed neural networks with latent variable models construct probabilistic representations of the system states, and an adversarial inference procedure is put forth for training them on data.