On the Convergence and Generalization of Physics Informed Neural Networks
@article{Shin2020OnTC,
  title   = {On the Convergence and Generalization of Physics Informed Neural Networks},
  author  = {Yeonjong Shin and J{\'e}r{\^o}me Darbon and George Em Karniadakis},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2004.01806}
}
Physics informed neural networks (PINNs) are deep learning based techniques for solving partial differential equations (PDEs). Guided by data and physical laws, PINNs find a neural network that approximates the solution to a system of PDEs. Such a neural network is obtained by minimizing a loss function into which prior knowledge of the PDEs and any available data is encoded. Despite their remarkable empirical success, there is little theoretical justification for PINNs. In this paper, we establish a…
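The composite loss described in the abstract can be sketched for a toy problem. This is a minimal illustration, not the paper's method: it uses central finite differences in place of the automatic differentiation a real PINN would apply to a neural network, and the names (`pinn_loss`, `u_exact`) are hypothetical.

```python
import numpy as np

# Sketch of a PINN-style composite loss for the 1-D Poisson problem
# u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0. Derivatives are
# approximated here by central finite differences; actual PINNs use
# automatic differentiation on a neural network.

def pinn_loss(u, f, x, h=1e-3):
    """u: candidate solution, f: source term, x: interior collocation points."""
    # PDE residual at the interior collocation points
    u_xx = (u(x + h) - 2.0 * u(x) + u(x - h)) / h**2
    residual_loss = np.mean((u_xx - f(x)) ** 2)
    # Boundary conditions play the role of the "data" part of the objective
    boundary_loss = u(0.0) ** 2 + u(1.0) ** 2
    return residual_loss + boundary_loss

# Exact solution of u'' = -pi^2 sin(pi x) is u(x) = sin(pi x),
# so the loss should be near zero for it.
f = lambda x: -np.pi**2 * np.sin(np.pi * x)
u_exact = lambda x: np.sin(np.pi * x)
x = np.linspace(0.1, 0.9, 9)
```

Minimizing this loss over a family of candidate functions (in practice, over network weights) is what "finding a neural network that approximates the solution" amounts to.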
60 Citations
Sobolev Training for Physics Informed Neural Networks
- Computer Science
- 2021
Inspired by the recent studies that incorporate derivative information for the training of neural networks, a loss function is developed that guides a neural network to reduce the error in the corresponding Sobolev space, making the training substantially more efficient.
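The Sobolev loss idea above can be sketched as an H¹-type objective that penalizes mismatch in both values and first derivatives. This is an assumed, simplified form; `dpred_dx` and `dtarget_dx` are hypothetical stand-ins for derivatives that automatic differentiation and known target data would supply in practice.

```python
import numpy as np

# H^1-type (Sobolev) training loss: standard L^2 term plus a term
# penalizing the error in the first derivative.

def sobolev_loss(pred, target, dpred_dx, dtarget_dx, weight=1.0):
    l2_term = np.mean((pred - target) ** 2)          # function-value error
    h1_term = np.mean((dpred_dx - dtarget_dx) ** 2)  # derivative (seminorm) error
    return l2_term + weight * h1_term

# Example: a prediction can match values exactly yet still pay a
# penalty if its derivative is wrong.
xs = np.linspace(0.0, 1.0, 11)
vals, dvals = np.sin(xs), np.cos(xs)
```

The derivative term is what distinguishes Sobolev training from plain L² fitting: gradients of the approximation are pushed toward the target's gradients, not just its values.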
Error analysis for physics-informed neural networks (PINNs) approximating Kolmogorov PDEs
- Computer Science, Mathematics · Advances in Computational Mathematics
- 2022
It is proved that the size of the PINNs and the number of training samples only grow polynomially with the underlying dimension, enabling PINNs to overcome the curse of dimensionality in this context.
Sobolev Training for the Neural Network Solutions of PDEs
- Computer Science · ArXiv
- 2021
This paper develops a loss function that guides a neural network to reduce the error in the corresponding Sobolev space, and provides empirical evidence that shows that the proposed loss function, together with the iterative sampling techniques, performs better in solving high dimensional PDEs.
Learning Physics-Informed Neural Networks without Stacked Back-propagation
- Computer Science · ArXiv
- 2022
A novel approach is developed that can significantly accelerate the training of Physics-Informed Neural Networks by parameterizing the PDE solution with a Gaussian-smoothed model and showing that, via Stein's identity, the second-order derivatives can be calculated efficiently without back-propagation.
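The Stein's-identity trick mentioned above can be illustrated with a Monte Carlo estimator. For a Gaussian-smoothed function f_σ(x) = E[f(x + σε)] with ε ~ N(0,1), one has f_σ''(x) = E[f(x + σε)(ε² − 1)] / σ², which needs only forward evaluations of f. This sketch and its names are assumptions, not the paper's implementation.

```python
import numpy as np

# Monte Carlo estimate of the second derivative of a Gaussian-smoothed
# 1-D function via Stein's identity, with no back-propagation:
#   f_sigma''(x) = E[ f(x + sigma*eps) * (eps^2 - 1) ] / sigma^2

def stein_second_derivative(f, x, sigma=0.1, n_samples=400_000, seed=0):
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n_samples)
    # Subtracting f(x) leaves the expectation unchanged (E[eps^2 - 1] = 0)
    # but reduces the Monte Carlo variance.
    return np.mean((f(x + sigma * eps) - f(x)) * (eps**2 - 1.0)) / sigma**2

# For f(x) = x^2 the second derivative is exactly 2, and Gaussian
# smoothing preserves it, so the estimate should be close to 2.
est = stein_second_derivative(lambda x: x**2, x=0.5)
```

Because only function values appear in the estimator, the second-order PDE terms in a PINN loss can be formed without stacking back-propagation passes.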
On the Role of Fixed Points of Dynamical Systems in Training Physics-Informed Neural Networks
- Physics · Trans. Mach. Learn. Res.
- 2023
This paper empirically studies commonly observed training difficulties of Physics-Informed Neural Networks (PINNs) on dynamical systems. Our results indicate that fixed points which are inherent to…
Physics Informed Convex Artificial Neural Networks (PICANNs) for Optimal Transport based Density Estimation
- Computer Science · ArXiv
- 2021
This framework is based on Brenier's theorem, which reduces the continuous optimal mass transport (OMT) problem to solving a nonlinear PDE of Monge-Ampère type whose solution is a convex function.
Self-Adaptive Physics-Informed Neural Networks using a Soft Attention Mechanism
- Computer Science · AAAI Spring Symposium: MLPS
- 2021
A fundamentally new method to train PINNs adaptively, in which the adaptation weights are fully trainable: the neural network learns by itself which regions of the solution are difficult and is forced to focus on them, reminiscent of the soft multiplicative-mask attention mechanisms used in computer vision.
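The trainable adaptation weights can be sketched as a saddle-point update: network parameters descend on the weighted loss while the per-point weights ascend on it, so points with large residuals accumulate larger weights. The function and its arguments (`self_adaptive_step`, `grad_theta`) are hypothetical; real implementations compute the gradients by automatic differentiation.

```python
import numpy as np

# One self-adaptive PINN update. theta: network parameters, lam:
# per-collocation-point adaptation weights, residuals: PDE residuals,
# grad_theta: gradient of the weighted loss w.r.t. theta (stand-in
# for an autodiff quantity).

def self_adaptive_step(theta, lam, residuals, grad_theta,
                       lr_theta=1e-3, lr_lam=1e-2):
    # Gradient *descent* on the network parameters
    theta = theta - lr_theta * grad_theta
    # Gradient *ascent* on the weights. With mask m(lam) = lam^2,
    # d/d lam [m(lam) * r^2] = 2 * lam * r^2, so points with large
    # residuals grow their weights fastest.
    lam = lam + lr_lam * 2.0 * lam * residuals**2
    return theta, lam
```

After one step on residuals of different sizes, the hardest point ends up with the largest weight, which is exactly the "focus on difficult regions" behavior the snippet describes.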
Estimates on the generalization error of Physics Informed Neural Networks (PINNs) for approximating PDEs
- Computer Science · ArXiv
- 2020
An abstract formalism is introduced and stability properties of the underlying PDE are leveraged to derive an estimate for the generalization error of PINNs approximating solutions of the forward problem for PDEs.
When and why PINNs fail to train: A neural tangent kernel perspective
- Computer Science · J. Comput. Phys.
- 2022
Learning generative neural networks with physics knowledge
- Computer Science · Research in the Mathematical Sciences
- 2022
The proposed PhysGNN is a fully differentiable model that allows back-propagation of gradients through both numerical PDE solvers and generative neural networks, and is trained by minimizing the discrete Wasserstein distance between generated and observed probability distributions of the PDE outputs using the stochastic gradient descent method.
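The discrete Wasserstein distance used as the training objective above has a simple closed form in one dimension, which is worth sketching. This is a generic illustration of the metric under the assumption of equal-size 1-D samples, not PhysGNN's actual (higher-dimensional, differentiable) implementation.

```python
import numpy as np

# For equal-size 1-D empirical samples, the Wasserstein-1 distance
# reduces to sorting both samples and averaging the absolute
# differences of the matched order statistics.

def wasserstein_1d(samples_a, samples_b):
    a = np.sort(np.asarray(samples_a, dtype=float))
    b = np.sort(np.asarray(samples_b, dtype=float))
    assert a.shape == b.shape, "sketch assumes equal sample sizes"
    return np.mean(np.abs(a - b))

# Shifting a sample by a constant c moves it a Wasserstein distance
# of exactly c, regardless of ordering.
```

Minimizing such a distance between generated and observed PDE-output samples is the training signal the snippet refers to.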
28 References
Understanding and mitigating gradient pathologies in physics-informed neural networks
- Computer Science · SIAM J. Sci. Comput.
- 2021
This work reviews recent advances in scientific machine learning with a specific focus on the effectiveness of physics-informed neural networks in predicting outcomes of physical systems and discovering hidden physics from noisy data and proposes a novel neural network architecture that is more resilient to gradient pathologies.
Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Computer Science · J. Comput. Phys.
- 2019
fPINNs: Fractional Physics-Informed Neural Networks
- Mathematics · SIAM J. Sci. Comput.
- 2019
This work extends PINNs to fractional PINNs (fPINNs) to solve space-time fractional advection-diffusion equations (fractional ADEs), and demonstrates their accuracy and effectiveness in solving multi-dimensional forward and inverse problems with forcing terms whose values are only known at randomly scattered spatio-temporal coordinates (black-box forcing terms).
DeepXDE: A Deep Learning Library for Solving Differential Equations
- Computer Science · AAAI Spring Symposium: MLPS
- 2020
An overview of physics-informed neural networks (PINNs), which embed a PDE into the loss of the neural network using automatic differentiation, is presented, together with a new residual-based adaptive refinement (RAR) method that improves the training efficiency of PINNs.
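The RAR idea can be sketched in a few lines: draw candidate points, score them by the magnitude of the PDE residual, and add the worst offenders to the training set. The function name and domain are assumptions for illustration, not DeepXDE's API.

```python
import numpy as np

# Residual-based adaptive refinement (RAR) sketch: among random
# candidate points in [0, 1], keep the n_add points where the PDE
# residual is largest, so training concentrates where the PDE is
# least well satisfied.

def rar_select(residual_fn, n_candidates=1000, n_add=10, seed=0):
    rng = np.random.default_rng(seed)
    candidates = rng.uniform(0.0, 1.0, n_candidates)
    scores = np.abs(residual_fn(candidates))
    worst = np.argsort(scores)[-n_add:]   # indices of the largest residuals
    return candidates[worst]

# Example: if the residual grows like x^2, the selected points
# cluster near x = 1.
new_points = rar_select(lambda x: x**2)
```

The selected points are then appended to the collocation set before further training, which is the "adaptive refinement" loop.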
Learning in Modal Space: Solving Time-Dependent Stochastic PDEs Using Physics-Informed Neural Networks
- Computer Science · SIAM J. Sci. Comput.
- 2020
Two new Physics-Informed Neural Networks (PINNs) are proposed for solving time-dependent SPDEs, namely the NN-DO/BO methods, which incorporate the DO/BO constraints into the loss function in an implicit form instead of generating explicit expressions for the temporal derivatives of the DO/BO modes.
Solving high-dimensional partial differential equations using deep learning
- Computer Science · Proceedings of the National Academy of Sciences
- 2018
A deep learning-based approach that can handle general high-dimensional parabolic PDEs using backward stochastic differential equations, in which the gradient of the unknown solution is approximated by neural networks, very much in the spirit of deep reinforcement learning, with the gradient acting as the policy function.
On some neural network architectures that can represent viscosity solutions of certain high dimensional Hamilton-Jacobi partial differential equations
- Computer Science, Mathematics · J. Comput. Phys.
- 2021
A unified deep artificial neural network approach to partial differential equations in complex geometries
- Computer Science · Neurocomputing
- 2018
DGM: A deep learning algorithm for solving partial differential equations
- Computer Science · J. Comput. Phys.
- 2018