# Inverse-Dirichlet Weighting Enables Reliable Training of Physics Informed Neural Networks

```bibtex
@article{Maddu2021InverseDirichletWE,
  title   = {Inverse-Dirichlet Weighting Enables Reliable Training of Physics Informed Neural Networks},
  author  = {Suryanarayana Maddu and Dominik Sturm and Christian L. M{\"u}ller and Ivo F. Sbalzarini},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2107.00940}
}
```

We characterize and remedy a failure mode that can arise when training deep neural networks, such as Physics Informed Neural Networks (PINNs), on multi-scale dynamics with scale imbalances. PINNs are popular machine-learning templates that allow seamless integration of physical equation models with data. Training them amounts to solving an optimization problem over a weighted sum of data-fidelity and equation-fidelity objectives. Conflicts between objectives can arise from scale…
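
The weighted-sum training objective described above can be sketched in a few lines. The sketch below is a minimal illustration, not the paper's implementation: it assumes the inverse-Dirichlet rule sets each objective's weight from the ratio of the largest per-objective gradient standard deviation to that objective's own gradient standard deviation, so that a weak (small-gradient) objective is scaled up. All names are illustrative.

```python
import numpy as np

def inverse_dirichlet_weights(grads):
    """Given per-objective gradient samples (one 1-D array each),
    return weights proportional to the inverse of each objective's
    gradient standard deviation. The dominant objective gets weight 1;
    weaker objectives are scaled up to match it."""
    stds = np.array([g.std() for g in grads])
    return stds.max() / stds

# Two objectives with a ~100x scale imbalance in their gradients:
rng = np.random.default_rng(0)
g_data = rng.normal(scale=100.0, size=1000)  # data-fidelity gradient samples
g_pde = rng.normal(scale=1.0, size=1000)     # equation-fidelity gradient samples

w = inverse_dirichlet_weights([g_data, g_pde])
# w[0] is exactly 1; w[1] is on the order of 100, so the total loss
# sum_k w[k] * L_k sees balanced gradient magnitudes from both terms.
```

In a real PINN training loop the gradient statistics would be recomputed periodically from backpropagated gradients of each loss term, rather than from fixed samples as here.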


#### References

Showing 1–10 of 56 references

Understanding and mitigating gradient pathologies in physics-informed neural networks

- Computer Science, Mathematics
- SIAM Journal on Scientific Computing
- 2021

This work reviews recent advances in scientific machine learning, focusing on the effectiveness of physics-informed neural networks in predicting outcomes of physical systems and discovering hidden physics from noisy data, and proposes a novel neural network architecture that is more resilient to gradient pathologies.

Physics informed deep learning for computational elastodynamics without labeled data

- Mathematics, Computer Science
- ArXiv
- 2020

A physics-informed neural network with mixed-variable output is proposed to model elastodynamics problems without resorting to labeled data, with the initial/boundary conditions (I/BCs) hard-imposed; results show the promise of PINNs in the context of computational mechanics applications.

Multi-Fidelity Physics-Constrained Neural Network and Its Application in Materials Modeling

- Journal of Mechanical Design
- 2019

Training machine learning tools such as neural networks requires the availability of sizable data, which can be difficult for engineering and scientific applications where experiments or simulations…

fPINNs: Fractional Physics-Informed Neural Networks

- Physics, Mathematics
- SIAM J. Sci. Comput.
- 2019

This work extends PINNs to fractional PINNs (fPINNs) to solve space-time fractional advection-diffusion equations (fractional ADEs), and demonstrates their accuracy and effectiveness in solving multi-dimensional forward and inverse problems with forcing terms whose values are only known at randomly scattered spatio-temporal coordinates (black-box forcing terms).

On the Spectral Bias of Neural Networks

- Computer Science, Mathematics
- ICML
- 2019

This work shows that deep ReLU networks are biased towards low-frequency functions and studies the robustness of the frequency components with respect to parameter perturbation, developing the intuition that parameters must be finely tuned to express high-frequency functions.

Learning in Modal Space: Solving Time-Dependent Stochastic PDEs Using Physics-Informed Neural Networks

- Computer Science, Mathematics
- SIAM J. Sci. Comput.
- 2020

Two new Physics-Informed Neural Network (PINN) methods are proposed for solving time-dependent SPDEs, namely the NN-DO/BO methods, which incorporate the DO/BO constraints into the loss function in an implicit form instead of generating explicit expressions for the temporal derivatives of the DO/BO modes.

Sobolev Training for Neural Networks

- Computer Science, Mathematics
- NIPS
- 2017

Sobolev Training for neural networks is introduced, a method for incorporating target derivatives in addition to the target values during training; it results in models with higher accuracy and stronger generalisation on three distinct domains.
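
The idea of supervising on derivatives as well as values can be illustrated without a neural network at all. The sketch below, a simplified assumption-laden analogue of Sobolev training rather than the paper's method, fits a cubic polynomial to sin(x) by least squares over stacked value equations and derivative equations:

```python
import numpy as np

# Sobolev-style fit: match target values AND target derivatives.
# Model: p(x) = c0 + c1*x + c2*x^2 + c3*x^3, target: f(x) = sin(x).
x = np.linspace(0.0, np.pi, 20)
V = np.vander(x, 4, increasing=True)   # value design matrix: rows [1, x, x^2, x^3]

D = np.zeros_like(V)                   # derivative design matrix: rows [0, 1, 2x, 3x^2]
D[:, 1] = 1.0
D[:, 2] = 2.0 * x
D[:, 3] = 3.0 * x ** 2

A = np.vstack([V, D])                  # stack value and derivative equations
b = np.concatenate([np.sin(x), np.cos(x)])  # target values and target derivatives

coef, *_ = np.linalg.lstsq(A, b, rcond=None)
resid = np.abs(V @ coef - np.sin(x)).max()  # small max value error despite derivative constraints
```

The same principle carries over to networks: the derivative equations become an extra loss term on the model's gradient with respect to its input, computed by automatic differentiation.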

GradNorm: Gradient Normalization for Adaptive Loss Balancing in Deep Multitask Networks

- Computer Science
- ICML
- 2018

A gradient normalization (GradNorm) algorithm is presented that automatically balances training in deep multitask models by dynamically tuning gradient magnitudes. For various network architectures, for both regression and classification tasks, and on both synthetic and real datasets, GradNorm improves accuracy and reduces overfitting across multiple tasks.
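
A heavily simplified, single-step sketch of this idea follows. It is not the paper's exact algorithm: here each task weight is nudged so that the task's weighted gradient norm moves toward a common target, and the weights are renormalized to a fixed sum; the learning rate, exponent, and update rule are illustrative assumptions.

```python
import numpy as np

def gradnorm_step(weights, grad_norms, loss_ratios, lr=0.025, alpha=1.5):
    """One simplified GradNorm-style update (illustration only).
    weights: current per-task loss weights.
    grad_norms: norm of each task's (unweighted) gradient.
    loss_ratios: each task's current-to-initial loss ratio (training rate proxy).
    """
    w = np.asarray(weights, float)
    gn = np.asarray(grad_norms, float)
    g = w * gn                              # per-task weighted gradient norms
    r = np.asarray(loss_ratios, float)
    r = r / r.mean()                        # relative inverse training rates
    target = g.mean() * r ** alpha          # common target norm per task
    w = w - lr * gn * np.sign(g - target)   # descend |g - target| w.r.t. w
    w = np.clip(w, 1e-6, None)
    return w * len(w) / w.sum()             # renormalize weights to sum to #tasks

# Two tasks whose gradients differ 10x in magnitude and train at equal rates:
w = gradnorm_step([1.0, 1.0], [10.0, 1.0], [1.0, 1.0])
# the weight of the dominant-gradient task shrinks, the other grows
```

In the actual algorithm the weights are trained by gradient descent on an auxiliary loss over these norm discrepancies, alongside the network parameters.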

Surrogate modeling for fluid flows based on physics-constrained deep learning without simulation data

- Mathematics, Physics
- 2020

Numerical simulations of fluid dynamics problems primarily rely on spatial and/or temporal discretization of the governing equations using polynomials into a finite-dimensional algebraic…

Systems biology informed deep learning for inferring parameters and hidden dynamics

- Computer Science, Biology
- 2019

This work develops a new systems-biology-informed deep learning algorithm that incorporates the system of ordinary differential equations into the neural network, effectively adding constraints to the optimization, which makes the method robust to noisy and sparse measurements.