Physics-informed neural networks with hard constraints for inverse design

@article{Lu2021PhysicsinformedNN,
  title={Physics-informed neural networks with hard constraints for inverse design},
  author={Lu Lu and Rapha{\"e}l Pestourie and Wenjie Yao and Zhicheng Wang and Francesc Verdugo and Steven G. Johnson},
  journal={SIAM J. Sci. Comput.},
  year={2021},
  volume={43},
  pages={B1105--B1132}
}
Inverse design arises in a variety of areas in engineering, such as acoustics, mechanics, thermal/electronic transport, electromagnetism, and optics. Topology optimization is a major form of inverse design, in which we optimize a designed geometry to achieve targeted properties, with the geometry parameterized by a density function. This optimization is challenging because it has very high dimensionality and is usually constrained by partial differential equations (PDEs) and additional… 
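The "hard constraints" in the title refer to building the constraints into the model itself rather than penalizing violations in the loss. A minimal sketch of that idea (illustrative only, not the authors' code, with an untrained toy network) for 1D Dirichlet boundary conditions:

```python
import numpy as np

# Sketch of a "hard constraint" ansatz for u(0) = u(1) = 0: the network
# output is multiplied by the factor x * (1 - x), so the boundary
# conditions hold exactly for ANY network weights -- no boundary penalty
# term is needed in the training loss.

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(1, 16)), np.zeros(16)   # tiny random MLP (untrained)
W2, b2 = rng.normal(size=(16, 1)), np.zeros(1)

def raw_net(x):
    """Plain network output N(x); does not respect the boundary conditions."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def hard_constrained(x):
    """Trial solution u(x) = x * (1 - x) * N(x); u(0) = u(1) = 0 by construction."""
    return x * (1.0 - x) * raw_net(x)

x = np.array([[0.0], [0.5], [1.0]])
u = hard_constrained(x)
print(u[0, 0], u[2, 0])   # 0.0 0.0 -- exact at both boundaries
```

By contrast, a "soft" PINN would add a term like `mean(u(x_boundary)**2)` to the loss and only satisfy the boundary conditions approximately.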
Fast PDE-constrained optimization via self-supervised operator learning
TLDR
This work leverages physics-informed deep operator networks (DeepONets), a self-supervised framework for learning the solution operator of parametric PDEs, to build fast and differentiable surrogates for rapidly solving PDE-constrained optimization problems, even in the absence of any paired input-output training data.
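The DeepONet structure this blurb refers to combines a "branch" net, which encodes the input function at fixed sensor points, with a "trunk" net, which encodes the query location; the operator output is the dot product of the two feature vectors. A shape-level sketch with untrained, single-layer stand-ins for both nets (all sizes are illustrative assumptions):

```python
import numpy as np

# Sketch of the DeepONet combination G(u)(y) = sum_k b_k(u) * t_k(y):
# branch features b_k come from the input function u sampled at m sensors,
# trunk features t_k come from the query coordinate y.

rng = np.random.default_rng(1)
m, p = 32, 8                      # sensor count, latent width (illustrative)
Wb = rng.normal(size=(m, p))      # branch net: one linear layer for brevity
Wt = rng.normal(size=(1, p))      # trunk net: one linear layer for brevity

def deeponet(u_sensors, y):
    b = np.tanh(u_sensors @ Wb)          # branch features b_k(u), shape (p,)
    t = np.tanh(np.atleast_2d(y) @ Wt)   # trunk features t_k(y), shape (n, p)
    return (b * t).sum(axis=-1)          # dot product over the latent index k

u_sensors = np.sin(np.linspace(0, np.pi, m))   # an input function on the sensors
out = deeponet(u_sensors, np.array([[0.3]]))
print(out.shape)   # (1,)
```

In the physics-informed variant, the PDE residual of this output (obtained by automatic differentiation in practice) supplies the training signal, which is why no paired input-output data is required.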
Model-Constrained Deep Learning Approaches for Inverse Problems
TLDR
This short communication introduces several model-constrained DL approaches, including both feed-forward DNNs and autoencoders, that are capable of learning not only information hidden in the training data but also information in the underlying mathematical models, in order to solve inverse problems.
Physics-enhanced deep surrogates for PDEs
TLDR
A “physics-enhanced deep-surrogate” (“PEDS”) approach to developing fast surrogate models for complex physical systems described by partial differential equations (PDEs) and similar models is presented; a PEDS surrogate is found to need at least ∼10× less training data than a “black-box” neural network for the same accuracy.
Neural Networks with Physics-Informed Architectures and Constraints for Dynamical Systems Modeling
TLDR
A framework is developed to learn dynamics models from trajectory data while incorporating a priori system knowledge as inductive bias; it learns to predict the system dynamics two orders of magnitude more accurately than a baseline approach that does not include prior knowledge, given the same training dataset.
Meta-Auto-Decoder for Solving Parametric Partial Differential Equations
TLDR
This work treats solving parametric PDEs as a meta-learning problem and utilizes the Auto-Decoder structure in [1] to handle different tasks/PDEs, exhibiting faster convergence without loss of accuracy compared with other deep learning methods.
Contracting Neural-Newton Solver
TLDR
This paper develops a recurrent NN simulation tool, termed the Contracting Neural-Newton Solver (CoNNS), which explicitly captures the contracting nature of these Newton iterations, and concludes by showing that the underlying NN has the capacity to universally approximate any nonlinear system of equations that contracts.
Characterizing possible failure modes in physics-informed neural networks
TLDR
It is demonstrated that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena for even slightly more complex problems.
Physics-informed neural networks (PINNs) for fluid mechanics: A review
TLDR
The effectiveness of physics-informed neural networks (PINNs) for inverse problems related to three-dimensional wake flows, supersonic flows, and biomedical flows is demonstrated.
DAE-PINN: A Physics-Informed Neural Network Model for Simulating Differential-Algebraic Equations with Application to Power Networks
TLDR
DAE-PINN is developed, the first effective deep-learning framework for learning and simulating the solution trajectories of nonlinear differential-algebraic equations (DAE), which present a form of infinite stiffness and describe, for example, the dynamics of power networks.
SPINN: Sparse, Physics-based, and partially Interpretable Neural Networks for PDEs
TLDR
The SPINN model is introduced, which serves as a seamless bridge between two extreme modeling tools for PDEs, namely dense neural network based methods like Physics Informed Neural Networks (PINNs) and traditional mesh-free numerical methods, thereby providing a novel means to develop a new class of hybrid algorithms that build on the best of both these viewpoints.
References

Showing 1–10 of 53 references
Physics-informed machine learning
Despite great progress in simulating multiphysics problems using the numerical discretization of partial differential equations (PDEs), one still cannot seamlessly incorporate noisy data into…
Quantifying total uncertainty in physics-informed neural networks for solving forward and inverse stochastic problems
TLDR
A new method is proposed with the objective of endowing the DNN with uncertainty quantification for both sources of uncertainty, i.e., the parametric uncertainty and the approximation uncertainty, which can be readily applied to other types of stochastic PDEs in multi-dimensions.
Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
Abstract: We introduce physics-informed neural networks: neural networks that are trained to solve supervised learning tasks while respecting any given laws of physics described by general nonlinear…
DeepXDE: A Deep Learning Library for Solving Differential Equations
TLDR
An overview of physics-informed neural networks (PINNs), which embed a PDE into the loss of the neural network using automatic differentiation, and a new residual-based adaptive refinement (RAR) method to improve the training efficiency of PINNs.
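The residual-based adaptive refinement (RAR) mentioned here adds collocation points where the PDE residual is currently largest. A stand-alone numpy sketch of the selection step (illustrative, not DeepXDE's implementation; the toy residual is a made-up stand-in for a trained network's PDE residual):

```python
import numpy as np

# Sketch of RAR point selection: draw candidate collocation points, evaluate
# the PDE residual at each, and add the k points with the largest |residual|
# to the training set, so later training focuses where the PDE is worst
# satisfied.

def rar_select(residual_fn, n_candidates=1000, k=10, rng=None):
    rng = rng or np.random.default_rng(0)
    candidates = rng.uniform(0.0, 1.0, size=n_candidates)  # 1D domain [0, 1]
    r = np.abs(residual_fn(candidates))
    worst = np.argsort(r)[-k:]            # indices of the k largest residuals
    return candidates[worst]

# Toy "residual": pretend the current network is worst near x = 0.8.
residual = lambda x: np.exp(-((x - 0.8) ** 2) / 0.001)
new_points = rar_select(residual, k=10)
print(new_points.min(), new_points.max())  # all selected points cluster near 0.8
```

In an actual PINN training loop this selection would be interleaved with gradient steps, and the residual would come from automatic differentiation of the network.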
Simulator-based training of generative neural networks for the inverse design of metasurfaces
TLDR
This work presents a new type of population-based global optimization algorithm for metasurfaces that is enabled by the training of a generative neural network and observes that the distribution of devices generated by the network continuously shifts towards high performance design space regions over the course of optimization.
Learning in Modal Space: Solving Time-Dependent Stochastic PDEs Using Physics-Informed Neural Networks
TLDR
Two new Physics-Informed Neural Networks (PINNs) are proposed for solving time-dependent SPDEs, namely the NN-DO/BO methods, which incorporate the DO/BO constraints into the loss function in an implicit form instead of generating explicit expressions for the temporal derivatives of the DO/BO modes.
Physics-informed neural networks for inverse problems in nano-optics and metamaterials.
TLDR
The emerging paradigm of physics-informed neural networks (PINNs) is employed for the solution of representative inverse scattering problems in photonic metamaterials and nano-optics technologies; mesh-free PINNs are successfully applied to the difficult task of retrieving the effective permittivity parameters of a number of finite-size scattering systems.
Deep learning enabled inverse design in nanophotonics
TLDR
The recent progress in the application of deep learning to the inverse design of nanophotonic devices is discussed, mainly focusing on the three existing learning paradigms of supervised-, unsupervised-, and reinforcement learning.
Topology Optimization Accelerated by Deep Learning
TLDR
It is numerically shown that the computational cost of topology optimization can be reduced without loss of optimization quality.
Multiscale topology optimization using neural network surrogate models
TLDR
Because the derivative of the surrogate model is important for sensitivity analysis of the macroscale topology optimization, a neural network training procedure based on the Sobolev norm is described, and an alternative method is developed to enable creation of void regions.
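The Sobolev-norm training this entry describes fits the surrogate to both response values and their derivatives, so that the sensitivities consumed by the macroscale optimization are accurate. A minimal sketch of such a loss (an assumption about the general form, not the paper's implementation):

```python
import numpy as np

# Sketch of a Sobolev-norm training objective: penalize errors in the
# surrogate's values AND in its derivatives. A surrogate that matches the
# data pointwise but has wrong slopes would mislead the sensitivity
# analysis, so the gradient mismatch is penalized explicitly.

def sobolev_loss(y_pred, y_true, dy_pred, dy_true, grad_weight=1.0):
    value_term = np.mean((y_pred - y_true) ** 2)    # ordinary L2 data fit
    grad_term = np.mean((dy_pred - dy_true) ** 2)   # derivative mismatch
    return value_term + grad_weight * grad_term

# Toy check: perfect values but a zero predicted slope is still penalized.
x = np.linspace(0.0, 1.0, 50)
y_true, dy_true = x**2, 2 * x
loss = sobolev_loss(y_true, y_true, np.zeros_like(x), dy_true)
print(loss)   # positive, purely from the derivative term
```

In practice `dy_pred` would come from differentiating the network with respect to its inputs (automatic differentiation), and `grad_weight` balances the two terms.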