Corpus ID: 219721485

Multipole Graph Neural Operator for Parametric Partial Differential Equations

@article{Li2020MultipoleGN,
  title={Multipole Graph Neural Operator for Parametric Partial Differential Equations},
  author={Zong-Yi Li and Nikola B. Kovachki and Kamyar Azizzadenesheli and Burigede Liu and Kaushik Bhattacharya and Andrew Stuart and Anima Anandkumar},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.09535}
}
One of the main challenges in using deep learning-based methods for simulating physical systems and solving partial differential equations (PDEs) is formulating physics-based data in the desired structure for neural networks. Graph neural networks (GNNs) have gained popularity in this area since graphs offer a natural way of modeling particle interactions and provide a clear way of discretizing the continuum models. However, the graphs constructed for approximating such tasks usually ignore… 
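
The abstract points to multi-scale interactions that purely local graphs miss. Below is a minimal, hypothetical sketch (plain NumPy, not the paper's code) of the general idea behind multipole-style graph operators: combine cheap short-range message passing on a fine point cloud with long-range message passing on a coarse subsample, then transfer the result back. The graph construction, kernel form, and V-cycle structure here are illustrative assumptions only.

```python
import numpy as np

# Hypothetical illustration (not the authors' implementation): one "V-cycle"
# of multi-level kernel message passing on a point cloud, in the spirit of
# multipole / multi-resolution graph operators.
rng = np.random.default_rng(0)

def radius_graph(x, r):
    """Edges (i, j) between distinct points closer than radius r."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    i, j = np.nonzero((d < r) & (d > 0))
    return i, j

def kernel_message_pass(x, u, edges, w):
    """One kernel-weighted aggregation: u_i <- mean_j k(x_i, x_j) * u_j."""
    i, j = edges
    k = np.exp(-np.linalg.norm(x[i] - x[j], axis=-1, keepdims=True)) * w
    out = np.zeros_like(u)
    np.add.at(out, i, k * u[j])
    counts = np.bincount(i, minlength=len(u)).reshape(-1, 1)
    return out / np.maximum(counts, 1)

# Fine level: many points, short-range edges (cheap, local interactions).
x_fine = rng.random((256, 2))
u_fine = rng.random((256, 1))
fine_edges = radius_graph(x_fine, r=0.10)

# Coarse level: a subsample of the points with long-range edges, capturing
# far-field effects with far fewer edges (as in the fast multipole method).
coarse_idx = rng.choice(256, size=32, replace=False)
x_coarse = x_fine[coarse_idx]
coarse_edges = radius_graph(x_coarse, r=0.50)

# V-cycle: local pass on the fine graph, restrict to the coarse graph,
# long-range pass there, then add the correction back to the fine nodes.
u1 = kernel_message_pass(x_fine, u_fine, fine_edges, w=0.5)
u_coarse = kernel_message_pass(x_coarse, u1[coarse_idx], coarse_edges, w=0.5)
u1[coarse_idx] += u_coarse   # crude prolongation back to the fine level
print(u1.shape)              # (256, 1)
```

The point of the coarse level is that long-range edges become affordable: a radius-0.5 graph over 32 coarse nodes has far fewer edges than the same radius over all 256 fine nodes, mirroring the fast-multipole idea of handling far-field interactions at low resolution.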

Efficient Long-Range Convolutions for Point Clouds

TLDR
A novel neural network layer is presented that directly incorporates long-range information for a point cloud by leveraging the convolution theorem coupled with the non-uniform Fourier transform, and can be evaluated in nearly linear time asymptotically with respect to the number of input points.
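
For context on the summary above: the convolution theorem lets a convolution be evaluated as a pointwise product in Fourier space,
\[ (f * g)(x) = \mathcal{F}^{-1}\big[\mathcal{F}(f)\,\mathcal{F}(g)\big](x), \]
and combining it with a non-uniform FFT is what makes an \(O(N \log N)\) (nearly linear in the number of points \(N\)) evaluation plausible for scattered point clouds; the precise layer construction in that paper is not reproduced here.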

Importance Weight Estimation and Generalization in Domain Adaptation Under Label Shift

  • K. Azizzadenesheli
  • IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022
TLDR
A series of methods is proposed and deployed to estimate importance weights from a labeled source domain to an unlabeled target domain, with confidence bounds provided for these estimators.

An extensible Benchmarking Graph-Mesh dataset for studying Steady-State Incompressible Navier-Stokes Equations

TLDR
This paper proposes a 2-D graph-mesh dataset for studying airflow over airfoils in the high-Reynolds-number regime and introduces metrics on the stress forces over the airfoil in order to evaluate GDL models on important physical quantities.

Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs

TLDR
This work proposes the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs and demonstrates that the NORS is resolution-invariant, efficient, and achieves one order of magnitude lower error with a modest amount of data.

Multiwavelet-based Operator Learning for Differential Equations

TLDR
A multiwavelet-based neural operator learning scheme that compresses the associated operator's kernel using fine-grained wavelets, exploits fundamental properties of the operator's kernel to enable a numerically efficient representation, and achieves state-of-the-art results on a range of datasets.

Discretization-independent surrogate modeling over complex geometries using hypernetworks and implicit representations

TLDR
This work proposes alternative deep-learning based surrogate models for discretization-independent, continuous representations of PDE solutions, which can be used for learning and prediction over domains with complex, variable geometry and mesh topology.

Multi-Scale Physical Representations for Approximating PDE Solutions with Graph Neural Operators

TLDR
This work studies three multi-resolution schemes with integral kernel operators that can be approximated with Message Passing Graph Neural Networks (MPGNNs) and conducts extensive MPGNN experiments to validate the study.

Learned Coarse Models for Efficient Turbulence Simulation

TLDR
Broadly, the proposed model can simulate turbulent dynamics more accurately than classical numerical solvers at comparably low resolutions, across various scientifically relevant metrics.

Non-Linear Operator Approximations for Initial Value Problems

TLDR
This work proposes a Padé-approximation-based exponential neural operator scheme for efficiently learning the map between a given initial condition and the activities at a later time, and shows theoretically that the gradients associated with the recurrent Padé network are bounded across the recurrent horizon.
...

References

SHOWING 1-10 OF 48 REFERENCES

A multiscale neural network based on hierarchical nested bases

TLDR
A multiscale artificial neural network for high-dimensional nonlinear maps, based on the idea of hierarchical nested bases from the fast multipole method and H²-matrices, that efficiently approximates discretized nonlinear maps arising from partial differential equations or integral equations.

EikoNet: Solving the Eikonal Equation With Deep Neural Networks

TLDR
EikoNet is a deep learning approach to solving the Eikonal equation, which characterizes the first-arrival-time field in heterogeneous 3-D velocity structures, and exploits the differentiability of neural networks to calculate the spatial gradients analytically, meaning that the network can be trained on its own without ever needing solutions from a finite-difference algorithm.

Neural Operator: Graph Kernel Network for Partial Differential Equations

TLDR
The key innovation in this work is that a single set of network parameters, within a carefully designed network architecture, may be used to describe mappings between infinite-dimensional spaces and between different finite-dimensional approximations of those spaces.
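
As a reading aid, the kernel-integral update that graph kernel networks of this kind typically iterate can be written (notation here is a common convention, not quoted from the paper) as
\[ v_{t+1}(x) = \sigma\!\Big( W\, v_t(x) + \int_{D} \kappa_{\phi}\big(x, y, a(x), a(y)\big)\, v_t(y)\, \mathrm{d}y \Big), \]
where the integral is approximated by averaging over a node's graph neighbors; because \(W\), \(\sigma\), and the learned kernel \(\kappa_{\phi}\) are defined pointwise, the same parameters apply at any discretization of the domain \(D\).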

Unsupervised Deep Learning Algorithm for PDE-based Forward and Inverse Problems

TLDR
A neural network-based algorithm for solving forward and inverse problems for partial differential equations in an unsupervised fashion, focusing on 2D second-order elliptic systems with non-constant coefficients, with application to Electrical Impedance Tomography.

MESHFREEFLOWNET: A Physics-Constrained Deep Continuous Space-Time Super-Resolution Framework

TLDR
This work proposes MESHFREEFLOWNET, a novel deep learning-based super-resolution framework to generate continuous (grid-free) spatio-temporal solutions from low-resolution inputs, and provides an open-source implementation of the method that supports arbitrary combinations of PDE constraints.

DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators

TLDR
This work proposes deep operator networks (DeepONets) to learn operators accurately and efficiently from a relatively small dataset, and demonstrates that DeepONet significantly reduces the generalization error compared to the fully-connected networks.
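
For orientation, the DeepONet architecture is usually summarized by a branch-trunk decomposition of the learned operator \(G\),
\[ G(u)(y) \approx \sum_{k=1}^{p} b_k\big(u(x_1), \ldots, u(x_m)\big)\, t_k(y), \]
where the branch net \(b\) encodes the input function \(u\) at fixed sensor locations \(x_1, \ldots, x_m\) and the trunk net \(t\) encodes the query location \(y\); the exact sensor placement and network sizes vary by implementation.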

Solving ill-posed inverse problems using iterative deep neural networks

TLDR
The method builds on ideas from classical regularization theory and recent advances in deep learning to perform learning while making use of prior information about the inverse problem encoded in the forward operator, the noise model, and a regularizing functional, resulting in a gradient-like iterative scheme.

The Incremental Multiresolution Matrix Factorization Algorithm

TLDR
This work uncovers hierarchical block structure in symmetric matrices one feature at a time, and hence scales well to large matrices; it describes how this multiscale analysis identifies far more than a direct global factorization of the data can.

Convolutional Neural Networks for Steady Flow Approximation

TLDR
This work proposes a general and flexible approximation model for real-time prediction of non-uniform steady laminar flow in a 2D or 3D domain based on convolutional neural networks (CNNs), and shows that convolutional neural networks can estimate the velocity field two orders of magnitude faster than a GPU-accelerated CFD solver and four orders of magnitude faster than a CPU-based CFD solver, at the cost of a low error rate.