Corpus ID: 238744096

Machine Learning For Elliptic PDEs: Fast Rate Generalization Bound, Neural Scaling Law and Minimax Optimality

@article{Lu2021MachineLF,
  title={Machine Learning For Elliptic PDEs: Fast Rate Generalization Bound, Neural Scaling Law and Minimax Optimality},
  author={Yiping Lu and Haoxuan Chen and Jianfeng Lu and Lexing Ying and Jos{\'e} H. Blanchet},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.06897}
}
In this paper, we study the statistical limits of deep learning techniques for solving elliptic partial differential equations (PDEs) from random samples using the Deep Ritz Method (DRM) and Physics-Informed Neural Networks (PINNs). To simplify the problem, we focus on a prototype elliptic PDE: the Schrödinger equation on a hypercube with zero Dirichlet boundary condition, which has wide applications in quantum-mechanical systems. We establish upper and lower bounds for both methods, which…
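The abstract refers to two loss formulations for the same prototype Schrödinger-type equation -Δu + Vu = f on [0,1]^d with zero Dirichlet data: the variational Ritz energy (DRM) and the squared PDE residual (PINN). The sketch below is a hedged, illustrative PyTorch rendering of those two objectives with a soft boundary penalty; it is not the authors' implementation, and the dimension d, potential V, source f, network architecture, penalty weight lam, and sample sizes are assumptions made for illustration only.

# Illustrative sketch (assumed setup, not the paper's code): Deep Ritz and PINN
# losses for  -Δu + V u = f  on [0,1]^d with zero Dirichlet boundary data.
import torch
import torch.nn as nn

d = 2                                                   # spatial dimension (assumed)
net = nn.Sequential(                                    # trial function u_theta
    nn.Linear(d, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)

V = lambda x: 1.0 + x.pow(2).sum(dim=1, keepdim=True)   # assumed potential V(x) >= 1
f = lambda x: torch.ones(x.shape[0], 1)                 # assumed source term
lam = 100.0                                             # boundary penalty weight (assumed)

def sample_interior(n):
    return torch.rand(n, d, requires_grad=True)         # uniform samples in [0,1]^d

def sample_boundary(n):
    # Uniform samples on the faces of the hypercube: pick a coordinate, pin it to 0 or 1
    x = torch.rand(n, d)
    face = torch.randint(0, d, (n,))
    x[torch.arange(n), face] = torch.randint(0, 2, (n,)).float()
    return x

def deep_ritz_loss(n_int=1024, n_bdy=256):
    # Monte Carlo estimate of the Ritz energy  E(u) = ∫ (|∇u|²/2 + V u²/2 - f u) dx
    x = sample_interior(n_int)
    u = net(x)
    grad_u = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    energy = (0.5 * grad_u.pow(2).sum(dim=1, keepdim=True)
              + 0.5 * V(x) * u.pow(2) - f(x) * u).mean()
    bdy = net(sample_boundary(n_bdy)).pow(2).mean()      # penalize u != 0 on the boundary
    return energy + lam * bdy

def pinn_loss(n_int=1024, n_bdy=256):
    # Mean squared PDE residual  (-Δu + V u - f)²  plus the same boundary penalty
    x = sample_interior(n_int)
    u = net(x)
    grad_u = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    lap = sum(torch.autograd.grad(grad_u[:, i].sum(), x, create_graph=True)[0][:, i:i + 1]
              for i in range(d))
    residual = (-lap + V(x) * u - f(x)).pow(2).mean()
    bdy = net(sample_boundary(n_bdy)).pow(2).mean()
    return residual + lam * bdy

# Training loop (Deep Ritz shown; swap in pinn_loss for the PINN objective)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    deep_ritz_loss().backward()
    opt.step()

Note that the Ritz energy only needs first derivatives of the network output, whereas the PINN residual requires second derivatives (the Laplacian), which is reflected in the extra autograd pass above.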

Uniform Convergence Guarantees for the Deep Ritz Method for Nonlinear Problems
This work provides convergence guarantees for the Deep Ritz Method for abstract variational energies, such as the p-Laplace equation or the Modica-Mortola energy, with essential or natural boundary conditions.
Error Estimates for the Deep Ritz Method with Boundary Penalty
Estimates on the error made by the Deep Ritz Method for elliptic problems on the space H(Ω) with different boundary conditions are established; the optimal decay rate of the estimated error is min(s/2, r), achieved by choosing the penalty parameter λ_n ∼ n.