Solving high-dimensional eigenvalue problems using deep neural networks: A diffusion Monte Carlo like approach

@article{Han2020SolvingHE,
  title={Solving high-dimensional eigenvalue problems using deep neural networks: A diffusion Monte Carlo like approach},
  author={Jiequn Han and Jianfeng Lu and Mo Zhou},
  journal={J. Comput. Phys.},
  year={2020},
  volume={423},
  pages={109792}
}
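For context, the classical diffusion Monte Carlo idea that the title alludes to is: evolve an ensemble of walkers under the Trotterized imaginary-time propagator e^(−Δt·H) and read the ground-state energy off the ensemble's growth rate. The sketch below is a generic textbook illustration on the 1D harmonic oscillator, not the paper's neural-network method; the function name and all parameters are our own choices.

```python
import numpy as np

def dmc_ground_state(n_walkers=4000, n_steps=3000, dt=0.01, seed=0):
    """Plain diffusion Monte Carlo for H = -1/2 d^2/dx^2 + 1/2 x^2.

    The exact ground-state energy is 1/2.  Each step applies the
    Trotterized imaginary-time propagator: a Brownian move (kinetic
    part) followed by reweighting with the potential, with a
    fixed-population resampling step standing in for birth/death
    branching.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_walkers)            # initial walker ensemble
    e_ref = float(np.mean(0.5 * x**2))        # initial reference energy
    estimates = []
    for step in range(n_steps):
        # Kinetic part of exp(-dt*H): Gaussian move with variance dt.
        x = x + rng.normal(scale=np.sqrt(dt), size=n_walkers)
        # Potential part: branching weights relative to e_ref.
        w = np.exp(-(0.5 * x**2 - e_ref) * dt)
        # Growth estimator: mean weight ~ exp(-(E0 - e_ref)*dt).
        e_est = e_ref - np.log(np.mean(w)) / dt
        # Resample a fixed-size population proportionally to weight.
        x = rng.choice(x, size=n_walkers, p=w / w.sum())
        e_ref = e_est
        if step >= n_steps // 2:              # discard burn-in
            estimates.append(e_est)
    return float(np.mean(estimates))
```

The growth estimator converges to E0 = 0.5 up to an O(dt) time-step bias; the paper's contribution, by contrast, is to replace the walker ensemble's implicit representation of the eigenfunction with a trained neural network.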

Citations

A semigroup method for high dimensional elliptic PDEs and eigenvalue problems based on neural networks
Reproducing Activation Function for Deep Learning
TLDR
The proposed reproducing activation function can facilitate the convergence of deep learning optimization for a solution with higher accuracy than existing deep learning solvers for audio/image/video reconstruction, PDEs, and eigenvalue problems.
Interpolating between BSDEs and PINNs - deep learning for elliptic and parabolic boundary value problems
TLDR
This paper reviews the literature and suggests a methodology based on the novel diffusion loss that interpolates between BSDEs and PINNs, which opens the door towards a unified understanding of numerical approaches for high-dimensional PDEs, as well as for implementations that combine the strengths of BSDEs and PINNs.
Solving eigenvalue PDEs of metastable diffusion processes using artificial neural networks
TLDR
A numerical algorithm based on training artificial neural networks is proposed for solving the leading eigenvalues and eigenfunctions of such high-dimensional eigenvalue problems.
Convergence of the deep BSDE method for coupled FBSDEs
TLDR
A posteriori error estimation of the solution is provided and it is proved that the error converges to zero given the universal approximation capability of neural networks.
A Priori Generalization Error Analysis of Two-Layer Neural Networks for Solving High Dimensional Schrödinger Eigenvalue Problems
  • Jianfeng Lu, Yulong Lu
  • Mathematics, Computer Science
    Communications of the American Mathematical Society
  • 2022
TLDR
It is proved that the convergence rate of the generalization error is independent of the dimension under the a priori assumption that the ground state lies in a spectral Barron space, a result achieved by a fixed-point argument based on the Krein–Rutman theorem.
Full Configuration Interaction Excited-State Energies in Large Active Spaces from Randomized Subspace Iteration
  • Samuel M. Greene, Robert J. Webber, James E. T. Smith, Jonathan Weare, and Timothy C. Berkelbach
Full Configuration Interaction Excited-State Energies in Large Active Spaces from Subspace Iteration with Repeated Random Sparsification
We present a stable and systematically improvable quantum Monte Carlo (QMC) approach to calculating excited-state energies, which we implement using our fast randomized iteration method for the full
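The "repeated random sparsification" idea behind these two entries can be illustrated, far from their full configuration interaction setting, by a toy power iteration in which the iterate is stochastically compressed before each matrix–vector product. This sketch is our own minimal illustration of the general technique; the function name, the restriction to a nonnegative matrix, and all parameters are assumptions, not details from either paper.

```python
import numpy as np

def sparsified_power_iteration(A, m=30, n_iter=500, burn=100, seed=0):
    """Toy randomized-sparsification power iteration: the iterate is
    stochastically compressed to at most m nonzero entries each step,
    so only a sparse vector is ever multiplied by A.

    Assumes an entrywise-nonnegative A; its dominant (Perron)
    eigenvalue is estimated from the per-step growth ratio.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    v = np.ones(n) / n
    estimates = []
    for t in range(n_iter):
        # Unbiased compression: draw m index samples with probability
        # v_i / sum(v), then reweight the counts so E[v_sparse] = v.
        s = v.sum()
        counts = rng.multinomial(m, v / s)
        v_sparse = counts * (s / m)
        w = A @ v_sparse
        # Growth ratio -> dominant eigenvalue once v aligns with the
        # Perron eigenvector.
        estimates.append(w.sum() / s)
        v = w / w.sum()                   # normalize to avoid overflow
    return float(np.mean(estimates[burn:]))
```

Averaging the growth ratio over many iterations washes out the compression noise, which is the basic reason such stochastic iterations remain systematically improvable: accuracy is controlled by m and the number of averaged steps.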
Learning the random variables in Monte Carlo simulations with stochastic gradient descent: Machine learning for parametric PDEs and financial derivative pricing
TLDR
A new numerical approximation strategy for parametric approximation problems, including the parametric financial pricing problems described above, is introduced, and it is illustrated by means of several numerical experiments that the introduced strategy achieves very high accuracy for a variety of high-dimensional parametric approximation problems, even in the L∞-norm.

References

Showing 1–10 of 25 references
Convergence of the deep BSDE method for coupled FBSDEs
TLDR
A posteriori error estimation of the solution is provided and it is proved that the error converges to zero given the universal approximation capability of neural networks.
Deep-neural-network solution of the electronic Schrödinger equation
TLDR
High-accuracy quantum chemistry methods struggle with a combinatorial explosion of Slater determinants in larger molecular systems, but now a method has been developed that learns electronic wavefunctions with deep neural networks and reaches high accuracy with only a few determinants.
Fermionic neural-network states for ab-initio electronic structure
TLDR
An extension of neural-network quantum states to model interacting fermionic problems is presented, and neural networks are used to perform electronic structure calculations on model diatomic molecules, achieving chemical accuracy.
Ab-Initio Solution of the Many-Electron Schrödinger Equation with Deep Neural Networks
TLDR
Deep neural networks can improve the accuracy of variational quantum Monte Carlo to the point where it outperforms other ab-initio quantum chemistry methods, opening the possibility of accurate direct optimisation of wavefunctions for previously intractable molecules and solids.
Spectral Inference Networks: Unifying Deep and Spectral Learning
TLDR
The results demonstrate that Spectral Inference Networks accurately recover eigenfunctions of linear operators and can discover interpretable representations from video in a fully unsupervised manner.
Solving high-dimensional partial differential equations using deep learning
TLDR
A deep learning-based approach that can handle general high-dimensional parabolic PDEs using backward stochastic differential equations and the gradient of the unknown solution is approximated by neural networks, very much in the spirit of deep reinforcement learning with the gradient acting as the policy function.
Deep Learning-Based Numerical Methods for High-Dimensional Parabolic Partial Differential Equations and Backward Stochastic Differential Equations
  • Communications in Mathematics and Statistics
  • 2017
We study a new algorithm for solving parabolic partial differential equations (PDEs) and backward stochastic differential equations (BSDEs) in high dimension, which is based on an analogy between the