Solving high-dimensional partial differential equations using deep learning

Jiequn Han, Arnulf Jentzen, and Weinan E. Solving high-dimensional partial differential equations using deep learning. Proceedings of the National Academy of Sciences, pages 8505–8510.
Significance: Partial differential equations (PDEs) are among the most ubiquitous tools used in modeling problems in nature. However, solving high-dimensional PDEs has been notoriously difficult due to the “curse of dimensionality.” This paper introduces a practical algorithm for solving nonlinear PDEs in very high (hundreds and potentially thousands of) dimensions. Numerical results suggest that the proposed algorithm is quite effective for a wide variety of problems, in terms of both accuracy…
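The algorithm behind this paper recasts the PDE as a backward stochastic differential equation, treats the unknown solution value and its gradient as trainable parameters, and fits them by matching the simulated terminal value against the terminal condition. The following is a deliberately tiny sketch of that loss under strong simplifying assumptions (zero nonlinearity, a linear terminal condition g(x) = a·x so the exact value a·x0 is known, and a constant trainable vector per time step in place of the neural networks used in the paper):

```python
import numpy as np

# Toy sketch of the terminal-matching loss (assumptions: dX = dW,
# zero driver f = 0, linear g(x) = a @ x; not the paper's full method).
rng = np.random.default_rng(0)
d, N, T, batch = 10, 20, 1.0, 256
dt = T / N
a = rng.normal(size=d)           # terminal condition g(x) = a @ x
x0 = rng.normal(size=d)          # starting point; exact value is a @ x0

y0 = 0.0                         # trainable guess for u(0, x0)
Z = np.zeros((N, d))             # trainable stand-in for the gradient at each step

lr = 0.1
for _ in range(2000):
    dW = rng.normal(scale=np.sqrt(dt), size=(batch, N, d))
    X_T = x0 + dW.sum(axis=1)                    # Euler path: X_T = x0 + sum dW
    Y_T = y0 + np.einsum('bnd,nd->b', dW, Z)     # Y_T = y0 + sum Z_n . dW_n
    res = Y_T - X_T @ a                          # mismatch with g(X_T)
    # gradients of the mean squared residual, taken by hand
    y0 -= lr * 2 * res.mean()
    Z  -= lr * 2 * np.einsum('b,bnd->nd', res, dW) / batch

print(y0, a @ x0)   # y0 should approach the exact value a @ x0
```

In the actual method the per-step parameters `Z` are replaced by feedforward networks evaluated at the current state, and the same terminal-matching loss is minimized with stochastic gradient descent.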


Space-time deep neural network approximations for high-dimensional partial differential equations

The main result of this work proves for every a and b that solutions of certain Kolmogorov PDEs can be approximated by DNNs on the space-time region $[0,T]\times [a,b]^d$ without the curse of dimensionality.

Algorithms for solving high dimensional PDEs: from nonlinear Monte Carlo to machine learning

It is demonstrated that studying PDEs, as well as control and variational problems, in very high dimensions may well be among the most promising new directions in mathematics and scientific computing in the near future.

Partial Differential Equations Meet Deep Neural Networks: A Survey

This survey aims to categorize and review the current progress on Deep Neural Networks (DNNs) for PDEs in a common taxonomy, followed by an overview and classification of applications of these methods in scientific research and engineering scenarios.

Numerical solution for high-dimensional partial differential equations based on deep learning with residual learning and data-driven learning

A novel method based on residual neural networks and data-driven learning solves elliptic PDEs on a box-shaped domain; it can add training points in regions with large error, improving the accuracy of the model.

DeepSets and Their Derivative Networks for Solving Symmetric PDEs

This paper designs deep learning algorithms based on certain types of neural networks, named PointNet and DeepSet (and their associated derivative networks), for simultaneously computing an approximation of the solution of symmetric PDEs and of its gradient.

Deep learning-based method coupled with small sample learning for solving partial differential equations

A deep learning-based general numerical method, coupled with small-sample learning, for solving PDEs: the solution is approximated by a deep feedforward neural network trained to satisfy the PDEs together with the initial and boundary conditions.
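The idea of training a model to satisfy the PDE together with its boundary conditions can be illustrated with an assumption-laden toy: here the "network" is just a quadratic u(x) = c0 + c1·x + c2·x², the equation is u''(x) = 2 on [0, 1] with u(0) = 0 and u(1) = 1 (exact solution u = x²), and because the residual is linear in the coefficients the "training" reduces to least squares over collocation points:

```python
import numpy as np

# Collocation points where the PDE residual is enforced.
xs = np.linspace(0.0, 1.0, 11)

# Each row is one condition written as a linear equation in (c0, c1, c2):
# interior rows enforce u''(x) = 2, the last two enforce the boundary values.
A = [[0.0, 0.0, 2.0] for _ in xs]          # u'' = 2*c2 at every collocation point
b = [2.0] * len(xs)
A += [[1.0, 0.0, 0.0], [1.0, 1.0, 1.0]]    # u(0) = 0 and u(1) = 1
b += [0.0, 1.0]

c, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
print(c)   # close to (0, 0, 1), i.e. u(x) = x^2
```

A genuine implementation replaces the quadratic with a deep feedforward network and minimizes the same PDE-plus-boundary residual by gradient descent, with derivatives taken by automatic differentiation.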

A Local Deep Learning Method for Solving High Order Partial Differential Equations

This work proposes a novel approach to solving PDEs with high-order derivatives: by introducing intermediate variables, it simultaneously approximates the function value and its derivatives through a system of low-order differential equations, as is done in the local discontinuous Galerkin method.
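The intermediate-variable reduction can be illustrated on the biharmonic equation (an illustrative example, not one taken from the paper): introducing p for the Laplacian of u splits one fourth-order equation into two second-order ones, so only low-order derivatives ever need to be approximated.

```latex
% Reduction via an intermediate variable p = \Delta u:
\Delta^2 u = f
\quad\Longleftrightarrow\quad
\begin{cases}
  \Delta u = p, \\
  \Delta p = f.
\end{cases}
```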

Neural Time-Dependent Partial Differential Equation

This work proposes a sequence deep learning framework, Neural-PDE, which automatically learns the governing rules of any time-dependent PDE system from existing data using a bidirectional LSTM encoder and predicts the next n time steps.

Finite Expression Method for Solving High-Dimensional Partial Differential Equations

It is proved, in the sense of approximation theory, that FEX can avoid the curse of dimensionality, and a deep reinforcement learning method is proposed to implement FEX for various high-dimensional PDEs in different dimensions, achieving high and even machine accuracy with memory complexity polynomial in the dimension and amenable time complexity.



On Multilevel Picard Numerical Approximations for High-Dimensional Nonlinear Parabolic Partial Differential Equations and High-Dimensional Nonlinear Backward Stochastic Differential Equations

This paper tests the applicability of a family of approximation methods based on Picard approximations and multilevel Monte Carlo methods on a variety of 100-dimensional nonlinear PDEs arising in physics and finance, by means of numerical simulations reporting approximation accuracy against runtime.

Algorithms for overcoming the curse of dimensionality for certain Hamilton–Jacobi equations arising in control theory and elsewhere

This work proposes and tests methods for solving a large class of HJ PDEs relevant to optimal control problems without the use of grids or numerical approximations, and develops a new, equally fast way to find the closest point y lying in the union of a finite number of compact convex sets.


We generalize the primal–dual methodology, which is popular in the pricing of early‐exercise options, to a backward dynamic programming equation associated with time discretization schemes of

Adam: A Method for Stochastic Optimization

This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
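The Adam update described above can be sketched in a few lines; the hyperparameters below are the paper's suggested defaults, while the test objective f(x) = ||x||²/2 is an arbitrary choice for illustration:

```python
import numpy as np

# Minimal sketch of the Adam update rule with the paper's default
# hyperparameters (b1, b2, eps); `grad` supplies the stochastic gradient.
def adam_minimize(grad, x, steps=500, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    m = np.zeros_like(x)   # first-moment (mean) estimate
    v = np.zeros_like(x)   # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)      # bias correction for zero initialization
        v_hat = v / (1 - b2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Gradient of f(x) = ||x||^2 / 2 is x itself; the minimizer is the origin.
x = adam_minimize(lambda x: x, np.array([5.0, -3.0]))
print(x)   # near the minimizer (0, 0)
```

The per-coordinate division by the square root of the second-moment estimate is what makes the effective step size roughly invariant to the gradient's scale.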

TensorFlow: A system for large-scale machine learning

The TensorFlow dataflow model is described and the compelling performance that TensorFlow achieves for several real-world applications is demonstrated.

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin.
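The forward pass of batch normalization (training mode) is a short computation; gamma and beta are the learnable scale and shift from the paper, and this sketch omits the running statistics used at inference time:

```python
import numpy as np

# Minimal sketch of the batch-normalization forward pass (training mode).
def batch_norm(x, gamma, beta, eps=1e-5):
    mean = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                      # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalize to ~zero mean, unit variance
    return gamma * x_hat + beta              # learnable scale/shift restores capacity

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 8))   # a batch of 64 activations
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
print(y.mean(axis=0), y.std(axis=0))   # per-feature mean ~0, std ~1
```

At inference time the batch statistics are replaced by running averages collected during training, so the layer becomes a fixed affine transform.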

ImageNet classification with deep convolutional neural networks

A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images of the ImageNet LSVRC-2010 contest into 1000 different classes; it employed a recently developed regularization method called "dropout" that proved to be very effective.
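The dropout regularizer mentioned above has a very small core. This is a sketch of the common "inverted dropout" variant (a hypothetical helper, not the paper's implementation): each activation is zeroed with probability p during training, and the survivors are rescaled so the expected activation is unchanged.

```python
import numpy as np

def dropout(x, p, rng, train=True):
    # Inverted dropout: drop each unit with probability p at training time,
    # rescale the survivors so E[output] equals the input; identity at test time.
    if not train or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p     # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)         # rescaling preserves the expectation

rng = np.random.default_rng(1)
acts = np.ones(100_000)
out = dropout(acts, p=0.5, rng=rng)
print(out.mean())   # close to 1.0: the expectation is preserved
```

Because the rescaling happens during training, no change is needed at inference time, which is why this variant is the one commonly implemented in frameworks.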

Approximate dynamic programming : solving the curses of dimensionality

This book discusses the challenges of dynamic programming and the three curses of dimensionality, and presents experimental comparisons of stepsize formulas for approximate dynamic programming (ADP) in online applications.

Forward-backward stochastic differential equations and quasilinear parabolic PDEs

This paper studies, under some natural monotonicity conditions, the theory (existence and uniqueness, a priori estimates, continuous dependence on a parameter) of forward–backward stochastic differential equations.