Corpus ID: 221516165

Multilevel Picard approximations for high-dimensional semilinear second-order PDEs with Lipschitz nonlinearities

@article{Hutzenthaler2020MultilevelPA,
  title={Multilevel Picard approximations for high-dimensional semilinear second-order PDEs with Lipschitz nonlinearities},
  author={Martin Hutzenthaler and Arnulf Jentzen and Thomas Kruse and Tuan Anh Nguyen},
  journal={ArXiv},
  year={2020},
  volume={abs/2009.02484}
}
The recently introduced full-history recursive multilevel Picard (MLP) approximation methods have turned out to be quite successful in the numerical approximation of solutions of high-dimensional nonlinear PDEs. In particular, there are mathematical convergence results in the literature proving that MLP approximation methods overcome the curse of dimensionality in the numerical approximation of nonlinear second-order PDEs in the sense that the number of computational operations of the… 
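The full-history recursive MLP scheme summarized above can be sketched in plain Python. This is a minimal illustration of the recursion for a semilinear heat equation with terminal condition g and nonlinearity f, not the authors' implementation; the names `mlp`, `f`, `g`, `M`, and `n` are illustrative:

```python
import numpy as np

def mlp(n, M, t, x, T, f, g, rng):
    """Full-history recursive MLP approximation U_{n,M}(t, x) (sketch).

    The level-n estimate combines M**n Monte Carlo samples of the
    terminal condition g with multilevel correction terms that reuse
    the lower-level approximations U_{l,M} at uniformly sampled times.
    """
    if n == 0:
        return 0.0
    d = x.shape[0]
    # Monte Carlo estimate of E[g(x + W_{T-t})]
    dW = rng.normal(0.0, np.sqrt(T - t), size=(M**n, d))
    u = np.mean([g(x + w) for w in dW])
    # multilevel correction terms at uniformly sampled intermediate times
    for l in range(n):
        m = M ** (n - l)
        acc = 0.0
        for _ in range(m):
            R = t + (T - t) * rng.uniform()          # R ~ Uniform(t, T)
            W = rng.normal(0.0, np.sqrt(R - t), size=d)
            y = x + W
            acc += f(mlp(l, M, R, y, T, f, g, rng))
            if l > 0:
                acc -= f(mlp(l - 1, M, R, y, T, f, g, rng))
        u += (T - t) * acc / m
    return u
```

With f ≡ 0 the correction terms vanish and the scheme reduces to plain Monte Carlo for the linear heat equation, which makes the role of the multilevel telescoping terms easy to see.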

Algorithms for solving high dimensional PDEs: from nonlinear Monte Carlo to machine learning

It is demonstrated to the reader that studying PDEs as well as control and variational problems in very high dimensions might very well be among the most promising new directions in mathematics and scientific computing in the near future.

Full history recursive multilevel Picard approximations for ordinary differential equations with expectations

This work shows for every δ > 0 that the proposed MLP approximation algorithm requires only a computational effort of order ε^{-(2+δ)} to achieve a root-mean-square error of size ε.

An overview on deep learning-based approximation methods for partial differential equations

An introduction to this area of research by revisiting selected mathematical results related to deep learning approximation methods for PDEs and reviewing the main ideas of their proofs is provided.

Multilevel Picard approximations for high-dimensional decoupled forward-backward stochastic differential equations

Backward stochastic differential equations (BSDEs) appear in numerous applications. Classical approximation methods suffer from the curse of dimensionality and deep learning-based approximation

Deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear partial differential equations

We prove that deep neural networks are capable of approximating solutions of semilinear Kolmogorov PDEs in the case of gradient-independent, Lipschitz-continuous nonlinearities, while the required

Overcoming the curse of dimensionality in the numerical approximation of backward stochastic differential equations

This article introduces a new Monte Carlo-type numerical approximation method for high-dimensional BSDEs and proves that it does indeed overcome the curse of dimensionality in the approximative computation of solution paths of BSDEs.

On the speed of convergence of Picard iterations of backward stochastic differential equations

It is a well-established fact in the scientific literature that Picard iterations of backward stochastic differential equations with globally Lipschitz continuous nonlinearity converge at least

Strong $L^p$-error analysis of nonlinear Monte Carlo approximations for high-dimensional semilinear partial differential equations

It is proved that the proposed MLP approximation scheme indeed overcomes the curse of dimensionality in the numerical approximation of high-dimensional semilinear PDEs with the approximation error measured in the $L^p$-sense with p ∈ (0,∞).

References

Showing 1–10 of 66 references

Generalised multilevel Picard approximations

An abstract framework is developed in which a generalised class of MLP approximation schemes can be formulated and analysed and applied to derive a computational complexity result for suitable MLP approximations for semi-linear heat equations.

Numerical simulations for full history recursive multilevel Picard approximations for systems of high-dimensional partial differential equations

The presented numerical simulation results indicate that the proposed MLP approximation scheme significantly outperforms certain deep learning based approximation methods for high-dimensional semilinear PDEs.

Overcoming the Curse of Dimensionality in the Numerical Approximation of Parabolic Partial Differential Equations with Gradient-Dependent Nonlinearities

There exists no approximation algorithm in the scientific literature which has been proven to overcome the curse of dimensionality in the case of a class of nonlinear PDEs with general time horizons and gradient-dependent nonlinearities.

Multilevel Picard iterations for solving smooth semilinear parabolic heat equations

We introduce a new family of numerical algorithms for approximating solutions of general high-dimensional semilinear parabolic partial differential equations at single space-time points. The

Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations

This paper proves in the case of semilinear heat equations with gradient-independent and globally Lipschitz continuous nonlinearities that the computational effort of a variant of the recently introduced multilevel Picard approximations grows at most polynomially both in the dimension and in the reciprocal of the required accuracy.

On Multilevel Picard Numerical Approximations for High-Dimensional Nonlinear Parabolic Partial Differential Equations and High-Dimensional Nonlinear Backward Stochastic Differential Equations

This paper tests the applicability of a family of approximation methods based on Picard approximations and multilevel Monte Carlo methods on a variety of 100-dimensional nonlinear PDEs that arise in physics and finance by means of numerical simulations presenting approximation accuracy against runtime.

On nonlinear Feynman–Kac formulas for viscosity solutions of semilinear parabolic partial differential equations

The classical Feynman–Kac identity builds a bridge between stochastic analysis and partial differential equations (PDEs) by providing stochastic representations for classical solutions of linear

Overcoming the curse of dimensionality in the approximative pricing of financial derivatives with default risks

An MLP algorithm is introduced for the approximation of solutions of semilinear Black-Scholes equations and it is proved that the computational effort of the method grows at most polynomially both in the dimension and the reciprocal of the prescribed approximation accuracy.

Overcoming the curse of dimensionality in the numerical approximation of Allen–Cahn partial differential equations via truncated full-history recursive multilevel Picard approximations

This work introduces and analyzes truncated variants of the recently introduced full-history recursive multilevel Picard approximation schemes.

A proof that rectified deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear heat equations

It is proved for the first time that, in the case of semilinear heat equations with gradient-independent nonlinearities, the numbers of parameters of the employed deep neural networks grow at most polynomially in both the PDE dimension and the reciprocal of the prescribed approximation accuracy.
...