Approximation properties of Residual Neural Networks for Kolmogorov PDEs

@article{Baggenstos2021ApproximationPO,
  title={Approximation properties of Residual Neural Networks for Kolmogorov PDEs},
  author={Jonas Baggenstos and Diyora Salimova},
  journal={arXiv preprint arXiv:2111.00215},
  year={2021}
}
In recent years, residual neural networks (ResNets), as introduced by He et al. [17], have become very popular in a large number of applications, including image classification and segmentation. They provide a new perspective on training very deep neural networks without suffering from the vanishing gradient problem. In this article we show that ResNets are able to approximate solutions of Kolmogorov partial differential equations (PDEs) with constant diffusion and possibly nonlinear drift… 
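The residual connection the abstract refers to can be sketched in a few lines. The following is an illustrative NumPy sketch of a single residual block, not the specific architecture analyzed in the paper; the two-layer width, the ReLU activation, and the zero-initialization demo are assumptions made for the example.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, b1, W2, b2):
    """One ResNet block: output = x + F(x), where F is a small two-layer MLP.
    The skip connection x + ... lets gradients bypass F, which is what
    mitigates the vanishing-gradient problem in very deep networks."""
    return x + W2 @ relu(W1 @ x + b1) + b2

# Toy example: with zero weights the residual branch F vanishes, so the
# block reduces to the identity map and stacking many blocks still
# propagates the input unchanged.
d = 4
x = np.arange(d, dtype=float)
W1 = np.zeros((8, d)); b1 = np.zeros(8)
W2 = np.zeros((d, 8)); b2 = np.zeros(d)
y = residual_block(x, W1, b1, W2, b2)
print(np.allclose(y, x))  # True: identity when the residual branch is zero
```

This identity-at-initialization property is one informal reason very deep stacks of such blocks remain trainable.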

References

Showing 1–10 of 33 references

A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients

It is revealed that DNNs do overcome the curse of dimensionality in the numerical approximation of Kolmogorov PDEs with constant diffusion and nonlinear drift coefficients.

Space-time deep neural network approximations for high-dimensional partial differential equations

The main result of this work proves, for all real numbers $a < b$, that solutions of certain Kolmogorov PDEs can be approximated by DNNs on the space-time region $[0,T]\times [a,b]^d$ without the curse of dimensionality.

Space-time error estimates for deep neural network approximations for differential equations

The main result of this article provides space-time error estimates for DNN approximations of Euler approximations of certain perturbed differential equations.

Solving the Kolmogorov PDE by Means of Deep Learning

A numerical approximation method is derived and proposed which aims to overcome both of the above-mentioned drawbacks and to deliver a numerical approximation of the Kolmogorov PDE on an entire region without suffering from the curse of dimensionality.
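For intuition on what "solving the Kolmogorov PDE" means here: by the Feynman–Kac formula, the solution of $\partial_t u = \tfrac{1}{2}\sigma^2 \Delta u + \mu \cdot \nabla u$ with $u(0,\cdot)=\varphi$ admits the stochastic representation $u(t,x) = \mathbb{E}[\varphi(X_t^x)]$, which can be estimated by Monte Carlo sampling. The sketch below is a minimal one-dimensional illustration with $\mu = 0$, $\sigma = 1$, and $\varphi(x) = x^2$ (exact solution $u(t,x) = x^2 + t$); it is not the deep-learning method of the cited paper, and the function names are my own.

```python
import numpy as np

def kolmogorov_mc(phi, x, t, sigma=1.0, n_samples=100_000, seed=0):
    """Monte Carlo estimate of u(t, x) = E[phi(x + sigma * W_t)],
    the Feynman-Kac representation of the solution of the Kolmogorov
    PDE u_t = (1/2) sigma^2 u_xx with zero drift and u(0, .) = phi."""
    rng = np.random.default_rng(seed)
    # W_t ~ N(0, t), so sample sqrt(t) * Z with Z ~ N(0, 1)
    w_t = np.sqrt(t) * rng.standard_normal(n_samples)
    return np.mean(phi(x + sigma * w_t))

# phi(x) = x^2 gives the exact solution u(t, x) = x^2 + t,
# so u(1, 1) = 2 and the estimate should land close to it.
estimate = kolmogorov_mc(lambda y: y**2, x=1.0, t=1.0)
print(abs(estimate - 2.0) < 0.05)  # True: within Monte Carlo error of 2.0
```

Such sampling evaluates $u$ at a single point $(t,x)$; the cited deep-learning method instead trains a network to approximate $u$ on an entire region at once.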

Deep neural network approximation for high-dimensional elliptic PDEs with boundary conditions

It is shown that deep neural networks are capable of representing solutions of the Poisson equation without incurring the curse of dimensionality; the proofs are based on a probabilistic representation of the solution to the Poisson equation as well as a suitable sampling method.

A Theoretical Analysis of Deep Neural Networks and Parametric PDEs

The existence of a small reduced basis is used to construct neural networks that yield approximations of the parametric solution maps in such a way that the sizes of these networks essentially only depend on the size of the reduced basis.

On the space-time expressivity of ResNets

It is shown that by increasing the number of residual blocks as well as their expressivity the solution of an arbitrary ODE can be approximated in space and time simultaneously by deep ReLU ResNets.
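The structural link behind this space-time expressivity result is that a ResNet whose blocks compute $x_{k+1} = x_k + h\, f(x_k)$ is exactly an explicit Euler discretization of the ODE $x'(t) = f(x(t))$. A minimal sketch of that correspondence, assuming a known vector field $f$ rather than a learned residual branch:

```python
import numpy as np

def resnet_euler(f, x0, t_end, n_blocks):
    """Apply n_blocks residual updates x <- x + h * f(x): each update is
    simultaneously one ResNet block and one explicit Euler step of size
    h = t_end / n_blocks for the ODE x'(t) = f(x(t))."""
    h = t_end / n_blocks
    x = x0
    for _ in range(n_blocks):
        x = x + h * f(x)
    return x

# The ODE x' = -x, x(0) = 1 has exact solution e^{-t}; increasing the
# number of residual blocks shrinks the discretization error.
approx = resnet_euler(lambda x: -x, 1.0, 1.0, 1000)
print(abs(approx - np.exp(-1.0)) < 1e-3)  # True: 1000 blocks suffice here
```

Increasing the number of blocks refines the time discretization, while widening each block lets the residual branch approximate $f$ more accurately in space, matching the two directions of expressivity in the cited result.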

Data Driven Governing Equations Approximation Using Deep Neural Networks

A proof that artificial neural networks overcome the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations

It is proved, for the first time, that ANNs do indeed overcome the curse of dimensionality in the numerical approximation of Black-Scholes PDEs.