Corpus ID: 203610298

Full error analysis for the training of deep neural networks

@article{Beck2019FullEA,
  title={Full error analysis for the training of deep neural networks},
  author={Christian Beck and A. Jentzen and B. Kuckuck},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.00121}
}
Deep learning algorithms have been applied very successfully in recent years to a range of problems out of reach for classical solution paradigms. Nevertheless, there is no completely rigorous mathematical error and convergence analysis which explains the success of deep learning algorithms. The error of a deep learning algorithm can in many situations be decomposed into three parts: the approximation error, the generalization error, and the optimization error. In this work we estimate for a…
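The three-part decomposition mentioned in the abstract can be sketched in generic notation (the symbols below are illustrative and not taken from the paper itself): writing $R$ for the risk, $f^*$ for the target function, $f_{\mathcal{H}}$ for the best approximant in the hypothesis class, $f_n$ for the empirical risk minimizer, and $\hat{f}$ for the network actually produced by training, one obtains

```latex
% Hedged sketch of the standard error decomposition (generic notation,
% not the paper's own symbols): the excess risk of the trained network
% splits into optimization, generalization, and approximation terms.
R(\hat{f}) - R(f^*)
  = \underbrace{\bigl[R(\hat{f}) - R(f_n)\bigr]}_{\text{optimization error}}
  + \underbrace{\bigl[R(f_n) - R(f_{\mathcal{H}})\bigr]}_{\text{generalization error}}
  + \underbrace{\bigl[R(f_{\mathcal{H}}) - R(f^*)\bigr]}_{\text{approximation error}}
```

Bounding each bracket separately is the usual strategy for an overall error analysis of the kind the paper pursues.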
14 Citations

- Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation
- Strong overall error analysis for the training of artificial neural networks via random initializations
- Non-convergence of stochastic gradient descent in the training of deep neural networks
- Multilevel Monte Carlo learning
- The gap between theory and practice in function approximation with deep neural networks
- Partition of unity networks: deep hp-approximation

References

SHOWING 1-10 OF 77 REFERENCES

- Space-time error estimates for deep neural network approximations for differential equations
- Deep neural network approximations for Monte Carlo algorithms
- Approximation spaces of deep neural networks
- Deep Neural Network Approximation Theory
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Non-strongly-convex smooth stochastic approximation with convergence rate O(1/n)