Corpus ID: 220364048

Weak error analysis for stochastic gradient descent optimization algorithms

@article{Bercher2020WeakEA,
  title={Weak error analysis for stochastic gradient descent optimization algorithms},
  author={Aritz Bercher and Lukas Gonon and A. Jentzen and Diyora Salimova},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.02723}
}
  • Aritz Bercher, Lukas Gonon, A. Jentzen, Diyora Salimova
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
  • Stochastic gradient descent (SGD) type optimization schemes are fundamental ingredients in a large number of machine learning based algorithms. In particular, SGD type optimization schemes are frequently employed in applications involving natural language processing, object and face recognition, fraud detection, computational advertisement, and numerical approximations of partial differential equations. In mathematical convergence results for SGD type optimization schemes there are usually two…
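
For readers unfamiliar with the scheme, the following display is a minimal sketch of a plain SGD recursion together with the two error criteria commonly studied for such iterates. The notation (learning rates \gamma_n, loss \ell, iterates \Theta_n, minimizer \vartheta, test function \varphi) is illustrative and is not taken from the paper itself.

% Illustrative sketch only; notation is assumed, not quoted from the paper.
% Plain SGD recursion driven by i.i.d. data samples X_1, X_2, ... and
% deterministic learning rates \gamma_1, \gamma_2, ... :
\[
  \Theta_{n+1} \;=\; \Theta_n \;-\; \gamma_{n+1}\,
  \nabla_{\theta}\, \ell\!\left(\Theta_n, X_{n+1}\right),
  \qquad n \in \mathbb{N}_0 .
\]
% Two error criteria frequently used to measure convergence of the
% iterates \Theta_n towards a minimizer \vartheta of the expected loss:
\[
  \underbrace{\bigl(\mathbb{E}\bigl[\lVert \Theta_n - \vartheta \rVert^{2}\bigr]\bigr)^{1/2}}_{\text{strong error}}
  \qquad \text{and} \qquad
  \underbrace{\bigl|\,\mathbb{E}\bigl[\varphi(\Theta_n)\bigr] - \varphi(\vartheta)\,\bigr|}_{\text{weak error w.r.t.\ a test function } \varphi}.
\]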
