Corpus ID: 204904238

# An Adaptive and Momental Bound Method for Stochastic Learning

@article{Ding2019AnAA,
  title={An Adaptive and Momental Bound Method for Stochastic Learning},
  author={Jianbang Ding and Xuancheng Ren and Ruixuan Luo and X. Sun},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.12249}
}
Training deep neural networks requires intricate initialization and careful selection of learning rates. The emergence of stochastic gradient optimization methods that use adaptive learning rates based on squared past gradients, e.g., AdaGrad, AdaDelta, and Adam, eases the job somewhat. However, recent studies have shown that such methods suffer from pitfalls of their own, including non-convergence issues. Alternative variants have been proposed for enhancement, such as…
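The abstract is truncated, but the title's "adaptive and momental bound" points to bounding Adam-style per-parameter step sizes with an exponential moving average. A minimal NumPy sketch of such a rule follows; the function name, state layout, and the `beta3` smoothing parameter are illustrative assumptions, not the paper's exact pseudocode:

```python
import numpy as np

def adamod_style_step(param, grad, state, lr=1e-3, beta1=0.9, beta2=0.999,
                      beta3=0.999, eps=1e-8):
    """One Adam-style step whose per-parameter step size is clipped
    by an exponential moving average of past step sizes (a sketch of
    a 'momental bound', not the paper's verbatim algorithm)."""
    state['t'] += 1
    t = state['t']
    # Standard Adam moment estimates of the gradient and squared gradient.
    state['m'] = beta1 * state['m'] + (1 - beta1) * grad
    state['v'] = beta2 * state['v'] + (1 - beta2) * grad ** 2
    m_hat = state['m'] / (1 - beta1 ** t)
    v_hat = state['v'] / (1 - beta2 ** t)
    # Adam's per-parameter step size.
    eta = lr / (np.sqrt(v_hat) + eps)
    # Momental bound: smooth step sizes over time and clip to the average.
    state['s'] = beta3 * state['s'] + (1 - beta3) * eta
    eta = np.minimum(eta, state['s'])
    return param - eta * m_hat

# Usage: the zero-initialized EMA keeps early steps tiny, a warmup-like effect.
state = {'t': 0, 'm': 0.0, 'v': 0.0, 's': 0.0}
w = np.array([1.0])
w = adamod_style_step(w, np.array([0.5]), state)
```

Because `state['s']` starts at zero, the bound grows gradually, which damps the unstable, overly large early steps that plain Adam can take.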