Corpus ID: 211096666

Exponential Step Sizes for Non-Convex Optimization

@article{Li2020ExponentialSS,
  title={Exponential Step Sizes for Non-Convex Optimization},
  author={Xiaoyu Li and Zhenxun Zhuang and Francesco Orabona},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.05273}
}
Stochastic Gradient Descent (SGD) is a popular tool in large-scale optimization of machine learning objective functions. However, its performance varies greatly depending on the choice of the step sizes. In this paper, we introduce exponential step sizes for stochastic optimization of smooth non-convex functions which satisfy the Polyak-Łojasiewicz (PL) condition. We show that, without any information on the level of noise over the stochastic gradients, these step sizes guarantee a…
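The abstract describes SGD run with an exponentially decaying step-size schedule. As a minimal illustrative sketch only: the schedule form η_t = η_0 · α^t is the generic exponential decay, and all constants below (`eta0`, `alpha`, `T`) are hypothetical defaults, not the paper's tuned choices.

```python
import numpy as np

def sgd_exponential_step(grad_fn, x0, eta0=0.1, alpha=0.999, T=1000, rng=None):
    """SGD with an exponentially decaying step size eta_t = eta0 * alpha**t.

    grad_fn(x, rng) should return a stochastic gradient at x.
    Illustrative sketch; hyperparameters are assumptions, not the paper's.
    """
    x = np.asarray(x0, dtype=float)
    rng = rng or np.random.default_rng(0)
    for t in range(T):
        eta_t = eta0 * alpha ** t      # exponential step-size schedule
        x = x - eta_t * grad_fn(x, rng)
    return x

# Usage: minimize f(x) = ||x||^2 / 2 (a PL function) from noisy gradients.
noisy_grad = lambda x, rng: x + 0.01 * rng.standard_normal(x.shape)
x_final = sgd_exponential_step(noisy_grad, np.ones(5))
```

Note that the schedule needs no estimate of the gradient-noise level, which is the adaptivity the abstract emphasizes.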
8 Citations
  • Super-Convergence with an Unstably Large Learning Rate
  • Super-Convergence with an Unstable Learning Rate
  • Shuffling Gradient-Based Methods with Momentum
  • Meta-LR-Schedule-Net: Learned LR Schedules that Scale and Generalize
  • A Theoretical Analysis of Learning with Noisily Labeled Data
  • WeMix: How to Better Utilize Data Augmentation
  • Towards Understanding Label Smoothing
