A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization
@article{Xie2019AGA,
  title   = {A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization},
  author  = {Guangzeng Xie and Luo Luo and Zhihua Zhang},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1908.08394}
}
This paper studies the lower complexity bound for the optimization problem whose objective function is the average of $n$ individual smooth convex functions. We consider algorithms that have access to the gradient and proximal oracle of each individual component. For the strongly convex case, we prove that such an algorithm cannot reach an $\varepsilon$-suboptimal point in fewer than $\Omega((n+\sqrt{\kappa n})\log(1/\varepsilon))$ iterations, where $\kappa$ is the condition number of the…
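As a rough numerical illustration (not taken from the paper), the scaling of the strongly convex lower bound $\Omega((n+\sqrt{\kappa n})\log(1/\varepsilon))$ can be evaluated directly; the constant factor `c` below is an assumption, since $\Omega$-notation hides it.

```python
import math

def lower_bound_iterations(n, kappa, eps, c=1.0):
    """Illustrative evaluation of the (n + sqrt(kappa * n)) * log(1/eps)
    scaling from the strongly convex lower bound. The prefactor c is an
    assumed placeholder; the true constant is hidden by Omega-notation."""
    return c * (n + math.sqrt(kappa * n)) * math.log(1.0 / eps)

# Example: n = 1000 components, condition number kappa = 1e4, eps = 1e-6.
# The sqrt(kappa * n) term dominates when kappa >> n, which is why
# accelerated methods matter in the ill-conditioned regime.
print(lower_bound_iterations(1000, 1e4, 1e-6))
```

This makes the trade-off in the bound concrete: for well-conditioned problems ($\kappa \lesssim n$) the $n$ term dominates and one full pass per accuracy digit is unavoidable, while for ill-conditioned problems the $\sqrt{\kappa n}$ term takes over.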
3 Citations
- Tight Lower Complexity Bounds for Strongly Convex Finite-Sum Optimization. ArXiv, 2020.
- PAGE: A Simple and Optimal Probabilistic Gradient Estimator for Nonconvex Optimization. ArXiv, 2020.