Corpus ID: 201310409

# A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization

@article{Xie2019AGA,
  title={A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization},
  author={Guangzeng Xie and Luo Luo and Zhihua Zhang},
  journal={ArXiv},
  year={2019},
  volume={abs/1908.08394}
}
#### Abstract

This paper studies the lower bound complexity for the optimization problem whose objective function is the average of $n$ individual smooth convex functions. We consider algorithms that access a gradient and proximal oracle for each individual component. For the strongly convex case, we prove that such an algorithm cannot reach an $\varepsilon$-suboptimal point in fewer than $\Omega((n+\sqrt{\kappa n})\log(1/\varepsilon))$ iterations, where $\kappa$ is the condition number of the…
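For reference, the problem setting described in the abstract can be written out as follows (a sketch using standard finite-sum notation; the smoothness constant $L$ and the optimum $x^*$ are assumed, since the truncated abstract does not define them explicitly):

```latex
\min_{x \in \mathbb{R}^d} \; f(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} f_i(x),
```

where each $f_i$ is smooth and convex, $f$ is strongly convex, and $\kappa$ denotes the condition number (conventionally $\kappa = L/\mu$ for $L$-smooth, $\mu$-strongly convex $f$). The stated lower bound then says that any algorithm restricted to component-wise gradient and proximal oracle queries needs at least $\Omega\big((n + \sqrt{\kappa n})\log(1/\varepsilon)\big)$ queries to guarantee $f(\hat{x}) - f(x^*) \le \varepsilon$.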
