Corpus ID: 201310409

A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization

@article{Xie2019AGA,
  title={A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization},
  author={Guangzeng Xie and Luo Luo and Zhihua Zhang},
  journal={ArXiv},
  year={2019},
  volume={abs/1908.08394}
}
  • Guangzeng Xie, Luo Luo, Zhihua Zhang
  • Published 2019
  • Mathematics, Computer Science
  • ArXiv
  • This paper studies the lower bound complexity for the optimization problem whose objective function is the average of $n$ individual smooth convex functions. We consider algorithms that have access to the gradient and proximal oracle of each individual component. For the strongly convex case, we prove that such an algorithm cannot reach an $\varepsilon$-suboptimal point in fewer than $\Omega((n+\sqrt{\kappa n})\log(1/\varepsilon))$ iterations, where $\kappa$ is the condition number of the objective function.
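
For reference, here is a minimal formal restatement of the setting and claim from the abstract, written out in LaTeX. The finite-sum formulation and the symbols $f_i$, $L$, $\mu$ are the standard conventions assumed here, not notation quoted from the paper itself, and the paper may define $\kappa$ slightly differently.

% assumes amsmath (for \operatorname*)
\[
  \min_{x \in \mathbb{R}^d} \; f(x) = \frac{1}{n} \sum_{i=1}^{n} f_i(x)
\]
% Each f_i is assumed L-smooth and f is mu-strongly convex,
% with condition number kappa = L / mu.
% At each iteration the algorithm picks one index i and may query
% either the gradient oracle
\[
  x \mapsto \nabla f_i(x)
\]
% or the proximal oracle
\[
  x \mapsto \operatorname{prox}_{\gamma f_i}(x)
    = \operatorname*{arg\,min}_{y} \Big\{ f_i(y) + \tfrac{1}{2\gamma} \|y - x\|^2 \Big\}.
\]
% The abstract's claim: no such algorithm reaches an
% epsilon-suboptimal point in fewer than
\[
  \Omega\!\big( (n + \sqrt{\kappa n}) \log(1/\varepsilon) \big)
\]
% oracle queries (iterations).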
    Citations

    • Tight Lower Complexity Bounds for Strongly Convex Finite-Sum Optimization
    • Recent Theoretical Advances in Non-Convex Optimization
    • PAGE: A Simple and Optimal Probabilistic Gradient Estimator for Nonconvex Optimization
