Linearly convergent away-step conditional gradient for non-strongly convex functions

@article{Beck2017LinearlyCA,
  title={Linearly convergent away-step conditional gradient for non-strongly convex functions},
  author={Amir Beck and Shimrit Shtern},
  journal={Math. Program.},
  year={2017},
  volume={164},
  pages={1--27}
}
We consider the problem of minimizing a function, which is the sum of a linear function and a composition of a strongly convex function with a linear transformation, over a compact polyhedral set. Jaggi and Lacoste-Julien [14] showed that the conditional gradient method with away steps, applied to the aforementioned problem without the additional linear term, has a linear rate of convergence that depends on the so-called pyramidal width of the feasible set. We revisit this result and provide a…
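To make the method concrete, here is a minimal sketch of the conditional gradient (Frank-Wolfe) method with away steps over a polytope given by its vertex list. This is an illustrative implementation, not the paper's exact scheme: the function name `away_step_fw`, the quadratic-objective assumption used for exact line search, and the vertex representation are all choices made for this sketch.

```python
import numpy as np

def away_step_fw(grad, hess_quad, V, w, iters=500, tol=1e-10):
    """Sketch of the away-step Frank-Wolfe method over conv(V).

    grad(x): gradient of the objective at x.
    hess_quad(d): d^T H d for an assumed quadratic objective (exact line search).
    V: array whose rows are the polytope's vertices.
    w: initial convex-combination weights; the iterate is x = w @ V.
    """
    w = np.array(w, dtype=float)
    V = np.asarray(V, dtype=float)
    for _ in range(iters):
        x = w @ V
        g = grad(x)
        scores = V @ g
        s = int(np.argmin(scores))            # Frank-Wolfe vertex
        d_fw = V[s] - x
        if -g @ d_fw < tol:                   # FW duality gap ~ 0: optimal
            break
        act = np.flatnonzero(w > 1e-12)       # atoms currently active in x
        a = act[int(np.argmax(scores[act]))]  # away vertex (worst active atom)
        d_aw = x - V[a]
        if -g @ d_fw >= -g @ d_aw:            # toward step
            d, gmax, fw_step, idx = d_fw, 1.0, True, s
        else:                                 # away step (shrink weight of atom a)
            d, gmax, fw_step, idx = d_aw, w[a] / (1.0 - w[a]), False, a
        q = hess_quad(d)
        gamma = gmax if q <= 0 else min(gmax, -(g @ d) / q)  # exact line search
        if fw_step:
            w *= (1.0 - gamma)
            w[idx] += gamma
        else:
            w *= (1.0 + gamma)
            w[idx] -= gamma
    return w @ V, w
```

For example, minimizing f(x) = ½‖x − p‖² over the unit square conv{(0,0), (1,0), (0,1), (1,1)} with p = (0.3, 0.7) recovers x = p, since p lies inside the polytope; the away steps are what let the method drop weight from vertices that no longer help, which underlies the linear-rate analyses discussed in the paper.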
This paper has 39 citations.

Referenced Papers

Publications referenced by this paper.

An affine invariant linear convergence analysis for Frank-Wolfe algorithms

  • S. Lacoste-Julien, M. Jaggi
  • NIPS 2013 Workshop on Greedy Algorithms, Frank…
  • 2014

Constrained minimization methods

  • E. Levitin, B. T. Polyak
  • USSR Computational Mathematics and Mathematical…
  • 1966

Foundations of Optimization

  • O. Güler
  • Graduate Texts in Mathematics. Springer, New York…
  • 2010
