Gradient methods for minimizing composite objective function

Yu. V. Nesterov
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and another is general but simple and its structure is known. Despite the bad properties of the sum, such problems, both in convex and nonconvex cases, can be solved with efficiency typical for the good part of the objective. For convex problems of the above structure, we consider primal and dual…
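The composite setting described above — a smooth term accessed through a black-box gradient oracle plus a simple term whose structure is known — is the setting of proximal (composite) gradient methods. As a hedged illustration only, here is a minimal sketch for the l1-regularized least-squares instance, assuming the smooth gradient is Lipschitz with a known constant L; the function names and parameters are ours for illustration, not the paper's notation.

```python
import numpy as np

def prox_l1(x, t):
    """Soft-thresholding: proximal operator of t * ||x||_1 (the 'simple' term)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, L, iters=500):
    """Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 with the composite
    gradient step x+ = prox_{lam/L}(x - grad/L), using step size 1/L."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)          # gradient of the smooth part only
        x = prox_l1(x - grad / L, lam / L)  # exact step on the simple part
    return x
```

Here L would be the largest eigenvalue of A^T A (the Lipschitz constant of the smooth gradient); the key point matching the abstract is that only the smooth part is queried through its gradient, while the nonsmooth l1 term is handled exactly through its proximal map.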
