Introductory Lectures on Convex Optimization - A Basic Course
  • Y. Nesterov
  • Computer Science
  • Applied Optimization
  • 9 April 2014
TLDR
In the mid-1980s, the seminal paper by Karmarkar opened a new epoch in nonlinear optimization; it became increasingly common for new methods to come with a complexity analysis, which was considered a better justification of their efficiency than computational experiments.
Smooth minimization of non-smooth functions
  • Y. Nesterov
  • Mathematics, Computer Science
  • Math. Program.
  • 1 May 2005
TLDR
A new approach for constructing efficient schemes for non-smooth convex optimization is proposed, based on a special smoothing technique that applies to functions with explicit max-structure and can be considered an alternative to black-box minimization.
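As a minimal illustration of the smoothing idea (not the paper's full accelerated scheme), the sketch below smooths a max-structured function f(x) = max_i (a_i^T x + b_i) with the entropy prox-function, giving the log-sum-exp surrogate, and runs a plain gradient method on it; the data A, b and the parameter mu are invented for the example.

```python
import numpy as np

def smoothed_max(x, A, b, mu):
    """Entropy smoothing of f(x) = max_i (A[i] @ x + b[i]).

    f_mu(x) = mu * log(sum_i exp((A[i] @ x + b[i]) / mu)) satisfies
    f(x) <= f_mu(x) <= f(x) + mu * log(m) and has a Lipschitz gradient
    with constant ||A||_2^2 / mu, so smooth methods apply.
    """
    z = (A @ x + b) / mu
    z_max = z.max()                        # stabilize the exponentials
    w = np.exp(z - z_max)
    value = mu * (z_max + np.log(w.sum()))
    grad = A.T @ (w / w.sum())             # convex combination of rows of A
    return value, grad

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))          # invented problem data
b = rng.standard_normal(50)
mu = 1e-2                                  # trades approximation error for smoothness

x = np.zeros(10)
L = np.linalg.norm(A, 2) ** 2 / mu         # Lipschitz constant of grad f_mu
for _ in range(500):
    _, g = smoothed_max(x, A, b, mu)
    x -= g / L                             # plain gradient step on the surrogate

print("smoothed value:", smoothed_max(x, A, b, mu)[0])
print("true max value:", (A @ x + b).max())
```

The choice of mu trades off the uniform approximation error mu*log(m) against the Lipschitz constant of the surrogate's gradient, which is the trade-off the paper exploits.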
Interior-point polynomial algorithms in convex programming
  • Y. Nesterov, A. Nemirovskii
TLDR
This book presents the first unified theory of polynomial-time interior-point methods; several of the new algorithms it describes, e.g., the projective method, have been implemented, tested on real-world problems, and found to be extremely efficient in practice.
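For flavor, here is a minimal log-barrier path-following sketch for a linear program. It is a textbook-style illustration of the interior-point idea, not the book's projective method, and the toy LP data are invented.

```python
import numpy as np

def barrier_method(c, A, b, x, t=1.0, factor=10.0, outer=8, newton_steps=50):
    """Log-barrier path following for  min c @ x  s.t.  A @ x <= b.

    Minimizes t * (c @ x) - sum(log(b - A @ x)) by damped Newton steps,
    then increases t; x must start strictly feasible (A @ x < b).
    After each centering the duality gap is at most m / t.
    """
    for _ in range(outer):
        for _ in range(newton_steps):
            s = b - A @ x                      # slacks, kept strictly positive
            grad = t * c + A.T @ (1.0 / s)
            hess = A.T @ ((1.0 / s**2)[:, None] * A)
            step = np.linalg.solve(hess, -grad)
            alpha = 1.0                        # backtrack to stay strictly feasible
            while np.any(b - A @ (x + alpha * step) <= 0):
                alpha *= 0.5
            x = x + alpha * step
            if np.dot(grad, -step) < 1e-10:    # Newton decrement small: centered
                break
        t *= factor
    return x

# invented toy LP: minimize x0 + x1 over the box [0, 1]^2
c = np.array([1.0, 1.0])
A = np.vstack([np.eye(2), -np.eye(2)])
b = np.array([1.0, 1.0, 0.0, 0.0])
x0 = np.array([0.5, 0.5])                      # strictly feasible start
print(barrier_method(c, A, b, x0))             # approaches the optimum (0, 0)
```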
Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
  • Y. Nesterov
  • Mathematics, Computer Science
  • SIAM J. Optim.
  • 24 April 2012
TLDR
Surprisingly, for certain classes of objective functions, the proposed methods for huge-scale optimization problems achieve complexity bounds better than the standard worst-case bounds for deterministic algorithms.
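A minimal sketch of the per-coordinate update the paper analyzes, here on a least-squares objective with the coordinate-wise Lipschitz constants L_i = ||A[:, i]||^2; the problem data are synthetic, and coordinates are sampled uniformly rather than with the paper's L_i-weighted distribution.

```python
import numpy as np

def random_coordinate_descent(A, b, iters=20000, seed=0):
    """Randomized coordinate descent for f(x) = 0.5 * ||A @ x - b||^2.

    Each step samples one coordinate i and moves along it with step
    1 / L_i, where L_i = ||A[:, i]||^2 bounds the curvature along i.
    Only the residual is updated, so one iteration costs O(rows),
    independent of the number of other columns.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    L = (A ** 2).sum(axis=0)               # per-coordinate Lipschitz constants
    x = np.zeros(n)
    r = A @ x - b                           # residual, maintained incrementally
    for _ in range(iters):
        i = rng.integers(n)
        g_i = A[:, i] @ r                   # partial derivative along coordinate i
        delta = -g_i / L[i]
        x[i] += delta
        r += delta * A[:, i]                # cheap residual update
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 50))          # invented dense data
b = rng.standard_normal(200)
x = random_coordinate_descent(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```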
Gradient methods for minimizing composite objective function
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and another is general but simple, with known structure.
Gradient methods for minimizing composite functions
  • Y. Nesterov
  • Mathematics, Computer Science
  • Math. Program.
  • 2013
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is general but simple, with known structure.
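The smooth-plus-simple splitting above is exactly the setting of the proximal-gradient update: the smooth term is handled through its gradient oracle, the simple term through its closed-form prox. Below is a minimal sketch of the basic (non-accelerated) variant for the l1-regularized least-squares instance, with invented data.

```python
import numpy as np

def soft_threshold(v, tau):
    """Prox of tau * ||.||_1: shrink each entry toward zero by tau."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def composite_gradient(A, b, lam, iters=500):
    """Proximal-gradient method for 0.5*||A @ x - b||^2 + lam*||x||_1.

    Gradient step on the smooth term with step 1/L, followed by the
    exact prox of the simple nonsmooth term.
    """
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((100, 40))
x_true = np.zeros(40)
x_true[:5] = 3.0                             # invented sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(100)
print(np.round(composite_gradient(A, b, lam=1.0), 2)[:8])
```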
Random Gradient-Free Minimization of Convex Functions
  • Y. Nesterov, V. Spokoiny
TLDR
New complexity bounds are proved for convex optimization methods based only on computing function values; it appears that such methods usually need at most n times more iterations than standard gradient methods, where n is the dimension of the space of variables.
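A minimal sketch of a two-point random gradient-free step of the kind analyzed in this line of work; the test function, step size, and smoothing parameter mu are illustrative choices, not the paper's tuned constants.

```python
import numpy as np

def gradient_free_step(f, x, mu, step, rng):
    """One step of random search with a two-point gradient estimate.

    g = (f(x + mu*u) - f(x)) / mu * u, with Gaussian u, is an unbiased
    estimate of the gradient of the Gaussian-smoothed objective; its
    variance grows with the dimension n, which is the source of the
    roughly n-fold iteration overhead versus true gradient methods.
    """
    u = rng.standard_normal(x.shape)
    g = (f(x + mu * u) - f(x)) / mu * u
    return x - step * g

f = lambda x: 0.5 * np.dot(x, x)            # invented smooth convex test objective
n = 20
rng = np.random.default_rng(3)
x = np.full(n, 5.0)
step = 1.0 / (4 * (n + 4))                  # conservative step for L = 1
for _ in range(5000):
    x = gradient_free_step(f, x, mu=1e-6, step=step, rng=rng)
print("f(x) after zeroth-order descent:", f(x))
```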
Cubic regularization of Newton method and its global performance
  • Y. Nesterov, B. Polyak
TLDR
This paper provides a theoretical analysis of the cubic regularization of Newton's method applied to the unconstrained minimization problem and proves general local convergence results for this scheme.
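A minimal sketch of the cubic-regularized Newton iteration: each step minimizes the second-order model plus a cubic penalty. The subproblem is solved here by bisection on the scalar r = ||h||, which assumes the regularized Hessian becomes positive definite along the bracket (the degenerate "hard case" is ignored); the quartic test function is invented.

```python
import numpy as np

def cubic_step(g, H, M, r_hi=1e6, tol=1e-10):
    """Minimize g@h + 0.5*h@H@h + (M/6)*||h||^3 by bisection on r = ||h||.

    The minimizer satisfies h = -(H + (M*r/2) I)^{-1} g with r = ||h||;
    we push the lower end of the bracket until the regularized Hessian
    is positive definite, then bisect on that scalar equation.
    """
    I = np.eye(len(g))
    h_of = lambda r: np.linalg.solve(H + 0.5 * M * r * I, -g)
    lo = 0.0
    while np.linalg.eigvalsh(H + 0.5 * M * lo * I).min() <= 0:
        lo = max(2 * lo, 1e-8)
    hi = max(r_hi, 2 * lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if np.linalg.norm(h_of(mid)) > mid else (lo, mid)
    return h_of(hi)

def cubic_newton(grad_hess, x, M, iters=50):
    """Cubic-regularized Newton iteration; M should dominate the Hessian's
    Lipschitz constant so each step decreases the true objective."""
    for _ in range(iters):
        g, H = grad_hess(x)
        if np.linalg.norm(g) < 1e-10:
            break
        x = x + cubic_step(g, H, M)
    return x

# invented nonconvex test: f(x) = 0.25*||x||^4 - 0.5*||x||^2, minimized on ||x|| = 1
def grad_hess(x):
    g = (np.dot(x, x) - 1.0) * x
    H = (np.dot(x, x) - 1.0) * np.eye(len(x)) + 2.0 * np.outer(x, x)
    return g, H

x = cubic_newton(grad_hess, np.array([0.1, -0.2]), M=10.0)
print("||x|| at the stationary point:", np.linalg.norm(x))   # close to 1
```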
Primal-dual subgradient methods for convex problems
  • Y. Nesterov
  • Computer Science, Mathematics
  • Math. Program.
  • 7 April 2009
TLDR
A new approach is proposed for constructing subgradient schemes for different types of nonsmooth problems with convex structure; the schemes are primal-dual in the sense that they always generate a feasible approximation to the optimum of an appropriately formulated dual problem.
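A minimal sketch of simple dual averaging, one scheme of this family, over a Euclidean ball; the weight sequence beta_k = gamma*sqrt(k), the choice gamma = 1, and the l1 test problem are illustrative assumptions rather than the paper's exact setup.

```python
import numpy as np

def dual_averaging(subgrad, n, radius=1.0, iters=20000, gamma=1.0):
    """Simple dual averaging over the Euclidean ball (sketch).

    All past subgradients are accumulated in z, and the next iterate is
        x_{k+1} = argmin_{||x|| <= radius}  z @ x + (beta_k / 2) * ||x||^2,
    i.e. -z / beta_k projected onto the ball, with beta_k = gamma*sqrt(k).
    The running average of the iterates carries the O(1/sqrt(k)) guarantee.
    """
    z = np.zeros(n)
    x = np.zeros(n)
    x_avg = np.zeros(n)
    for k in range(1, iters + 1):
        z += subgrad(x)
        y = -z / (gamma * np.sqrt(k))
        nrm = np.linalg.norm(y)
        x = y if nrm <= radius else y * (radius / nrm)    # projection onto the ball
        x_avg += (x - x_avg) / k                          # running average of iterates
    return x_avg

# invented nonsmooth test: minimize ||A @ x - b||_1 over the unit ball
rng = np.random.default_rng(4)
A = rng.standard_normal((30, 10))
b = A @ (0.1 * rng.standard_normal(10))     # optimum lies strictly inside the ball
subgrad = lambda x: A.T @ np.sign(A @ x - b)
x = dual_averaging(subgrad, n=10)
print("l1 residual:", np.abs(A @ x - b).sum())
```

Unlike plain subgradient descent, each iterate here depends on the whole subgradient history with equal weights, which is what makes the dual accumulation point a certifiable dual approximation.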