Adrien B. Taylor

We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex programs. Finding the worst-case performance of a black-box first-order method is formulated as an optimization problem over a set of smooth (strongly) convex …
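As a toy analogue of this idea (illustrative only; the paper's actual formulation is a semidefinite program over all smooth strongly convex functions), one can compute the worst-case contraction of fixed-step gradient descent over a restricted family, namely one-dimensional quadratics f(x) = ½λx² with curvature λ ∈ [μ, L]; the per-step contraction of |x| is |1 − hλ|, and maximizing over λ gives the worst case. All names and parameter values below are assumptions for the sketch.

```python
# Toy "performance estimation" over 1-D quadratics (illustrative, not the
# paper's SDP): for fixed-step gradient descent x+ = x - h * f'(x) applied
# to f(x) = 0.5 * lam * x**2 with lam in [mu, L], the per-step contraction
# of |x| is |1 - h * lam|; the worst case over the family is its maximum.

def worst_case_rate(h, mu, L, samples=10001):
    """Numerically maximize |1 - h*lam| over lam in [mu, L]."""
    rates = (abs(1.0 - h * (mu + (L - mu) * i / (samples - 1)))
             for i in range(samples))
    return max(rates)

mu, L = 0.1, 1.0
h = 2.0 / (mu + L)  # step that balances the two endpoints of the interval
print(worst_case_rate(h, mu, L))             # numerical worst case
print(max(abs(1 - h * mu), abs(1 - h * L)))  # closed form: (L - mu)/(L + mu)
```

For this family the maximum is always attained at an endpoint λ = μ or λ = L, which is the same structural phenomenon the paper exploits: the worst case is achieved by a simple, explicitly identifiable instance.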
We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also extend the result to a noisy variant …
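The worst-case quadratic can be observed numerically. The sketch below (parameter values are assumptions) runs steepest descent with exact line search on f(x) = ½ xᵀ diag(μ, L) x from the classical worst-case starting point x₀ = (1/μ, 1/L); the function value then contracts by exactly ((L − μ)/(L + μ))² at every iteration, matching the rate established in the paper.

```python
# Steepest descent with exact line search on f(x) = 0.5 * x^T diag(mu, L) x.
# For a quadratic, the exact minimizing step is gamma = (g.g) / (g.Ag).
# From x0 = (1/mu, 1/L), the iterates zig-zag and f contracts by exactly
# ((L - mu)/(L + mu))**2 per step.

def f(x, mu, L):
    return 0.5 * (mu * x[0] ** 2 + L * x[1] ** 2)

def exact_line_search_step(x, mu, L):
    g = (mu * x[0], L * x[1])  # gradient of f
    gamma = (g[0] ** 2 + g[1] ** 2) / (mu * g[0] ** 2 + L * g[1] ** 2)
    return (x[0] - gamma * g[0], x[1] - gamma * g[1])

mu, L = 0.1, 1.0
x = (1.0 / mu, 1.0 / L)
for _ in range(3):
    x_next = exact_line_search_step(x, mu, L)
    print(f(x_next, mu, L) / f(x, mu, L))  # equals ((L - mu)/(L + mu))**2
    x = x_next
```

Each printed ratio equals the theoretical bound, so the quadratic is a tight worst case rather than just an upper-bound witness.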
The second part of the thesis is devoted to solving PEPs for obtaining practical guarantees on the behavior of optimization schemes: PEPs can be solved exactly for a large family of first-order methods. In particular, tight guarantees can be obtained for fixed-step methods involving projected, proximal, conditional, inexact, and decentralized (sub)gradient …
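For concreteness, one member of that family is the fixed-step proximal gradient method. The minimal sketch below (the objective and step size are assumptions for illustration, not taken from the thesis) minimizes F(x) = ½(x − 3)² + |x| via x⁺ = prox_{t|·|}(x − t∇f(x)), where the prox of t|·| is soft-thresholding; the minimizer is x* = 2.

```python
# Minimal fixed-step proximal gradient (ISTA-style) sketch, one of the
# method classes covered by the tight PEP guarantees. We minimize
# F(x) = 0.5 * (x - 3)**2 + |x|, whose minimizer is x* = 2.

def soft_threshold(v, t):
    """Proximal operator of t * |x| (soft-thresholding)."""
    return max(abs(v) - t, 0.0) * (1.0 if v >= 0 else -1.0)

def proximal_gradient(x0, t=1.0, iters=50):
    x = x0
    for _ in range(iters):
        grad = x - 3.0                        # gradient of the smooth part
        x = soft_threshold(x - t * grad, t)   # prox step on the |x| part
    return x

print(proximal_gradient(10.0))  # converges to the minimizer x* = 2
```

The optimality condition 0 ∈ (x − 3) + ∂|x| gives x* = 2, which the iteration reaches; the PEP machinery described in the thesis yields worst-case rates for exactly such fixed-step schemes over whole function classes.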