Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence

@article{Pilanci2017NewtonSA,
  title={Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence},
  author={Mert Pilanci and M. Wainwright},
  journal={SIAM J. Optim.},
  year={2017},
  volume={27},
  pages={205-245}
}
  • Mert Pilanci, M. Wainwright
  • Published 2017
  • Mathematics, Computer Science
  • SIAM J. Optim.
  • We propose a randomized second-order method for optimization known as the Newton Sketch: it is based on performing an approximate Newton step using a randomly projected or sub-sampled Hessian. For self-concordant functions, we prove that the algorithm has super-linear convergence with exponentially high probability, with convergence and complexity guarantees that are independent of condition numbers and related problem-dependent quantities. Given a suitable initialization, similar guarantees…
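
To make the sketched Newton step described in the abstract concrete, below is a minimal NumPy illustration for unconstrained least squares, where the Hessian square root is simply the data matrix. The dense Gaussian sketch, the sketch size, and the exact quadratic line search are illustrative assumptions rather than the paper's exact prescription, which also treats other sketch families (e.g., sub-sampling and fast randomized transforms) and uses a backtracking line search with self-concordance-based guarantees.

import numpy as np

def newton_sketch_least_squares(A, b, sketch_size, iters=20, seed=0):
    # Illustrative sketch (not the paper's exact algorithm) for
    # f(x) = 0.5 * ||A x - b||^2, whose Hessian square root is A itself.
    # Each iteration draws a fresh Gaussian sketch S, forms the sketched
    # Hessian (S A)^T (S A), and takes an approximate Newton step.
    # sketch_size should be at least A.shape[1] so the sketched Hessian
    # is invertible.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                       # exact gradient, O(n d)
        S = rng.standard_normal((sketch_size, n)) / np.sqrt(sketch_size)
        SA = S @ A                                     # sketched Hessian square root
        H_sketch = SA.T @ SA                           # d x d approximate Hessian
        p = np.linalg.solve(H_sketch, grad)            # approximate Newton direction
        Ap = A @ p
        x = x - (grad @ p) / (Ap @ Ap) * p             # exact line search step for the quadratic
    return x

# Usage: the sketched iterates should approach the exact least-squares solution;
# the gap shrinks with more iterations and larger sketch sizes.
rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 50))
b = rng.standard_normal(2000)
x_ns = newton_sketch_least_squares(A, b, sketch_size=400)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x_ns - x_ls))
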
    132 Citations
    • Randomized sketch descent methods for non-separable linearly constrained optimization
    • Generalized Hessian approximations via Stein's lemma for constrained minimization (M. Erdogdu; 2017 Information Theory and Applications Workshop)
    • Iterative Hessian Sketch: Fast and Accurate Solution Approximation for Constrained Least-Squares
    • Newton-type methods for non-convex optimization under inexact Hessian information
    • Fast and Furious Convergence: Stochastic Second Order Methods under Interpolation
    • Globally Convergent Newton Methods for Ill-conditioned Generalized Self-concordant Losses
    • Sub-Sampled Newton Methods I: Globally Convergent Algorithms
    • A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization

    References

    SHOWING 1-10 OF 48 REFERENCES
    • Iterative Hessian Sketch: Fast and Accurate Solution Approximation for Constrained Least-Squares
    • Randomized sketches of convex programs with sharp guarantees
    • A Stochastic Quasi-Newton Method for Large-Scale Optimization
    • Truncated-Newton algorithms for large-scale unconstrained optimization
    • Interior-point polynomial algorithms in convex programming
    • Faster least squares approximation
    • A Stochastic Quasi-Newton Method for Online Convex Optimization
    • A sparse Johnson-Lindenstrauss transform
    • An Interior-Point Method for Large-Scale $\ell_1$-Regularized Least Squares