Corpus ID: 211505855

Fast Linear Convergence of Randomized BFGS

@article{Kovalev2020FastLC,
  title={Fast Linear Convergence of Randomized BFGS},
  author={Dmitry Kovalev and Robert M. Gower and Peter Richt{\'a}rik and Alexander Rogozin},
  journal={arXiv preprint arXiv:2002.11337},
  year={2020}
}
Since the late 1950s, when quasi-Newton methods first appeared, they have become one of the most widely used and efficient algorithmic paradigms for unconstrained optimization. Despite their immense practical success, there is little theory that shows why these methods are so efficient. We provide a semi-local rate of convergence for the randomized BFGS method which can be significantly better than that of gradient descent, finally giving theoretical evidence supporting the superior empirical performance of the method.
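The randomized BFGS method referenced in the abstract takes a quasi-Newton step with the current inverse-Hessian estimate and then refreshes that estimate with a randomly sketched BFGS update using second-order information at the new iterate. Below is a minimal NumPy sketch of such a scheme on a strongly convex quadratic, assuming the sketch-and-project form of the update known from the randomized quasi-Newton literature; the Gaussian sketch, the sketch size tau, the scaled-identity initialization of H, and all function and variable names are illustrative assumptions, not the authors' implementation.

import numpy as np

def rbfgs_quadratic(A, b, x0, tau=5, iters=200, seed=0):
    """Minimize f(x) = 0.5 * x^T A x - b^T x with a randomized BFGS scheme.

    A   : (d, d) symmetric positive definite Hessian
    tau : sketch size, i.e. number of random directions drawn per step
    """
    rng = np.random.default_rng(seed)
    d = A.shape[0]
    I = np.eye(d)
    x = x0.astype(float)
    # Scaled-identity initialization keeps the early steps stable
    # (the rate result is semi-local, so a reasonable H0 matters).
    H = I / np.linalg.norm(A, 2)
    for _ in range(iters):
        grad = A @ x - b                   # gradient of the quadratic
        x = x - H @ grad                   # quasi-Newton step
        S = rng.standard_normal((d, tau))  # Gaussian sketch matrix
        # T = S (S^T A S)^{-1} S^T, a sketched approximation of A^{-1}.
        T = S @ np.linalg.solve(S.T @ A @ S, S.T)
        E = I - T @ A
        H = T + E @ H @ E.T                # sketched (block) BFGS update
    return x

A quick check on a random well-conditioned quadratic, again purely illustrative:

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    d = 50
    M = rng.standard_normal((d, d))
    A = M @ M.T + d * np.eye(d)           # symmetric positive definite
    b = rng.standard_normal(d)
    x = rbfgs_quadratic(A, b, np.zeros(d))
    print(np.linalg.norm(A @ x - b))      # residual should be near zero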
5 Citations

Faster Explicit Superlinear Convergence for Greedy and Random Quasi-Newton Methods
Asynchronous Parallel Stochastic Quasi-Newton Methods
Stochastic second-order optimization for over-parameterized machine learning models
