Limited-Memory BFGS with Displacement Aggregation
@article{Berahas2019LimitedMemoryBW,
  title   = {Limited-Memory BFGS with Displacement Aggregation},
  author  = {Albert S. Berahas and Frank E. Curtis and Baoyu Zhou},
  journal = {arXiv: Optimization and Control},
  year    = {2019}
}
A displacement aggregation strategy is proposed for the curvature pairs stored in a limited-memory BFGS method such that the resulting (inverse) Hessian approximations are equal to those that would be derived from a full-memory BFGS method. This means that, if a sufficiently large number of pairs are stored, then an optimization algorithm employing the limited-memory method can achieve the same theoretical convergence properties as when full-memory (inverse) Hessian approximations are stored…
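For context, the sketch below shows the standard limited-memory BFGS two-loop recursion, which applies an inverse Hessian approximation built from a stored list of curvature pairs (s, y) to a gradient. This is a minimal illustration in Python/NumPy with hypothetical names (lbfgs_direction, pairs); it does not reproduce the paper's displacement aggregation, which concerns how the stored pairs themselves are modified so that the limited-memory approximation matches the full-memory BFGS one.

```python
import numpy as np

def lbfgs_direction(grad, pairs):
    """Two-loop recursion: return -H * grad, where H is the limited-memory
    inverse Hessian approximation built from the stored curvature pairs.

    pairs: list of (s, y) tuples, oldest first, where s is an iterate
    displacement and y is the corresponding gradient difference.
    (Sketch only; the paper's aggregation strategy would operate on how
    these pairs are stored, which is not shown here.)
    """
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in pairs]
    alphas = []

    # First loop: newest pair to oldest.
    for (s, y), rho in zip(reversed(pairs), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        alphas.append(alpha)

    # Initial scaling of the inverse Hessian (a common choice).
    s_last, y_last = pairs[-1]
    gamma = np.dot(s_last, y_last) / np.dot(y_last, y_last)
    r = gamma * q

    # Second loop: oldest pair to newest.
    for (s, y), rho, alpha in zip(pairs, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s

    return -r
```

In a full-memory BFGS method every pair would be retained; a limited-memory method keeps only the most recent few, which is where the proposed aggregation of displacements comes in.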