# Quasi-Newton Methods for Machine Learning: Forget the Past, Just Sample

@inproceedings{Berahas2019QuasiNewtonMF, title={Quasi-Newton Methods for Machine Learning: Forget the Past, Just Sample}, author={A. Berahas and Majid Jahani and Peter Richt{\'a}rik and Martin Tak{\'a}{\v{c}}}, year={2019} }

We present two sampled quasi-Newton methods for solving empirical risk minimization problems that arise in machine learning: sampled LBFGS and sampled LSR1. In contrast to the classical variants of these methods, which sequentially build Hessian or inverse-Hessian approximations as the optimization progresses, our proposed methods sample points randomly around the current iterate at every iteration to produce these approximations. As a result, the approximations constructed make use of more…
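The sampling idea in the abstract can be sketched in a few lines: at every iterate, draw random displacements around the current point, form curvature pairs from gradient differences at those sampled points, and feed the pairs into the standard L-BFGS two-loop recursion. This is a minimal illustrative sketch, not the paper's implementation; the function names, defaults, and the toy quadratic are assumptions made here for demonstration.

```python
import numpy as np

def sampled_pairs(grad, w, m=5, radius=1e-2, rng=None):
    """Sketch of the sampling idea: build m curvature pairs (s_i, y_i)
    from random points around the current iterate w, rather than reusing
    pairs accumulated over past iterates.  Names and defaults here are
    illustrative, not from the paper."""
    rng = np.random.default_rng() if rng is None else rng
    g = grad(w)
    S, Y = [], []
    for _ in range(m):
        s = radius * rng.standard_normal(w.shape)      # random displacement
        y = grad(w + s) - g                            # curvature via gradient difference
        if s @ y > 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
            S.append(s); Y.append(y)                   # keep only positive-curvature pairs
    return S, Y

def two_loop(g, S, Y):
    """Standard L-BFGS two-loop recursion: returns H @ g, where H is the
    inverse-Hessian approximation defined by the pairs (oldest first)."""
    q = g.astype(float).copy()
    rhos = [1.0 / (s @ y) for s, y in zip(S, Y)]
    alphas = []
    for s, y, rho in zip(reversed(S), reversed(Y), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if S:                                              # initial scaling gamma * I
        q *= (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])
    for s, y, rho, a in zip(S, Y, rhos, reversed(alphas)):
        q += (a - rho * (y @ q)) * s
    return q

# Toy usage on a strongly convex quadratic f(w) = 0.5 w^T A w,
# with Armijo backtracking for a robust step size.
A = np.diag([1.0, 10.0, 100.0])
f = lambda w: 0.5 * w @ A @ w
grad = lambda w: A @ w

w = np.ones(3)
for _ in range(30):
    S, Y = sampled_pairs(grad, w, rng=np.random.default_rng(0))
    g = grad(w)
    d = -two_loop(g, S, Y)                             # quasi-Newton direction
    t = 1.0
    while f(w + t * d) > f(w) + 1e-4 * t * (g @ d):    # Armijo condition
        t *= 0.5
    w = w + t * d

print(f(w))  # objective decreases from f(ones) = 55.5
```

Because the pairs are regenerated fresh around each iterate, the approximation reflects curvature local to the current point; by contrast, classical L-BFGS may carry pairs from iterates far from the current one.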

