Quasi-Newton Methods for Machine Learning: Forget the Past, Just Sample

@inproceedings{Berahas2019QuasiNewtonMF,
  title={Quasi-Newton Methods for Machine Learning: Forget the Past, Just Sample},
  author={A. Berahas and Majid Jahani and Peter Richt{\'a}rik and Martin Tak{\'a}{\v{c}}},
  year={2019}
}
We present two sampled quasi-Newton methods (sampled LBFGS and sampled LSR1) for solving empirical risk minimization problems that arise in machine learning. Contrary to the classical variants of these methods that sequentially build Hessian or inverse Hessian approximations as the optimization progresses, our proposed methods sample points randomly around the current iterate at every iteration to produce these approximations. As a result, the approximations constructed make use of more…
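To make the sampling idea concrete, the following is a minimal Python sketch (not the authors' implementation) of how curvature pairs gathered around the current iterate, rather than from past iterates, could feed a standard L-BFGS two-loop recursion. The function name `sampled_lbfgs_direction`, the sampling radius `sigma`, and the use of gradient differences in place of Hessian-vector products are illustrative assumptions.

```python
import numpy as np

def sampled_lbfgs_direction(w, grad_fn, m=10, sigma=1e-2, rng=None):
    """Sketch of a sampled-LBFGS step: curvature pairs are built from points
    sampled around the current iterate w instead of from the optimization
    history. grad_fn(w) must return the gradient of the objective at w."""
    rng = np.random.default_rng() if rng is None else rng
    g = grad_fn(w)
    d = w.size

    # Sample m curvature pairs around the current iterate.
    S, Y = [], []
    for _ in range(m):
        s = sigma * rng.standard_normal(d)      # random displacement
        y = grad_fn(w + s) - g                  # approximates a Hessian-vector product
        # Keep only pairs with sufficient positive curvature.
        if s @ y > 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
            S.append(s)
            Y.append(y)

    # Standard L-BFGS two-loop recursion using the sampled pairs.
    q = g.copy()
    rhos = [1.0 / (s @ y) for s, y in zip(S, Y)]
    alphas = []
    for s, y, rho in zip(reversed(S), reversed(Y), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if S:
        # Scale the initial Hessian approximation with the last sampled pair.
        gamma = (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])
        q *= gamma
    for (s, y, rho), a in zip(zip(S, Y, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s

    return -q  # approximate -H^{-1} g search direction
```

The key difference from classical L-BFGS is only in how S and Y are produced: they are resampled from scratch at every iteration, so no memory of past iterates is required.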
