Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates

@article{Ghanbari2018ProximalQM,
  title={Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates},
  author={Hiva Ghanbari and Katya Scheinberg},
  journal={Computational Optimization and Applications},
  year={2018},
  volume={69},
  pages={597--627}
}
A general, inexact, efficient proximal quasi-Newton algorithm for composite optimization problems was proposed by Scheinberg and Tang [Math. Program., 160 (2016), pp. 495-529], and a sublinear global convergence rate was established for it. In this paper, we analyze the global convergence rate of this method, in both the exact and inexact settings, when the objective function is strongly convex. We also investigate a practical variant of this method by establishing a simple…
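For orientation, the composite setting the abstract refers to is minimizing F(x) = f(x) + g(x) with f smooth and g a simple regularizer. The sketch below is not the paper's algorithm; it is the basic scalar-step proximal gradient method for the L1-regularized least-squares instance, which proximal quasi-Newton methods refine by replacing the scalar step 1/L with a quasi-Newton model of f's curvature. All function names here are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Closed-form prox operator of t*||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient_lasso(A, b, lam, steps=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient.

    Illustrative only: a proximal quasi-Newton method would replace the
    fixed step 1/L with a (diagonal or low-rank) Hessian approximation.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

With A = I the iteration reduces to a single shrinkage of b, which makes the prox step easy to check by hand.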

