Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems

@article{Davis2017ProximallyGS,
  title={Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems},
  author={Damek Davis and Benjamin Grimmer},
  journal={CoRR},
  year={2017},
  volume={abs/1707.03505}
}
In this paper we introduce a stochastic projected subgradient method for weakly convex (i.e., uniformly prox-regular) nonsmooth, nonconvex functions, a wide class of functions that includes the additive and convex composite classes. At a high level, the method is an inexact proximal point iteration in which the strongly convex proximal subproblems are quickly solved with a specialized stochastic projected subgradient method. The primary contribution of this paper is a simple proof that the…
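To make the two-loop structure described above concrete, the following is a minimal Python sketch (not the authors' exact algorithm): an inexact proximal point outer loop whose strongly convex proximal subproblems are approximately solved by a stochastic projected subgradient inner loop with iterate averaging. The oracle names (stoch_subgrad, project), the step-size schedule, the averaging weights, and all parameter values are illustrative assumptions rather than the paper's prescriptions.

    import numpy as np

    def proximally_guided_subgradient(stoch_subgrad, project, x0, rho=1.0,
                                      outer_iters=50, inner_iters=200):
        # Outer loop: inexact proximal point steps.  Each step approximately solves
        #   min_x  f(x) + (rho/2) * ||x - x_center||^2,
        # which is strongly convex whenever rho exceeds the weak-convexity modulus of f.
        x_center = np.asarray(x0, dtype=float)
        for _ in range(outer_iters):
            y = x_center.copy()
            y_avg = np.zeros_like(y)
            weight_sum = 0.0
            for j in range(1, inner_iters + 1):
                # Stochastic subgradient of the proximal subproblem at y.
                g = stoch_subgrad(y) + rho * (y - x_center)
                step = 2.0 / (rho * (j + 1))   # assumed schedule for a rho-strongly convex objective
                y = project(y - step * g)
                # Running weighted average of the projected iterates (weights proportional to j).
                y_avg = (weight_sum * y_avg + j * y) / (weight_sum + j)
                weight_sum += j
            x_center = y_avg                   # inexact proximal point update
        return x_center

    # Toy usage (assumption): f(x) = |x^2 - 1| is nonsmooth, nonconvex, and 2-weakly
    # convex, so rho > 2 makes each subproblem strongly convex; project onto [-2, 2].
    rng = np.random.default_rng(0)
    f_subgrad = lambda x: 2 * x * np.sign(x * x - 1) + 0.1 * rng.standard_normal(x.shape)
    box_proj = lambda x: np.clip(x, -2.0, 2.0)
    x_hat = proximally_guided_subgradient(f_subgrad, box_proj, x0=np.array([0.3]), rho=3.0)

The key design point reflected here is that the proximal regularization parameter must exceed the weak-convexity modulus of the objective so that each subproblem is strongly convex; the sketch simply fixes it as a constant supplied by the caller.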

