A Sparsity Preserving Stochastic Gradient Method for Composite Optimization

@inproceedings{Lin2011ASP,
  title={A Sparsity Preserving Stochastic Gradient Method for Composite Optimization},
  author={Qihang Lin and Xi Chen and Javier Pe{\~n}a},
  year={2011}
}
We propose new stochastic gradient algorithms for solving convex composite optimization problems. In each iteration, our algorithms utilize a stochastic oracle of the gradient of the smooth component in the objective function. Our algorithms are based on a stochastic version of the estimate sequence technique introduced by Nesterov (Introductory Lectures on Convex Optimization: A Basic Course, Kluwer, 2003). We establish convergence results for the expectation and variance as well as large…
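To make the composite setting concrete, here is a minimal sketch of a generic proximal stochastic gradient iteration for min_x f(x) + λ‖x‖₁, where a stochastic oracle supplies an unbiased estimate of ∇f(x) and the ℓ₁ proximal step (soft-thresholding) preserves sparsity of the iterates. This is a plain illustration of the problem class, not the paper's accelerated estimate-sequence method; the names `prox_sgd`, `soft_threshold`, and `grad_oracle` are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrinks each coordinate toward zero,
    # setting coordinates with |v_i| <= t exactly to zero (sparsity preserving).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_sgd(grad_oracle, x0, lam, step, n_iters, rng):
    # Generic proximal stochastic gradient method for
    #   min_x f(x) + lam * ||x||_1,
    # where grad_oracle(x, rng) returns an unbiased stochastic
    # estimate of grad f(x). Illustrative sketch only.
    x = x0.copy()
    for _ in range(n_iters):
        g = grad_oracle(x, rng)
        # Gradient step on the smooth part, then proximal (shrinkage) step
        # on the nonsmooth l1 part.
        x = soft_threshold(x - step * g, step * lam)
    return x
```

For example, with the exact gradient of f(x) = ½‖x − b‖² as a degenerate "stochastic" oracle, the iterates converge to the soft-thresholded solution soft_threshold(b, λ), whose small coordinates are exactly zero.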

References

Publications referenced by this paper.

  • Yurii Nesterov. Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic Publishers, 2003.

  • Paul Tseng. On Accelerated Proximal Gradient Methods for Convex-Concave Optimization. Submitted to SIAM Journal on Optimization, 2008.
