Optimal Regularized Dual Averaging Methods for Stochastic Optimization

@inproceedings{Chen2012OptimalRD,
  title={Optimal Regularized Dual Averaging Methods for Stochastic Optimization},
  author={Xi Chen and Qihang Lin and Javier Pe{\~n}a},
  booktitle={NIPS},
  year={2012}
}
This paper considers a wide spectrum of regularized stochastic optimization problems in which both the loss function and the regularizer can be non-smooth. We develop a novel algorithm based on the regularized dual averaging (RDA) method that simultaneously achieves the optimal convergence rates for both convex and strongly convex losses. In particular, for strongly convex loss, it achieves the optimal rate of O(1/N + 1/N^2) for N iterations, which improves the O(log N / N) rate of previous…
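For context, the regularized dual averaging baseline that the paper builds on keeps a running average of stochastic subgradients and minimizes it together with the regularizer and a growing proximal term. The sketch below is a minimal l1-regularized RDA loop in the spirit of the standard (non-accelerated) method, not the paper's optimal variant; the function name `rda_l1`, the step-size constant `gamma`, and the sqrt(t) proximal schedule are illustrative assumptions.

```python
import numpy as np

def rda_l1(grad_fn, x0, lam=0.1, gamma=1.0, n_iters=100, seed=0):
    """Minimal sketch of l1-regularized dual averaging (standard RDA baseline).

    Each step solves
        x_{t+1} = argmin_x { <g_bar_t, x> + lam*||x||_1 + (beta_t / t) * ||x||^2 / 2 }
    with beta_t = gamma * sqrt(t), which has the closed-form
    soft-thresholding solution used below.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    g_bar = np.zeros_like(x)
    for t in range(1, n_iters + 1):
        g = grad_fn(x, rng)            # stochastic subgradient at x_t
        g_bar += (g - g_bar) / t       # running average of all subgradients
        beta = gamma * np.sqrt(t)      # coefficient of the proximal term
        # Closed-form minimizer: soft-threshold the averaged subgradient.
        x = -(t / beta) * np.sign(g_bar) * np.maximum(np.abs(g_bar) - lam, 0.0)
    return x
```

For example, minimizing a noisy quadratic E[0.5*||x - c||^2] + lam*||x||_1 with this loop drives coordinates of c smaller than lam toward exactly zero, the sparsity-inducing behavior that motivates dual averaging over plain stochastic subgradient descent.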