Fast-and-Light Stochastic ADMM

Shuai Zheng and James T. Kwok
The alternating direction method of multipliers (ADMM) is a powerful optimization solver in machine learning. Recently, stochastic ADMM has been integrated with variance reduction techniques for the stochastic gradient, leading to SAG-ADMM and SDCA-ADMM, which have fast convergence rates and low iteration complexities. However, their space requirements can still be high. In this paper, we propose an integration of ADMM with the stochastic variance reduced gradient (SVRG) method. Unlike another recent…
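The variance-reduction idea at the heart of SVRG can be illustrated independently of the paper. The sketch below, assuming a least-squares loss, shows the SVRG gradient estimator: a per-sample gradient at the current iterate, corrected by the same sample's gradient at a stored snapshot plus the full snapshot gradient. All function names are hypothetical; this is not the authors' implementation, only a minimal illustration of the estimator that the paper plugs into stochastic ADMM.

```python
import numpy as np

def full_gradient(w, X, y):
    # Gradient of the least-squares loss (1/2n) * ||Xw - y||^2.
    return X.T @ (X @ w - y) / len(y)

def svrg_gradient(w, w_snapshot, g_snapshot, X, y, i):
    # Variance-reduced gradient estimate for sample i:
    #   grad_i(w) - grad_i(w_snapshot) + full_gradient(w_snapshot).
    # Its expectation over i equals the true gradient at w, but its
    # variance shrinks as w approaches w_snapshot, so no decaying
    # step size is needed.
    xi, yi = X[i], y[i]
    g_i = xi * (xi @ w - yi)                 # per-sample gradient at w
    g_i_snap = xi * (xi @ w_snapshot - yi)   # same sample at snapshot
    return g_i - g_i_snap + g_snapshot
```

A key practical point, and the reason for the paper's "light" claim, is that SVRG stores only the snapshot iterate and one full gradient, rather than a table of per-sample gradients as in SAG-ADMM.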