Make Workers Work Harder: Decoupled Asynchronous Proximal Stochastic Gradient Descent

@article{Li2016MakeWW,
  title={Make Workers Work Harder: Decoupled Asynchronous Proximal Stochastic Gradient Descent},
  author={Yitan Li and Linli Xu and Xiaowei Zhong and Qing Ling},
  journal={CoRR},
  year={2016},
  volume={abs/1605.06619}
}
Asynchronous parallel optimization algorithms for solving large-scale machine learning problems have drawn significant attention from academia to industry recently. This paper proposes a novel algorithm, decoupled asynchronous proximal stochastic gradient descent (DAP-SGD), to minimize an objective function that is the composite of the average of multiple empirical losses and a regularization term. Unlike the traditional asynchronous proximal stochastic gradient descent (TAP-SGD) in which the…
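To make the setting concrete, below is a minimal serial sketch of proximal stochastic gradient descent for a composite objective of the form (1/n) Σ_i f_i(w) + λ‖w‖₁, assuming a least-squares loss and an ℓ1 regularizer whose proximal operator is soft-thresholding. This illustrates the kind of update the paper studies in an asynchronous setting; it is not the authors' DAP-SGD scheme, and the function and parameter names are illustrative.

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_sgd(X, y, lam=0.01, eta=0.1, epochs=5, seed=0):
    # Minimizes (1/n) * sum_i 0.5 * (x_i^T w - y_i)^2 + lam * ||w||_1
    # with the classic two-step update: gradient step on the loss,
    # then proximal step on the regularizer.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            grad = (X[i] @ w - y[i]) * X[i]                 # stochastic gradient of f_i
            w = soft_threshold(w - eta * grad, eta * lam)   # prox of the l1 term
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    w_true = np.zeros(10)
    w_true[:3] = [1.0, -2.0, 0.5]
    y = X @ w_true + 0.01 * rng.normal(size=200)
    print(proximal_sgd(X, y))

In an asynchronous parallel variant, the question is which of these two steps (gradient computation vs. proximal update) runs on the workers and which on the master; the paper's "decoupled" design revisits that split relative to TAP-SGD.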