On the convergence properties of a K-step averaging stochastic gradient descent algorithm for nonconvex optimization

@inproceedings{Zhou2018OnTC,
  title={On the convergence properties of a K-step averaging stochastic gradient descent algorithm for nonconvex optimization},
  author={Fan Zhou and Guojing Cong},
  booktitle={IJCAI},
  year={2018}
}
Despite their popularity, asynchronous stochastic gradient descent (ASGD) methods do not perform as well in practice on large-scale machine learning problems as theoretical results indicate. We adopt and analyze a synchronous K-step averaging stochastic gradient descent algorithm, which we call K-AVG. We establish convergence results for K-AVG on nonconvex objectives and show that it scales much better than ASGD. In addition, we explain why the K-step delay is necessary…
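
The abstract describes K-AVG only at a high level: several learners each take K local SGD steps from a shared iterate, after which their parameters are averaged synchronously. Below is a minimal NumPy sketch of that scheme under those assumptions; the function names, the sequentially simulated learners, and the toy objective are illustrative, not the authors' reference implementation.

import numpy as np

def k_avg_sgd(grad_fn, x0, num_workers=4, K=8, num_rounds=100,
              lr=0.05, seed=0):
    """Sketch of K-step averaging SGD (K-AVG).

    Each of `num_workers` learners runs K local SGD steps starting
    from the shared iterate; the learners' parameters are then
    averaged synchronously and the next round begins.
    `grad_fn(w, rng)` returns a stochastic gradient at w.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(num_rounds):
        local_params = []
        for _ in range(num_workers):      # one pass per learner (simulated serially)
            w = x.copy()
            for _ in range(K):            # K local SGD steps before communicating
                w -= lr * grad_fn(w, rng)
            local_params.append(w)
        x = np.mean(local_params, axis=0)  # synchronous K-step average
    return x

if __name__ == "__main__":
    # Toy usage: noisy quadratic with minimizer at the origin (illustrative only).
    grad = lambda w, rng: 2.0 * w + 0.1 * rng.standard_normal(w.shape)
    print(k_avg_sgd(grad, x0=np.ones(3)))

With K = 1 this reduces to fully synchronous parallel SGD with averaging after every step; larger K reduces communication at the cost of letting the local iterates drift apart between averages, which is the trade-off the paper's analysis addresses.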
