Optimal Algorithms for Non-Smooth Distributed Optimization in Networks

@inproceedings{Scaman2018OptimalAF,
  title={Optimal Algorithms for Non-Smooth Distributed Optimization in Networks},
  author={Kevin Scaman and Francis Bach and S{\'e}bastien Bubeck and Yin Tat Lee and Laurent Massouli{\'e}},
  booktitle={NeurIPS},
  year={2018}
}
In this work, we consider the distributed optimization of non-smooth convex functions using a network of computing units. We investigate this problem under two regularity assumptions: (1) the Lipschitz continuity of the global objective function, and (2) the Lipschitz continuity of local individual functions. Under the local regularity assumption, we provide the first optimal first-order decentralized algorithm, called multi-step primal-dual (MSPD), and its corresponding optimal convergence rate.
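
As a brief sketch of the setting described in the abstract (the symbols n, f_i, and \theta below are assumed notation for the number of computing units, the local functions, and the shared decision variable, not taken verbatim from the paper), the objective is the average of the local convex functions held by the nodes of the network:

\min_{\theta \in \mathbb{R}^d} \; \bar{f}(\theta) = \frac{1}{n} \sum_{i=1}^{n} f_i(\theta),

where assumption (1) takes the global objective \bar{f} to be Lipschitz continuous (global regularity), while assumption (2) takes each local f_i to be Lipschitz continuous (local regularity); MSPD is the algorithm shown to be optimal under the latter assumption.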
