Avoiding Synchronization in First-Order Methods for Sparse Convex Optimization

@inproceedings{Devarakonda2018AvoidingSI,
  title={Avoiding Synchronization in First-Order Methods for Sparse Convex Optimization},
  author={Aditya Devarakonda and K. Fountoulakis and J. Demmel and Michael W. Mahoney},
  booktitle={2018 IEEE International Parallel and Distributed Processing Symposium (IPDPS)},
  year={2018},
  pages={409--418}
}
Parallel computing has played an important role in speeding up convex optimization methods for big data analytics and large-scale machine learning (ML). However, the scalability of these optimization methods is inhibited by the cost of communicating and synchronizing processors in a parallel setting. Iterative ML methods are particularly sensitive to communication cost since they often require communication every iteration. In this work, we extend well-known techniques from Communication…
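The truncated sentence points at communication-avoiding ("s-step") reformulations of iterative methods. As a hedged illustration of why grouping iterations reduces synchronization, the following is a minimal single-process sketch, not the paper's algorithm: coordinate descent on least squares in which one simulated communication "round" fetches the Gram block and residual products for s coordinates, after which the s exact coordinate updates proceed locally. All names and the round-counting convention here are illustrative assumptions.

```python
import numpy as np

def cd_sstep(A, b, n_rounds, s, rng):
    """Coordinate descent on f(x) = 0.5 * ||Ax - b||^2.

    Plain coordinate descent needs a fresh residual for every update,
    which in a distributed row-partitioned setting maps to roughly one
    communication (e.g. an allreduce) per update.  Here each "round"
    samples s coordinates J, fetches their Gram block G = A_J^T A_J and
    residual products v = A_J^T r together, then performs the s exact
    coordinate updates with no further communication -- so s updates
    cost one round instead of s.
    """
    n = A.shape[1]
    x = np.zeros(n)
    r = A @ x - b            # residual, refreshed once per round
    comm_rounds = 0
    for _ in range(n_rounds):
        J = rng.choice(n, size=s, replace=False)
        AJ = A[:, J]
        G = AJ.T @ AJ        # one round fetches the Gram block ...
        v = AJ.T @ r         # ... and the residual products together
        comm_rounds += 1
        d = np.zeros(s)
        for k in range(s):   # s local updates, no communication
            # gradient at coordinate J[k], corrected for the k earlier
            # in-round updates via the Gram block
            g = v[k] + G[k, :k] @ d[:k]
            d[k] = -g / G[k, k]   # exact 1-D minimization
        x[J] += d
        r += AJ @ d
    return x, comm_rounds

# Consistent overdetermined system, so the least-squares solution is xstar.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
xstar = rng.standard_normal(50)
b = A @ xstar
x, rounds = cd_sstep(A, b, n_rounds=500, s=4, rng=rng)
```

Because each round applies s = 4 coordinate updates, the 2000 updates above cost 500 simulated rounds rather than 2000; the trade-off, as in s-step Krylov methods, is the extra O(s^2) Gram-block work and communication volume per round.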
