Adding vs. Averaging in Distributed Primal-Dual Optimization

Abstract

Distributed optimization methods for large-scale machine learning suffer from a communication bottleneck. It is difficult to reduce this bottleneck while still efficiently and accurately aggregating partial work from different machines. In this paper, we present a novel generalization of the recent communication-efficient primal-dual framework (COCOA) for distributed optimization. Our framework, COCOA+, allows for additive combination of local updates to the global parameters at each iteration, whereas previous schemes only allow conservative averaging. We give stronger (primal-dual) convergence rate guarantees for both COCOA and our new variants, and generalize the theory for both methods to cover non-smooth convex loss functions. We provide an extensive experimental comparison that shows the markedly improved performance of COCOA+ on several real-world distributed datasets, especially when scaling up the number of machines.
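As a rough illustration of the distinction the abstract draws between averaging and adding, the following minimal Python sketch combines per-machine updates under either scheme. The function and variable names (aggregate, local_deltas, gamma) are illustrative assumptions, not the paper's actual implementation or notation.

```python
import numpy as np

def aggregate(local_deltas, scheme="add", gamma=1.0):
    """Combine per-machine parameter updates into one global update.

    local_deltas : list of K update vectors, one per machine
    scheme       : "average" scales the summed updates by 1/K (conservative),
                   "add" applies them with an aggregation weight gamma.
    """
    K = len(local_deltas)
    total = np.sum(local_deltas, axis=0)
    if scheme == "average":
        return total / K   # averaging: safe, but per-round progress shrinks as K grows
    return gamma * total   # adding: larger steps, paired with suitably scaled local subproblems

# Toy usage: 4 machines each propose an update to a 3-dimensional parameter vector.
w = np.zeros(3)
deltas = [np.array([0.1, -0.2, 0.05]) for _ in range(4)]
w += aggregate(deltas, scheme="add", gamma=1.0)
```

In this sketch the averaging branch divides the summed updates by the number of machines, while the additive branch applies them at full (or gamma-weighted) strength; the paper's contribution is showing when and how the additive combination can be made safe and provably fast.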
