Publications
Federated Optimization in Heterogeneous Networks
We propose FedProx, a federated optimization algorithm that addresses the challenges of heterogeneity in federated networks.
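A minimal sketch of a FedProx-style local update, assuming the core idea that each device approximately minimizes its local loss plus a proximal term anchoring it to the current global model; the function name, step sizes, and the local_grad callable are illustrative and not the authors' reference implementation.

    import numpy as np

    def fedprox_local_update(w_global, local_grad, mu=0.1, lr=0.01, num_steps=10):
        # Approximately minimize F_k(w) + (mu / 2) * ||w - w_global||^2 by gradient descent.
        # local_grad(w) is an assumed callable returning the gradient of the device's
        # local loss F_k at w; mu controls how strongly the local model is anchored
        # to the current global model w_global.
        w = np.array(w_global, dtype=float).copy()
        for _ in range(num_steps):
            g = local_grad(w) + mu * (w - w_global)  # gradient of the proximal surrogate
            w = w - lr * g
        return w

Setting mu = 0 recovers a plain FedAvg-style local update; larger mu limits how far heterogeneous devices can drift from the global model.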
LEAF: A Benchmark for Federated Settings
We present LEAF, a modular benchmarking framework geared towards learning in massively distributed federated networks of remote devices.
Federated Learning: Challenges, Methods, and Future Directions
Federated learning involves training statistical models over remote devices or siloed data centers, such as mobile phones or hospitals, while keeping data localized.
Federated Multi-Task Learning
We show that multi-task learning is naturally suited to handle the statistical challenges of the federated setting, and propose a novel systems-aware optimization method, MOCHA, that is robust to practical systems issues.
Communication-Efficient Distributed Dual Coordinate Ascent
In this paper, we propose a communication-efficient framework, CoCoA, that uses local computation in a primal-dual setting to dramatically reduce the amount of necessary communication.
Adding vs. Averaging in Distributed Primal-Dual Optimization
In this paper, we present a novel generalization of the recent communication-efficient primal-dual framework (CoCoA) for distributed optimization.
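As a rough illustration of the adding-vs-averaging distinction, a simplified sketch (not the paper's algorithm; the paper derives when each aggregation choice is safe via suitably scaled local subproblems):

    import numpy as np

    def aggregate(w, deltas, mode="average"):
        # Combine per-machine updates `deltas` (a list of arrays) into the shared model w.
        if mode == "average":
            return w + np.mean(deltas, axis=0)  # averaging partial updates (CoCoA-style)
        return w + np.sum(deltas, axis=0)       # adding updates (CoCoA+-style)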
Fair Resource Allocation in Federated Learning
We propose q-Fair Federated Learning (q-FFL), a novel optimization objective inspired by fair resource allocation in wireless networks that encourages a more fair (specifically, a more uniform) accuracy distribution across devices in federated networks.
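As a sketch of the objective (notation illustrative): for m devices with local losses F_k and weights p_k, q-FFL minimizes

    \min_w \; f_q(w) = \sum_{k=1}^{m} \frac{p_k}{q+1} F_k^{q+1}(w)

where larger values of q place more weight on devices with higher loss, pushing toward a more uniform accuracy distribution, and q = 0 recovers the standard weighted-average federated objective.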
MLI: An API for Distributed Machine Learning
We present MLI, an Application Programming Interface for building scalable machine learning algorithms in a distributed setting based on data-centric computing.
CoCoA: A General Framework for Communication-Efficient Distributed Optimization
We present a general-purpose framework for distributed computing environments, CoCoA, that has an efficient communication scheme and is applicable to a wide variety of problems in machine learning and signal processing.
On the Convergence of Federated Optimization in Heterogeneous Networks
We propose FedProx, which is similar in spirit to FedAvg but more amenable to theoretical analysis.