Corpus ID: 219980318

Exact Support Recovery in Federated Regression with One-shot Communication

@article{Barik2020ExactSR,
  title={Exact Support Recovery in Federated Regression with One-shot Communication},
  author={Adarsh Barik and Jean Honorio},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.12583}
}
Federated learning provides a framework for addressing the challenges of distributed computation, data ownership, and privacy across a large number of distributed clients with low computational and communication capabilities. In this paper, we study the problem of learning the exact support of sparse linear regression in the federated learning setting. We provide a simple, communication-efficient algorithm that needs only one-shot communication with the centralized server to compute the exact support. Our…
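Since the abstract is truncated, here is a minimal sketch of one plausible one-shot scheme of this kind: each client fits a local Lasso and sends only its estimated support to the server, which aggregates by majority vote. The Lasso penalty, threshold, and voting rule are illustrative assumptions, not necessarily the authors' exact estimator.

```python
import numpy as np
from sklearn.linear_model import Lasso

def client_support(X, y, alpha=0.1, tol=1e-6):
    """Client side: fit a local Lasso and report only the estimated
    support -- a single one-shot message to the server."""
    coef = Lasso(alpha=alpha).fit(X, y).coef_
    return np.abs(coef) > tol  # boolean support indicator

def server_aggregate(supports):
    """Server side: combine the one-shot client messages with a simple
    majority vote over clients (an illustrative aggregation rule)."""
    votes = np.mean(np.stack(supports), axis=0)
    return np.where(votes >= 0.5)[0]

# Toy run: 10 clients sharing a sparse ground-truth regressor.
rng = np.random.default_rng(0)
d, true_support = 50, [3, 17, 29]
w = np.zeros(d)
w[true_support] = 1.0
supports = []
for _ in range(10):
    X = rng.standard_normal((100, d))  # each client's private data
    y = X @ w + 0.1 * rng.standard_normal(100)
    supports.append(client_support(X, y))
print(server_aggregate(supports))  # expected: [ 3 17 29]
```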


Citations

Federated Myopic Community Detection with One-shot Communication
This paper provides an efficient algorithm, which computes a consensus signed weighted graph from client evidence, recovers the underlying network structure at the central server, and provides a novel Cheeger-type inequality for general signed weighted graphs.
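For reference, the classical Cheeger inequality for an unsigned graph, relating the conductance $h_G$ to the second-smallest eigenvalue $\lambda_2$ of the normalized Laplacian, is shown below; the paper's contribution is a generalization of this type of bound to general signed weighted graphs, whose exact form is not reproduced here.

```latex
\frac{\lambda_2}{2} \;\leq\; h_G \;\leq\; \sqrt{2\lambda_2}
```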

References

Showing 1-10 of 17 references.
Federated Learning: Strategies for Improving Communication Efficiency
Two ways to reduce uplink communication costs are proposed: structured updates, where the user directly learns an update from a restricted space parametrized by a smaller number of variables (e.g., low-rank or a random mask), and sketched updates, where the user learns a full model update and then compresses it using a combination of quantization, random rotations, and subsampling.
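A hedged sketch of the sketched-updates idea, combining random subsampling with one-bit (sign) quantization; the random-rotation step from the paper is omitted for brevity, and the shared-scale dequantization is an illustrative choice:

```python
import numpy as np

def sketch_update(update, keep_frac=0.1, seed=0):
    """Compress a model update before the uplink: keep a random subset
    of coordinates, then quantize each kept value to its sign plus one
    shared scale (a crude one-bit quantizer, for illustration)."""
    rng = np.random.default_rng(seed)
    d = update.size
    idx = rng.choice(d, size=max(1, int(keep_frac * d)), replace=False)
    vals = update[idx]
    scale = np.abs(vals).mean()   # one float shared by all kept entries
    return idx, np.sign(vals), scale

def unsketch(idx, signs, scale, d):
    """Server side: rebuild a sparse, quantized estimate of the update;
    dividing by keep_frac would make the subsampling unbiased."""
    est = np.zeros(d)
    est[idx] = signs * scale
    return est

u = np.random.default_rng(1).standard_normal(1000)
idx, signs, scale = sketch_update(u)
u_hat = unsketch(idx, signs, scale, u.size)  # ~90% fewer floats uplinked
```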
Agnostic Federated Learning
This work proposes a new framework of agnostic federated learning, in which the centralized model is optimized for any target distribution formed by a mixture of the client distributions, and shows that this framework naturally yields a notion of fairness.
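In symbols, the agnostic objective optimizes the model against the worst mixture of the $p$ client distributions (a standard statement of the minimax formulation; the notation here is a common choice rather than a quotation from the paper):

```latex
\min_{h \in \mathcal{H}} \; \max_{\lambda \in \Lambda \subseteq \Delta_p} \; \sum_{k=1}^{p} \lambda_k \, \mathcal{L}_k(h)
```

where $\mathcal{L}_k$ is client $k$'s expected loss and $\Delta_p$ the probability simplex.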
Federated Optimization: Distributed Optimization Beyond the Datacenter
We introduce a new and increasingly relevant setting for distributed optimization in machine learning, where the data defining the optimization are distributed (unevenly) over an extremely large number of nodes.
Federated Multi-Task Learning
This work shows that multi-task learning is naturally suited to handle the statistical challenges of this setting, and proposes a novel systems-aware optimization method, MOCHA, that is robust to practical systems issues.
COLA: Decentralized Linear Learning
This work proposes COLA, a new decentralized training algorithm with strong theoretical guarantees and superior practical performance that achieves communication efficiency, scalability, elasticity, and resilience to changes in data, and allows for unreliable and heterogeneous participating devices.
Analyzing Federated Learning through an Adversarial Lens
This work explores the threat of model poisoning attacks on federated learning initiated by a single, non-colluding malicious agent whose adversarial objective is to cause the model to misclassify a set of chosen inputs with high confidence.
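The core of such an attack can be sketched as explicit boosting: the lone malicious agent scales its update so that it survives the server's averaging step. The code below is a minimal illustration of that idea only; the paper's full attack adds stealth constraints on top of it.

```python
import numpy as np

def boosted_malicious_update(w_global, w_target, num_clients):
    """A single malicious agent 'boosts' its update by the number of
    clients so that, after the server averages all updates, the global
    model is pulled toward w_target (illustrative core idea only)."""
    return num_clients * (w_target - w_global)
```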
Communication-Efficient Learning of Deep Networks from Decentralized Data
This work presents a practical method for the federated learning of deep networks based on iterative model averaging, and conducts an extensive empirical evaluation considering five different model architectures and four datasets.
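A minimal sketch of one round of iterative model averaging in this style: the server broadcasts the global model, each client runs a few local gradient steps, and the server averages the returned models weighted by local data size. The least-squares loss and hyperparameters are illustrative assumptions.

```python
import numpy as np

def local_train(w, X, y, lr=0.01, epochs=5):
    """Client side: a few epochs of full-batch gradient descent on the
    local data (least-squares loss, purely for illustration)."""
    w = w.copy()
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def averaging_round(w_global, client_data):
    """Server side: collect locally trained models and average them,
    weighted by each client's number of examples."""
    models = [local_train(w_global, X, y) for X, y in client_data]
    sizes = np.array([len(y) for _, y in client_data], dtype=float)
    weights = sizes / sizes.sum()
    return sum(a * m for a, m in zip(weights, models))
```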
CoCoA: A General Framework for Communication-Efficient Distributed Optimization
This work presents CoCoA, a general-purpose framework for distributed computing environments with an efficient communication scheme that is applicable to a wide variety of problems in machine learning and signal processing, and extends the framework to cover general non-strongly-convex regularizers, including L1-regularized problems like lasso.
Fair Resource Allocation in Federated Learning
This work proposes q-Fair Federated Learning (q-FFL), a novel optimization objective inspired by fair resource allocation in wireless networks that encourages a more fair accuracy distribution across devices in federated networks.
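The q-FFL objective reweights clients by the magnitude of their losses; with $F_k$ device $k$'s local objective, $p_k$ its relative sample fraction, and $q \ge 0$ a fairness knob (as commonly stated for this method), it reads:

```latex
\min_{w} \; f_q(w) \;=\; \sum_{k=1}^{m} \frac{p_k}{q+1} \, F_k(w)^{\,q+1}
```

Setting $q = 0$ recovers the usual weighted average of client losses, while larger $q$ places more weight on devices with higher loss, flattening the accuracy distribution.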
Federated Learning: Challenges, Methods, and Future Directions
The unique characteristics and challenges of federated learning are discussed, a broad overview of current approaches is provided, and several directions of future work relevant to a wide range of research communities are outlined.