Corpus ID: 236470180

Federated Multi-Task Learning under a Mixture of Distributions

@article{Marfoq2021FederatedML,
  title={Federated Multi-Task Learning under a Mixture of Distributions},
  author={Othmane Marfoq and Giovanni Neglia and Aur{\'e}lien Bellet and Laetitia Kameni and Richard Vidal},
  journal={ArXiv},
  year={2021},
  volume={abs/2108.10252}
}
The increasing size of data generated by smartphones and IoT devices motivated the development of Federated Learning (FL), a framework for on-device collaborative training of machine learning models. First efforts in FL focused on learning a single global model with good average performance across clients, but the global model may be arbitrarily bad for a given client, due to the inherent heterogeneity of local data distributions. Federated multi-task learning (MTL) approaches can learn… 
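
The central modeling assumption named in the title can be sketched compactly; since the excerpt above is truncated, the notation below (M components, client-specific weights) is ours, not the paper's: each client t's local data distribution is assumed to be a mixture of a small number of shared underlying distributions,

% Mixture assumption (notation ours): client t's distribution is a
% mixture of M shared components with client-specific weights.
\mathcal{D}_t \;=\; \sum_{m=1}^{M} \pi_{tm}\, \tilde{\mathcal{D}}_m,
\qquad \pi_{tm} \ge 0, \quad \sum_{m=1}^{M} \pi_{tm} = 1 .

Under such an assumption, the component models can be trained collaboratively while each client keeps only its mixture weights personal, giving a natural multi-task structure.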

Citations

Subspace Learning for Personalized Federated Optimization

TLDR
This work addresses personalized federated optimization through the lens of ensemble learning, constructing a low-loss subspace continuum whose two endpoints (the global model and the local model) form a high-accuracy ensemble.
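
One plausible reading of this construction, sketched under the assumption that the subspace is the line segment between the two endpoint weight vectors (the summary above does not spell this out):

% Assumed form of the low-loss continuum between the endpoints:
w(\alpha) \;=\; (1-\alpha)\, w_{\mathrm{local}} \;+\; \alpha\, w_{\mathrm{global}},
\qquad \alpha \in [0, 1],

with the ensemble formed from models along this segment, so that every point of the continuum is trained to have low loss.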

An Empirical Study of Personalized Federated Learning

TLDR
The benchmark tool FedBench is open-sourced, allowing researchers to conduct experimental studies under various settings; the study shows that there is no single champion method, that large data heterogeneity often leads to highly accurate predictions, and that standard federated learning methods (e.g., FedAvg) with fine-tuning often outperform personalized federated learning methods.

Handling Data Heterogeneity in Federated Learning via Knowledge Fusion

TLDR
Theoretical analysis and extensive experiments demonstrate that FedKF, by realizing a global-local knowledge fusion process, simultaneously achieves high model performance, high fairness, and privacy preservation.

FedMSplit: Correlation-Adaptive Federated Multi-Task Learning across Multimodal Split Networks

TLDR
The FedMSplit framework is proposed, which allows federated training over multimodal distributed data without assuming that all clients have the same set of active sensors; the key idea is to employ a dynamic, multi-view graph structure to adaptively capture correlations amongst multimodal client models.

Federated learning with incremental clustering for heterogeneous data

TLDR
This work proposes FLIC (Federated Learning with Incremental Clustering), in which the server exploits the updates clients send during federated training instead of requiring them to send their parameters simultaneously, and empirically demonstrates that FLIC is a robust defense against poisoning attacks.

Self-Aware Personalized Federated Learning

TLDR
This work develops a self-aware personalized FL method where each client can automatically balance the training of its local personal model and the global model that implicitly contributes to other clients’ training.

Differentially Private Federated Learning on Heterogeneous Data

TLDR
This paper focuses on the challenging setting where users communicate with an “honest-but-curious” server without any trusted intermediary, and demonstrates the superiority of DP-SCAFFOLD over the state-of-the-art algorithm DP-FedAvg as the number of local updates and the level of heterogeneity grow.

Addressing Client Drift in Federated Continual Learning with Adaptive Optimization

TLDR
This work outlines a framework for performing Federated Continual Learning (FCL) using NetTailor as a candidate continual learning approach, illustrates the extent of the client-drift problem, and shows that adaptive federated optimization can reduce its adverse impact.

Federated Learning from Small Datasets

TLDR
This work proposes a novel approach that intertwines model aggregations with permutations of local models, resulting in more efficient training in data-sparse domains; it enables training on extremely small local datasets, such as patient data across hospitals, while retaining the training efficiency and privacy benefits of federated learning.
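
As a rough illustration of the aggregation/permutation interleaving described above (a minimal sketch; the function and round structure are assumptions, not the paper's exact procedure):

import random

def permute_models(client_models, rng=random):
    """Randomly reassign which client holds which model between averaging
    rounds. Each model then continues training on a different client's
    data, so it effectively sees several small local datasets before the
    next aggregation."""
    shuffled = client_models[:]
    rng.shuffle(shuffled)
    return shuffled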

Towards Method of Horizontal Federated Learning: A Survey

TLDR
This survey describes the characteristics of different categories of federated learning in detail so that horizontal federated learning can be clearly distinguished, and summarizes classical optimization methods for horizontal federated models together with the problems they face.

References

SHOWING 1-10 OF 75 REFERENCES

Agnostic Federated Learning

TLDR
This work proposes a new framework of agnostic federated learning, where the centralized model is optimized for any target distribution formed by a mixture of the client distributions, and shows that this framework naturally yields a notion of fairness.
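
The agnostic objective can be sketched as a minimax problem over mixtures of the p client distributions (notation ours):

% Agnostic/minimax objective over a set \Lambda of mixture weights:
\min_{w} \; \max_{\lambda \in \Lambda} \; \sum_{k=1}^{p} \lambda_k \, \mathcal{L}_k(w),

where \mathcal{L}_k is client k's expected loss and \Lambda is a subset of the probability simplex; optimizing against the worst-case mixture is what yields the fairness interpretation mentioned above.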

Personalized Federated Learning: A Meta-Learning Approach

TLDR
A personalized variant of the well-known Federated Averaging algorithm is studied, and its performance is characterized by the closeness of the underlying distributions of user data, measured in terms of distribution distances such as Total Variation and the 1-Wasserstein metric.
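
The meta-learning view admits a compact MAML-style sketch: find an initialization that performs well after each user takes one local gradient step (the step size \alpha and local losses f_i are notation chosen here):

% MAML-style personalization objective over n users (notation ours):
\min_{w} \; \frac{1}{n} \sum_{i=1}^{n} f_i\bigl( w - \alpha \nabla f_i(w) \bigr),

so the shared model is explicitly optimized to be a good starting point for per-user fine-tuning.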

Clustered Federated Learning: Model-Agnostic Distributed Multitask Optimization Under Privacy Constraints

TLDR
This work presents clustered federated learning (CFL), a novel federated multi-task learning (FMTL) framework that exploits geometric properties of the FL loss surface to group the client population into clusters with jointly trainable data distributions, and that comes with strong mathematical guarantees on the clustering quality.
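
One way to make the clustering step concrete is a cosine-similarity bipartition of client updates. The greedy sketch below is a stand-in for CFL's bipartitioning step, not the paper's exact criterion (function names are ours):

import numpy as np

def cosine_similarity_matrix(updates):
    """Pairwise cosine similarities between flattened client updates."""
    U = np.stack([u / (np.linalg.norm(u) + 1e-12) for u in updates])
    return U @ U.T

def bipartition(updates):
    """Greedy split: seed two groups with the least-similar pair of
    clients, then assign each remaining client to the seed it resembles
    more. CFL recurses on such splits to discover the cluster structure."""
    S = cosine_similarity_matrix(updates)
    i, j = np.unravel_index(np.argmin(S), S.shape)  # least similar pair
    g1, g2 = [i], [j]
    for k in range(len(updates)):
        if k in (i, j):
            continue
        (g1 if S[k, i] >= S[k, j] else g2).append(k)
    return g1, g2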

Variational Federated Multi-Task Learning

TLDR
In VIRTUAL, the federated network of the server and the clients is treated as a star-shaped Bayesian network, learning is performed on this network using approximate variational inference, and the method is shown to be effective on real-world federated datasets.

Improving Federated Learning Personalization via Model Agnostic Meta Learning

TLDR
This work points out that the setting of Model Agnostic Meta Learning (MAML), where one optimizes for a fast, gradient-based, few-shot adaptation to a heterogeneous distribution of tasks, has a number of similarities with the objective of personalization for FL.

FedBN: Federated Learning on Non-IID Features via Local Batch Normalization

TLDR
This work proposes FedBN, an effective method that uses local batch normalization to alleviate feature shift before averaging models; in extensive experiments, FedBN outperforms both classical FedAvg and the state-of-the-art method for non-IID data (FedProx).
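
The aggregation rule is simple to sketch: average everything except the batch-normalization parameters, which stay local. A minimal version, assuming state dicts keyed by parameter name and a naive name-based BN check (both assumptions about the model code):

import copy

def fedbn_average(client_states):
    """FedAvg-style aggregation that skips BatchNorm entries (FedBN idea).
    BN statistics and affine parameters remain client-specific; all other
    parameters are averaged into a shared global state."""
    global_state = copy.deepcopy(client_states[0])
    for name in global_state:
        if "bn" in name.lower() or "batchnorm" in name.lower():
            continue  # keep BN layers local to each client
        global_state[name] = sum(s[name] for s in client_states) / len(client_states)
    return global_state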

Personalized Federated Learning with First Order Model Optimization

TLDR
This work efficiently computes optimal weighted model combinations for each client, based on how much a client can benefit from another's model, to achieve personalization in FL.
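
A sketch of the first-order weighting this summary describes, under the assumption that each downloaded model is scored by its validation-loss improvement per unit of parameter distance (names and the exact normalization are ours):

import numpy as np

def combination_weights(my_params, my_val_loss, received, val_loss_fn):
    """Score each received parameter vector by how much it lowers this
    client's validation loss relative to how far away it is; clip negative
    scores and normalize. `val_loss_fn` evaluates parameters on the local
    validation set. Returns None when no neighbor helps, so the caller
    falls back to its own model."""
    scores = []
    for theta in received:
        gain = my_val_loss - val_loss_fn(theta)
        dist = np.linalg.norm(theta - my_params) + 1e-12
        scores.append(max(0.0, gain / dist))
    total = sum(scores)
    return [s / total for s in scores] if total > 0 else None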

FedU: A Unified Framework for Federated Multi-Task Learning with Laplacian Regularization

TLDR
This work formulates a new FMTL problem, FedU, using Laplacian regularization, which can explicitly leverage relationships among clients for multi-task learning, and shows that FedU outperforms vanilla FedAvg, MOCHA, as well as pFedMe and Per-FedAvg in personalized federated learning.
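
The Laplacian-regularized objective can be sketched as follows (notation ours):

% Laplacian-regularized FMTL objective over N clients (notation ours):
\min_{w_1, \dots, w_N} \; \sum_{k=1}^{N} f_k(w_k)
  \;+\; \frac{\eta}{2} \sum_{k=1}^{N} \sum_{l=1}^{N} a_{kl}\, \lVert w_k - w_l \rVert^2 ,

where a_{kl} \ge 0 encodes how related clients k and l are, and \eta trades off local fit against cross-client smoothness.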

Personalized Federated Learning using Hypernetworks

TLDR
Since hypernetworks share information across clients, pFedHN is shown to generalize better to new clients whose distributions differ from any client observed during training, and it decouples communication cost from trainable model size.
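
A minimal sketch of the hypernetwork idea (the class, dimensions, and architecture below are assumptions; the paper's design differs in detail):

import torch
import torch.nn as nn

class HyperNet(nn.Module):
    """Server-side hypernetwork: a trainable embedding per client is
    mapped by a shared MLP to the full parameter vector of a small target
    model. Only embeddings are client-specific, which is why communication
    cost is decoupled from the target model's size."""
    def __init__(self, n_clients, embed_dim, target_numel, hidden=128):
        super().__init__()
        self.embeddings = nn.Embedding(n_clients, embed_dim)
        self.mlp = nn.Sequential(
            nn.Linear(embed_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, target_numel),
        )

    def forward(self, client_id):
        e = self.embeddings(torch.tensor([client_id]))
        return self.mlp(e).squeeze(0)  # flat parameters for this client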

On the Convergence of Federated Optimization in Heterogeneous Networks

TLDR
This work proposes FedProx, which is similar in spirit to FedAvg but more amenable to theoretical analysis, and characterizes the convergence of FedProx under a novel device similarity assumption.
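
The local subproblem that distinguishes FedProx from FedAvg can be written as a proximal objective (notation ours):

% FedProx local subproblem at round t for device k (notation ours):
\min_{w} \; h_k(w; w^t) \;=\; f_k(w) \;+\; \frac{\mu}{2}\, \lVert w - w^t \rVert^2 ,

where w^t is the current global model and \mu \ge 0 limits how far local updates drift; \mu = 0 recovers FedAvg's local step.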
...