Federated Learning from Small Datasets
@article{Kamp2021FederatedLF, title={Federated Learning from Small Datasets}, author={Michael Kamp and Jonas Fischer and Jilles Vreeken}, journal={ArXiv}, year={2021}, volume={abs/2110.03469} }
Federated learning allows multiple parties to collaboratively train a joint model without sharing local data. This enables applications of machine learning in settings of inherently distributed, undisclosable data such as in the medical domain. In practice, joint training is usually achieved by aggregating local models, for which local training objectives have to be in expectation similar to the joint (global) objective. Often, however, local datasets are so small that local objectives differ…
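To make the aggregation step concrete, here is a minimal sketch of the standard weighted model averaging (FedAvg-style) that such joint training typically builds on; the parameter names and toy values are hypothetical, and this illustrates the generic aggregation baseline, not the method proposed in this paper.

```python
import numpy as np

def federated_average(local_models, client_sizes):
    """FedAvg-style aggregation: average each parameter across clients,
    weighted by the size of each client's local dataset."""
    total = sum(client_sizes)
    return {
        name: sum((n / total) * model[name]
                  for model, n in zip(local_models, client_sizes))
        for name in local_models[0]
    }

# Toy usage with two clients, each holding a single 2x2 weight matrix.
clients = [{"w": np.ones((2, 2))}, {"w": np.zeros((2, 2))}]
print(federated_average(clients, client_sizes=[30, 10])["w"])  # 0.75 everywhere
```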
One Citation
Federated and Meta learning over Non-Wireless and Wireless Networks: A Tutorial
- Computer Science, ArXiv
- 2022
This tutorial conducts a comprehensive review of FL, meta learning, and federated meta learning (FedMeta), examining how each can be designed, optimized, and evolved over non-wireless and wireless networks.
References
SHOWING 1-10 OF 44 REFERENCES
Federated Multi-Task Learning under a Mixture of Distributions
- Computer Science, NeurIPS
- 2021
This work proposes to study federated MTL under the flexible assumption that each local data distribution is a mixture of unknown underlying distributions, which encompasses most existing personalized FL approaches and leads to federated EM-like algorithms for both client-server and fully decentralized settings.
FedBN: Federated Learning on Non-IID Features via Local Batch Normalization
- Computer Science, ICLR
- 2021
This work proposes an effective method, called FedBN, that uses local batch normalization to alleviate feature shift before averaging models; in extensive experiments it outperforms both classical FedAvg and the state-of-the-art method for non-iid data (FedProx).
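As an illustration of the FedBN idea (not the authors' implementation), the sketch below averages shared parameters but leaves batch-normalization parameters local; it assumes models are dictionaries keyed by parameter name and that batch-norm parameters are identifiable by a "bn" substring, which is a hypothetical naming convention.

```python
import numpy as np

def fedbn_average(local_models, client_sizes):
    """FedBN-style aggregation sketch: average shared parameters across
    clients, but keep batch-norm parameters ('bn' in the name) local."""
    total = sum(client_sizes)
    shared = {
        name: sum((n / total) * m[name] for m, n in zip(local_models, client_sizes))
        for name in local_models[0] if "bn" not in name
    }
    # Each client receives the averaged shared weights plus its own BN state.
    return [{name: shared.get(name, m[name]) for name in m} for m in local_models]
```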
FLOP: Federated Learning on Medical Datasets using Partial Networks
- Computer Science, KDD
- 2021
This work proposes a simple yet effective algorithm, Federated Learning on Medical Datasets using Partial Networks (FLOP), that shares only a partial model between the server and clients, allowing different hospitals to collaboratively and effectively train a partially shared model without sharing local patients' data.
Robust Federated Learning: The Case of Affine Distribution Shifts
- Computer Science, NeurIPS
- 2020
This paper considers a structured affine distribution shift in users' data that captures the device-dependent data heterogeneity in federated settings and proposes a Federated Learning framework Robust to Affine distribution shifts (FLRA) that is provably robust against affine Wasserstein shifts to the distribution of observed samples.
Differentially Private Federated Learning: A Client Level Perspective
- Computer Science, ArXiv
- 2017
The aim is to hide clients' contributions during training while balancing the trade-off between privacy loss and model performance; empirical studies suggest that, given a sufficiently large number of participating clients, the procedure can maintain client-level differential privacy at only a minor cost in model performance.
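The typical mechanism behind client-level differential privacy in this setting is to clip each client's update and add calibrated Gaussian noise to the average; below is a minimal sketch of that pattern, with the clipping norm and noise multiplier as hypothetical parameters (the paper's exact calibration differs).

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_aggregate(client_updates, clip=1.0, noise_multiplier=1.0):
    """Client-level DP aggregation sketch: clip each update to L2 norm
    `clip`, average, then add Gaussian noise scaled to the sensitivity."""
    clipped = [u * min(1.0, clip / (np.linalg.norm(u) + 1e-12))
               for u in client_updates]
    avg = np.mean(clipped, axis=0)
    sigma = noise_multiplier * clip / len(client_updates)
    return avg + rng.normal(0.0, sigma, size=avg.shape)
```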
Federated Optimization in Heterogeneous Networks
- Computer Science, MLSys
- 2020
This work introduces FedProx, a framework to tackle heterogeneity in federated networks; it provides convergence guarantees when learning over data from non-identical distributions (statistical heterogeneity) while adhering to device-level systems constraints by allowing each participating device to perform a variable amount of work.
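The core of FedProx is a proximal term added to each local objective, F_k(w) + (mu/2)||w - w_global||^2, which keeps variable amounts of local work from drifting too far from the global model. A one-step sketch, with the learning rate and mu as hypothetical values:

```python
import numpy as np

def fedprox_local_step(w, grad_fn, w_global, mu=0.01, lr=0.1):
    """One local gradient step on the FedProx objective
    F_k(w) + (mu/2) * ||w - w_global||^2."""
    g = grad_fn(w) + mu * (w - w_global)  # gradient of loss plus proximal term
    return w - lr * g

# Toy usage: quadratic local loss F_k(w) = 0.5 * ||w - target||^2.
target = np.array([1.0, -1.0])
w_next = fedprox_local_step(np.zeros(2), lambda w: w - target, np.zeros(2))
```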
Migrating Models: A Decentralized View on Federated Learning
- Computer Science, PKDD/ECML Workshops
- 2021
This work presents an approach for transforming the general FL training algorithm into a peer-to-peer-like process and shows that omitting central coordination in FL is feasible.
Robust Aggregation for Federated Learning
- Computer Science, IEEE Transactions on Signal Processing
- 2022
The experiments show that RFA is competitive with classical aggregation when the level of corruption is low while demonstrating greater robustness under high corruption; convergence of the robust federated learning algorithm is established for the stochastic learning of additive models with least squares.
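RFA replaces the weighted mean with an approximate geometric median of client updates, computed via a smoothed Weiszfeld scheme; the sketch below shows the plain, unsmoothed Weiszfeld iteration for intuition, not the paper's exact algorithm.

```python
import numpy as np

def geometric_median(updates, iters=100, eps=1e-8):
    """Approximate the geometric median of client updates with plain
    Weiszfeld iterations; `updates` is an (n_clients, dim) array."""
    z = np.mean(updates, axis=0)  # initialize at the ordinary mean
    for _ in range(iters):
        dists = np.maximum(np.linalg.norm(updates - z, axis=1), eps)
        weights = 1.0 / dists
        z = (weights[:, None] * updates).sum(axis=0) / weights.sum()
    return z
```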
Challenges, Applications and Design Aspects of Federated Learning: A Survey
- Computer Science, IEEE Access
- 2021
This paper reviews contemporary work in related areas of federated learning to identify the challenges and topics emphasized by each type of FL survey, and categorizes FL research in terms of challenges, design factors, and applications.
FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
- Computer Science, AISTATS
- 2020
This work presents FedPAQ, a communication-efficient Federated Learning method with Periodic Averaging and Quantization; it achieves near-optimal theoretical guarantees for strongly convex and non-convex loss functions and empirically demonstrates the communication-computation trade-off provided by the method.
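In FedPAQ, clients perform several local steps and send a quantized model update to the server. A QSGD-style unbiased stochastic quantizer, in the spirit of (but not necessarily identical to) the low-precision quantizer used in the paper, might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_quantize(update, levels=8):
    """QSGD-style quantizer sketch: round each coordinate of the update to
    one of `levels` uniform levels, randomly so the result is unbiased."""
    norm = np.linalg.norm(update)
    if norm == 0.0:
        return update
    scaled = np.abs(update) / norm * levels
    lower = np.floor(scaled)
    rounded = lower + (rng.random(update.shape) < (scaled - lower))
    return np.sign(update) * rounded * (norm / levels)
```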