• Corpus ID: 246015639

Variance-Reduced Heterogeneous Federated Learning via Stratified Client Selection

Guangyuan Shen, Dehong Gao, Libin Yang, Fang Zhou, Duanxiao Song, Wei Lou, Shirui Pan
Department of Cyber Science and Technology, Northwestern Polytechnical University, China; Alibaba Group, China; Department of Computing, The Hong Kong Polytechnic University, Hong Kong, China; Department of Data Science and AI, Faculty of IT, Monash University, Australia

An EMD-Based Adaptive Client Selection Algorithm for Federated Learning in Heterogeneous Data Scenarios

This paper proposes ACSFed, an adaptive client selection algorithm for federated learning under statistical heterogeneity, which helps the federated model learn the global statistical knowledge faster and thereby accelerates convergence.
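The earth mover's distance (EMD) in this line of work measures how far a client's local label distribution is from the global one. For discrete distributions over the same ordered label set, EMD reduces to the sum of absolute differences between the two cumulative histograms. A minimal sketch of that computation (function and variable names are illustrative, not from the paper):

```python
# Sketch: EMD between a client's label distribution and the global one.
# For 1-D discrete distributions over the same ordered support, EMD
# equals the sum of absolute differences between the two CDFs.
from itertools import accumulate

def emd(p, q):
    """EMD between two discrete distributions over the same label order."""
    assert len(p) == len(q)
    cdf_p = list(accumulate(p))
    cdf_q = list(accumulate(q))
    return sum(abs(a - b) for a, b in zip(cdf_p, cdf_q))

# A client holding only labels 0 and 1 vs. a uniform 4-class global mix:
client = [0.5, 0.5, 0.0, 0.0]
global_mix = [0.25, 0.25, 0.25, 0.25]
print(emd(client, global_mix))  # → 1.0
```

A larger EMD flags a more skewed client, which a selector can deprioritize or balance against complementary clients.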



Client Selection Based on Label Quantity Information for Federated Learning

A new client selection method called the grouping-based scheduling (GS) scheme is proposed: clients are divided into several groups based on a new metric, the group earth mover's distance (GEMD), which improves the performance of FL algorithms compared with the random scheduling scheme.

Budgeted Online Selection of Candidate IoT Clients to Participate in Federated Learning

This work optimizes accuracy in stateful FL with a budgeted number of candidate clients by selecting the candidates with the best test accuracy to participate in training; the proposed heuristic outperforms an online random algorithm with up to a 27% gain in accuracy.

Optimizing Federated Learning on Device Heterogeneity with A Sampling Strategy

It is shown that the number of communication rounds required in FL can be reduced by up to 52% on MNIST, 32% on CIFAR-10, and 28% on FashionMNIST compared with the Federated Averaging algorithm.

Clustered Sampling: Low-Variance and Improved Representativity for Clients Selection in Federated Learning

This work proves that model aggregation through clustered sampling consistently yields better training convergence and lower variability than standard sampling approaches, and that it is compatible with existing methods for privacy enhancement and for communication reduction through model compression.
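The idea summarized above is to group clients into clusters and draw one representative per cluster, so each round's cohort covers distinct data profiles instead of duplicating similar ones. A toy sketch, assuming clients are clustered by their dominant label (this grouping criterion is illustrative, not the paper's actual clustering method):

```python
# Sketch: clustered client sampling. Clients with the same dominant
# label form a cluster; one representative is drawn per cluster.
import random
from collections import defaultdict

def clustered_sample(client_profiles, rng):
    """Return one uniformly chosen client id per cluster."""
    clusters = defaultdict(list)
    for cid, profile in client_profiles.items():
        dominant = max(range(len(profile)), key=profile.__getitem__)
        clusters[dominant].append(cid)
    return [rng.choice(members) for members in clusters.values()]

profiles = {
    "c0": [0.9, 0.1], "c1": [0.8, 0.2],   # cluster around label 0
    "c2": [0.2, 0.8], "c3": [0.1, 0.9],   # cluster around label 1
}
cohort = clustered_sample(profiles, random.Random(0))
print(cohort)  # one client from each of the two clusters
```

Because every cluster contributes exactly one participant, the sampled cohort's aggregate is less variable across rounds than an i.i.d. draw over all clients.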

MAB-based Client Selection for Federated Learning with Uncertain Resources in Mobile Networks

A multi-armed bandit (MAB)-based client selection method is proposed to balance the exploration-exploitation trade-off and reduce the time consumption of FL in mobile networks; experiments demonstrate that the proposed scheme requires less learning time than the conventional method when resources fluctuate.

FedMCCS: Multicriteria Client Selection Model for Optimal IoT Federated Learning

FedMCCS, a multicriteria approach for client selection in federated learning, is proposed; it outperforms other approaches by reducing the number of communication rounds needed to reach the intended accuracy while discarding the fewest rounds.

Bandit-based Communication-Efficient Client Selection Strategies for Federated Learning

This paper presents UCB-CS, a bandit-based communication-efficient client selection strategy that achieves faster convergence with lower communication overhead, and demonstrates how client selection can be used to improve fairness.
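Bandit-based selection strategies like the two above treat each client as an arm whose reward (e.g., observed loss reduction or round completion speed) is uncertain, and pick the clients with the highest upper confidence bound. A generic UCB1 sketch of that selection step (the reward definition and all names are illustrative, not taken from either paper):

```python
# Sketch: UCB1-style selection of k clients. Unseen clients get an
# infinite score so every arm is explored at least once.
import math

def ucb_select(counts, means, t, k):
    """Return indices of the k clients with the largest UCB1 score."""
    scores = []
    for n, mu in zip(counts, means):
        if n == 0:
            scores.append(float("inf"))  # force initial exploration
        else:
            scores.append(mu + math.sqrt(2 * math.log(t) / n))
    ranked = sorted(range(len(scores)), key=lambda i: -scores[i])
    return ranked[:k]

# Three clients after 8 rounds: the never-selected client 1 is tried
# first, then the client with the best optimistic estimate.
print(ucb_select(counts=[5, 0, 3], means=[0.4, 0.0, 0.6], t=9, k=2))  # → [1, 2]
```

The confidence term shrinks as a client is observed more often, so the server gradually shifts from exploring untried clients to exploiting the empirically best ones.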

Client Selection for Federated Learning with Heterogeneous Resources in Mobile Edge

  • T. Nishio, R. Yonetani
  • Computer Science
    ICC 2019 - 2019 IEEE International Conference on Communications (ICC)
  • 2019
The proposed FedCS protocol solves a client selection problem with resource constraints, which allows the server to aggregate as many client updates as possible and to accelerate performance improvement in ML models.

Optimal Client Sampling for Federated Learning

A novel client subsampling scheme is proposed in which the number of clients allowed to communicate their updates back to the master node is restricted; a simple algorithm approximates the optimal formula for client participation, requiring only secure aggregation and thus not compromising client privacy.
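One common way to realize such a restricted-participation scheme is importance-weighted sampling: clients whose updates are larger in norm are more likely to be allowed to report back, and the aggregate is reweighted for unbiasedness. A hypothetical sketch of the selection step only (this is the general idea, not the paper's exact probability formula):

```python
# Sketch: sample m distinct clients with probability weighted by the
# norm of each client's pending update (weighted sampling without
# replacement via repeated roulette-wheel draws).
import random

def sample_by_norm(update_norms, m, rng):
    """Return ids of m distinct clients, norm-weighted."""
    chosen = []
    for _ in range(min(m, len(update_norms))):
        remaining = [i for i in range(len(update_norms)) if i not in chosen]
        total = sum(update_norms[i] for i in remaining)
        r = rng.random() * total
        for i in remaining:
            r -= update_norms[i]
            if r <= 0:
                chosen.append(i)
                break
    return chosen

norms = [0.1, 5.0, 0.2, 4.0]
picked = sample_by_norm(norms, 2, random.Random(42))
print(picked)  # the two large-norm clients are picked most often
```

In a full scheme the server would also divide each selected update by its inclusion probability before averaging, so the expected aggregate matches full participation.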

Joint Optimization of Data Sampling and User Selection for Federated Learning in the Mobile Edge Computing Systems

An optimization algorithm is designed to jointly optimize the data sampling and user selection strategies; it approaches the stationary optimal solution efficiently and significantly improves the performance of federated learning in MEC systems.