FEDIC: Federated Learning on Non-IID and Long-Tailed Data via Calibrated Distillation

@article{Shang2022FEDICFL,
  title={FEDIC: Federated Learning on Non-IID and Long-Tailed Data via Calibrated Distillation},
  author={Xinyi Shang and Yang Lu and Yiu-ming Cheung and Hanzi Wang},
  journal={2022 IEEE International Conference on Multimedia and Expo (ICME)},
  year={2022},
  pages={1-6}
}
  • Xinyi Shang, Yang Lu, Yiu-ming Cheung, Hanzi Wang
  • Published 30 April 2022
  • Computer Science
  • 2022 IEEE International Conference on Multimedia and Expo (ICME)
Federated learning provides a privacy guarantee for training good deep learning models on distributed clients with different kinds of data. Nevertheless, dealing with non-IID data is one of the most challenging problems for federated learning. Researchers have proposed a variety of methods to eliminate its negative influence, but these methods address non-IID data only under the assumption that the universal class distribution is balanced. In many real-world applications, the universal…
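
The abstract stops short of the method itself, but the problem setting it describes, non-IID client splits drawn from a long-tailed global class distribution, can be made concrete. The sketch below uses two standard benchmark conventions (an exponential long-tail profile and a Dirichlet partition); the function names, the imbalance factor of 100, and alpha = 0.5 are illustrative choices, not taken from the paper.

```python
# Illustrative setup only (not FEDIC's algorithm): build a long-tailed global
# label set, then split it non-IID across clients with a Dirichlet partition.
import numpy as np

def long_tailed_class_sizes(n_classes=10, n_max=5000, imbalance=100):
    # Exponential decay from n_max down to n_max / imbalance.
    return [int(n_max * (1 / imbalance) ** (c / (n_classes - 1)))
            for c in range(n_classes)]

def dirichlet_partition(labels, n_clients=10, alpha=0.5, seed=0):
    # Smaller alpha -> more skewed (more non-IID) client label distributions.
    rng = np.random.default_rng(seed)
    client_indices = [[] for _ in range(n_clients)]
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        rng.shuffle(idx)
        cuts = (np.cumsum(rng.dirichlet(alpha * np.ones(n_clients)))[:-1]
                * len(idx)).astype(int)
        for client, part in enumerate(np.split(idx, cuts)):
            client_indices[client].extend(part.tolist())
    return client_indices

sizes = long_tailed_class_sizes()
labels = np.concatenate([np.full(n, c) for c, n in enumerate(sizes)])
print([len(p) for p in dirichlet_partition(labels)])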

Citations

FedICT: Federated Multi-task Distillation for Multi-access Edge Computing

FedICT is proposed to keep local and global knowledge apart during bi-directional distillation between clients and the server, aiming to support multi-task clients while alleviating the client drift that arises from divergent optimization directions of client-side local models.
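
The summary names two distillation directions without showing them. As a rough illustration only, FedICT's actual mechanism for separating local and global knowledge is not reproduced here, the sketch below computes temperature-scaled distillation losses in both directions on a shared proxy batch; the stand-in linear models and the proxy batch are hypothetical.

```python
# Hypothetical stand-ins throughout; only the two distillation directions are shown.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Standard temperature-scaled distillation loss.
    return F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * T * T

proxy = torch.randn(32, 64)          # shared unlabeled proxy batch (assumed)
client = torch.nn.Linear(64, 10)     # stand-in client model
server = torch.nn.Linear(64, 10)     # stand-in server model

global_to_local = kd_loss(client(proxy), server(proxy).detach())  # client learns from server
local_to_global = kd_loss(server(proxy), client(proxy).detach())  # server learns from client
```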

Integrating Local Real Data with Global Gradient Prototypes for Classifier Re-Balancing in Federated Long-Tailed Learning

This work finds that the capacity of decoupled training in federated long-tailed learning is restricted by a sub-optimal classifier re-trained on a set of pseudo features, due to the unavailability of a globally balanced dataset in FL, and re-balances the classifier by integrating local real data with global gradient prototypes.

References

Showing 10 of 25 references.

Federated Learning with Non-IID Data

This work presents a strategy to improve training on non-IID data by creating a small subset of data which is globally shared between all the edge devices, and shows that accuracy can be increased by 30% for the CIFAR-10 dataset with only 5% globally shared data.
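
The sharing strategy is simple enough to sketch directly. Below is a minimal illustration, assuming index-based access to a pooled dataset; the 5% fraction matches the figure quoted above, while the function names are my own.

```python
# Minimal sketch of the globally shared subset idea; names are illustrative.
import numpy as np

def make_shared_subset(pool_indices, shared_frac=0.05, seed=0):
    # Hold out a small fraction of the pooled data to share with every client.
    rng = np.random.default_rng(seed)
    n = int(len(pool_indices) * shared_frac)
    return rng.choice(pool_indices, size=n, replace=False).tolist()

def client_training_indices(local_indices, shared_indices):
    # Each client trains on its own shard plus the shared subset.
    return list(local_indices) + list(shared_indices)
```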

Addressing Class Imbalance in Federated Learning

This work proposes a monitoring scheme that infers the composition of training data in each FL round, together with a new loss function, Ratio Loss, to mitigate the impact of the imbalance; the approach is shown to significantly outperform previous methods while maintaining client privacy.
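
The exact Ratio Loss formulation is not given in this summary, so the sketch below shows only the general shape of the idea: cross-entropy re-weighted by an estimated per-round class composition. Treat it as a frequency-based re-weighting in the same spirit, not the paper's formula; est_class_counts stands in for whatever the monitoring scheme infers.

```python
# Generic frequency-based re-weighting, NOT the paper's exact Ratio Loss.
import torch
import torch.nn.functional as F

def reweighted_ce(logits, targets, est_class_counts, eps=1.0):
    # est_class_counts: the (inferred) per-class sample counts for this round.
    counts = torch.as_tensor(est_class_counts, dtype=torch.float32)
    weights = counts.sum() / (counts + eps)   # rarer class -> larger weight
    weights = weights / weights.mean()        # normalize around 1
    return F.cross_entropy(logits, targets, weight=weights)
```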

Fed-Focal Loss for imbalanced data classification in Federated Learning

This paper addresses class imbalance by reshaping the cross-entropy loss to down-weight the loss assigned to well-classified examples, along the lines of focal loss, and uses a tunable sampling framework to improve robustness.
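
The reshaped loss referred to here is the standard focal loss, which is well defined independently of this paper; the tunable sampling framework is not shown. A minimal PyTorch version, with the usual gamma = 2 default:

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0):
    # Down-weights well-classified examples by the factor (1 - p_t)^gamma.
    log_pt = F.log_softmax(logits, dim=1).gather(1, targets.unsqueeze(1)).squeeze(1)
    pt = log_pt.exp()
    return (-(1.0 - pt) ** gamma * log_pt).mean()
```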

Ensemble Distillation for Robust Model Fusion in Federated Learning

This work proposes ensemble distillation for model fusion, i.e., training the central model on unlabeled data to match the outputs of the client models, which allows flexible aggregation over heterogeneous client models that can differ, e.g., in size, numerical precision, or structure.
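
A server-side distillation step of this kind can be sketched compactly. The version below averages client logits on an unlabeled batch and trains the global model to match them with a temperature-scaled KL loss; models, optimizer, and batch are placeholders, and details such as client weighting are omitted.

```python
# Sketch of one server-side ensemble-distillation step; all inputs are placeholders.
import torch
import torch.nn.functional as F

def distill_step(global_model, client_models, unlabeled_batch, opt, T=2.0):
    with torch.no_grad():
        # Teacher signal: average of the client models' logits.
        teacher = torch.stack([m(unlabeled_batch) for m in client_models]).mean(0)
    student = global_model(unlabeled_batch)
    loss = F.kl_div(F.log_softmax(student / T, dim=1),
                    F.softmax(teacher / T, dim=1),
                    reduction="batchmean") * T * T
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```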

Heterogeneous Federated Learning Through Multi-Branch Network

A novel heterogeneous federated learning framework is proposed, based on multi-branch deep neural network models that enable the selection of a proper sub-branch model for each client device according to its computational capabilities.
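
As a rough sketch of the idea (not the paper's actual architecture), a multi-branch model can expose exits of different depth off a shared trunk, so that a resource-constrained client trains only the cheaper branch:

```python
# Generic sketch, not the paper's architecture: a shared trunk with a cheap
# and an expensive exit; each client trains the branch it can afford.
import torch.nn as nn

class MultiBranchNet(nn.Module):
    def __init__(self, n_classes=10):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(32, 64), nn.ReLU())
        self.branch_small = nn.Linear(64, n_classes)   # for weak clients
        self.branch_large = nn.Sequential(nn.Linear(64, 128), nn.ReLU(),
                                          nn.Linear(128, n_classes))

    def forward(self, x, branch="small"):
        h = self.trunk(x)
        return self.branch_small(h) if branch == "small" else self.branch_large(h)
```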

Measuring the Effects of Non-Identical Data Distribution for Federated Visual Classification

This work proposes a way to synthesize datasets with a continuous range of identicalness, provides performance measures for the Federated Averaging algorithm, shows that performance degrades as distributions differ more, and proposes a mitigation strategy via server momentum.
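
The server-momentum mitigation (often called FedAvgM) replaces the plain application of the averaged client update with a momentum buffer. A minimal sketch, assuming avg_client_delta holds the averaged (new minus old) parameter deltas for the round:

```python
# FedAvgM-style server momentum; buf holds one momentum tensor per parameter.
import torch

@torch.no_grad()
def server_momentum_step(global_params, avg_client_delta, buf, beta=0.9, lr=1.0):
    for p, d, v in zip(global_params, avg_client_delta, buf):
        v.mul_(beta).add_(d)   # v = beta * v + delta
        p.add_(v, alpha=lr)    # w = w + lr * v (plain FedAvg is beta=0, lr=1)
```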

Federated Optimization in Heterogeneous Networks

This work introduces FedProx, a framework to tackle heterogeneity in federated networks, and provides convergence guarantees for learning over data from non-identical distributions (statistical heterogeneity) while adhering to device-level systems constraints by allowing each participating device to perform a variable amount of work.
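
FedProx's core modification is a proximal term added to each client's local objective, penalizing drift from the current global model. A minimal sketch of that term:

```python
# The proximal term each client adds to its task loss under FedProx.
import torch

def prox_term(local_model, global_params, mu=0.01):
    reg = 0.0
    for w, w_g in zip(local_model.parameters(), global_params):
        reg = reg + (w - w_g.detach()).pow(2).sum()
    return 0.5 * mu * reg

# local_loss = task_loss + prox_term(local_model, global_params)
```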

FedNS: Improving Federated Learning for Collaborative Image Classification on Mobile Clients

  • Yaoxin Zhuo, Baoxin Li
  • Computer Science
    2021 IEEE International Conference on Multimedia and Expo (ICME)
  • 2021
This paper proposes a new approach, termed Federated Node Selection (FedNS), for the server’s global model aggregation in the FL setting, which filters and re-weights the clients’ models at the node/kernel level, leading to a potentially better global model by fusing the best components of the clients.
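
FedNS's actual node scoring and filtering rule is not spelled out in this summary, so the sketch below only shows the shape of kernel-level fusion: each output channel of each layer gets its own mixing weights across clients, rather than one weight per client model. The scores dict is a placeholder for whatever per-node statistics the method derives, and the state dicts are assumed to hold only weight and bias tensors.

```python
# Shape of kernel-level fusion only; the scoring rule is a placeholder.
import torch

@torch.no_grad()
def kernelwise_average(client_states, scores):
    # client_states: list of state_dicts (weight/bias tensors only);
    # scores[name]: tensor of shape (n_clients, out_channels).
    fused = {}
    for name in client_states[0]:
        stacked = torch.stack([s[name] for s in client_states])  # (C, out, ...)
        w = scores[name] / scores[name].sum(0, keepdim=True)     # normalize per kernel
        w = w.view(w.shape[0], w.shape[1], *([1] * (stacked.dim() - 2)))
        fused[name] = (w * stacked).sum(0)
    return fused
```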

Federated Learning: Challenges, Methods, and Future Directions

The unique characteristics and challenges of federated learning are discussed, a broad overview of current approaches is provided, and several directions of future work relevant to a wide range of research communities are outlined.

Decoupling Representation and Classifier for Long-Tailed Recognition

It is shown that it is possible to outperform carefully designed losses, sampling strategies, and even complex modules with memory by using a straightforward approach that decouples representation and classification.
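
The decoupled recipe (often called cRT, classifier re-training) is concrete enough to sketch: learn the representation normally, then freeze it and re-train only the linear classifier under class-balanced sampling. Dataset handling and hyperparameters below are illustrative.

```python
# cRT-style second stage: frozen backbone, class-balanced classifier re-training.
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, WeightedRandomSampler

def retrain_classifier(backbone, classifier, dataset, labels, epochs=10):
    for p in backbone.parameters():
        p.requires_grad_(False)                # keep the representation fixed
    counts = torch.bincount(torch.as_tensor(labels)).float()
    sample_w = (1.0 / counts)[labels]          # class-balanced sampling weights
    loader = DataLoader(dataset, batch_size=128,
                        sampler=WeightedRandomSampler(sample_w, len(sample_w)))
    opt = torch.optim.SGD(classifier.parameters(), lr=0.1)
    for _ in range(epochs):
        for x, y in loader:
            loss = F.cross_entropy(classifier(backbone(x)), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
```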