Federated Learning Aggregation: New Robust Algorithms with Guarantees

@article{Mansour2022FederatedLA,
  title={Federated Learning Aggregation: New Robust Algorithms with Guarantees},
  author={Adnane Mansour and Gaia Carenini and Alexandre Duplessis and David Naccache},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.10864}
}
Federated Learning was recently proposed for distributed model training at the edge. The principle of this approach is to aggregate models learned on distributed clients to obtain a new, more general “average” model (FedAvg). The resulting model is then redistributed to clients for further training. To date, the most popular federated learning algorithm uses coordinate-wise averaging of the model parameters for aggregation. In this paper, we carry out a complete general mathematical…
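The coordinate-wise averaging (FedAvg) step described in the abstract can be sketched in a few lines. The function name and the sample-size weighting below follow the common formulation of FedAvg, not code from the paper:

```python
import numpy as np

def fedavg_aggregate(client_params, client_sizes):
    """Coordinate-wise weighted average of client model parameters.

    client_params: list of 1-D parameter vectors, one per client.
    client_sizes: number of local training samples per client,
                  used as aggregation weights.
    """
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    stacked = np.stack(client_params)  # shape: (n_clients, n_params)
    return weights @ stacked           # weighted sum over clients, per coordinate

# Two clients, one holding twice as much data as the other.
params = [np.array([0.0, 2.0]), np.array([3.0, 5.0])]
global_model = fedavg_aggregate(params, client_sizes=[200, 100])
```

The server would then redistribute `global_model` to the clients for the next round of local training.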
1 Citation


Comparative Review of the Intrusion Detection Systems Based on Federated Learning: Advantages and Open Challenges
The architecture of the proposed intrusion detection systems and the approaches used to model data partition across the clients are analyzed, and their advantages as well as open challenges still facing them are studied.

References

Showing 1–10 of 12 references
A Federated Learning Aggregation Algorithm for Pervasive Computing: Evaluation and Comparison
A novel aggregation algorithm, termed FedDist, is proposed; it can modify its model architecture by identifying dissimilarities between specific neurons across clients, accounting for each client’s specificity without impairing generalization.
FedPAGE: A Fast Local Stochastic Gradient Method for Communication-Efficient Federated Learning
This work proposes a new federated learning algorithm, FedPAGE, able to further reduce the communication complexity by utilizing the recent optimal PAGE method (Li et al., 2021) instead of plain SGD in FedAvg.
Client Selection in Federated Learning: Convergence Analysis and Power-of-Choice Selection Strategies
This paper presents the first convergence analysis of federated optimization for biased client selection strategies, and quantifies how the selection bias affects convergence speed, and proposes Power-of-Choice, a communication- and computation-efficient client selection framework that can flexibly span the trade-off between convergence speed and solution bias.
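The candidate-then-select idea behind Power-of-Choice can be sketched roughly as follows. This is a simplified illustration: the candidate draw here is uniform, whereas the actual strategy weights the draw by each client's data fraction, and all names and parameters are hypothetical:

```python
import random

def power_of_choice(client_losses, d, m, rng=None):
    """Sketch of Power-of-Choice client selection: draw a candidate set
    of d clients without replacement, then keep the m candidates whose
    current local loss is largest (biasing selection toward high-loss
    clients to speed up convergence)."""
    rng = rng or random.Random(0)
    candidates = rng.sample(range(len(client_losses)), k=d)
    candidates.sort(key=lambda i: client_losses[i], reverse=True)
    return candidates[:m]

# Four clients; with d equal to the population size the result is
# deterministic: the two highest-loss clients are picked.
losses = [0.2, 1.5, 0.9, 2.4]
selected = power_of_choice(losses, d=4, m=2)
```

Larger `d` pushes selection harder toward high-loss clients (faster convergence, more bias); `d = m` recovers unbiased random selection.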
Bayesian Nonparametric Federated Learning of Neural Networks
A Bayesian nonparametric framework for federated learning with neural networks is developed that allows for a more expressive global network without additional supervision or data pooling, and with as few as a single communication round.
Federated Learning: Challenges, Methods, and Future Directions
The unique characteristics and challenges of federated learning are discussed, a broad overview of current approaches are provided, and several directions of future work that are relevant to a wide range of research communities are outlined.
Communication-Efficient Learning of Deep Networks from Decentralized Data
This work presents a practical method for the federated learning of deep networks based on iterative model averaging, and conducts an extensive empirical evaluation, considering five different model architectures and four datasets.
Limited-memory BFGS with displacement aggregation
A displacement aggregation strategy is proposed for the curvature pairs stored in a limited-memory BFGS (a.k.a. L-BFGS) method such that the resulting (inverse) Hessian approximations are equal to…
Fashion-MNIST: a Novel Image Dataset for Benchmarking Machine Learning Algorithms
Fashion-MNIST is intended to serve as a direct drop-in replacement for the original MNIST dataset for benchmarking machine learning algorithms, as it shares the same image size, data format and the structure of training and testing splits.
Optimization by Simulated Annealing
A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems.
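The annealing analogy amounts to accepting uphill moves with probability exp(-Δ/T) while the temperature T slowly decays. A minimal sketch, with an illustrative 1-D objective and arbitrary hyperparameters (nothing here is taken from the paper):

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.95,
                        iters=2000, seed=0):
    """Minimize f via simulated annealing: always accept improving moves,
    accept worsening moves with probability exp(-delta / T), and lower
    the temperature T geometrically each iteration."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)   # random neighbor
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc                  # move accepted
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                          # cool down
    return best_x, best_f

# A bumpy 1-D objective with many local minima.
x, fx = simulated_annealing(lambda x: x * x + math.sin(5 * x), x0=3.0)
```

Early on (high T) the walk can climb out of poor local minima; as T shrinks, acceptance becomes essentially greedy and the search settles into a basin.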
The MNIST Database of Handwritten Digit Images for Machine Learning Research [Best of the Web]
L. Deng, IEEE Signal Processing Magazine, 2012
In this issue, “Best of the Web” presents the modified National Institute of Standards and Technology (MNIST) resources, consisting of a collection of handwritten digit images used extensively in…