Ternary Compression for Communication-Efficient Federated Learning

@article{Xu2020TernaryCF,
  title={Ternary Compression for Communication-Efficient Federated Learning},
  author={Jinjin Xu and Wenli Du and Ran Cheng and Wangli He and Yaochu Jin},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  year={2020},
  volume={PP}
}
  • Jinjin Xu, Wenli Du, Ran Cheng, Wangli He, Yaochu Jin
  • Published 7 March 2020
  • Medicine, Computer Science, Mathematics
  • IEEE Transactions on Neural Networks and Learning Systems
Learning over massive data stored in different locations is essential in many real-world applications. However, sharing data poses significant challenges due to increasing demands for privacy and security with the growing use of smart mobile devices and Internet of Things (IoT) devices. Federated learning provides a potential solution to privacy-preserving and secure machine learning, by jointly training a global model without uploading data distributed on multiple devices to a central… 
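The abstract is cut off here, but the title points to ternary quantization of the model updates exchanged between clients and the server. Below is a minimal sketch of threshold-based ternary quantization; the threshold rule and the shared scale are assumptions chosen for illustration, not the paper's exact scheme.

```python
import numpy as np

def ternarize(update, threshold_ratio=0.05):
    """Quantize a dense update to the three values {-mu, 0, +mu}.

    threshold_ratio is an assumed hyperparameter: entries whose magnitude
    falls below threshold_ratio * max|update| are zeroed, and the survivors
    share a single magnitude mu (their mean absolute value).
    """
    threshold = threshold_ratio * np.max(np.abs(update))
    mask = np.abs(update) >= threshold
    mu = np.mean(np.abs(update[mask])) if mask.any() else 0.0
    return mu * np.sign(update) * mask

# Only the sign pattern and the single scalar mu need to be transmitted,
# instead of 32 bits per entry for the raw update.
g = np.random.randn(10_000).astype(np.float32)
print(np.unique(ternarize(g)))  # at most three distinct values
```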
Communication-Efficient Federated Learning With Binary Neural Networks
TLDR
A novel FL framework for training BNNs is introduced, in which the clients upload only the binary parameters to the server, together with a novel parameter-updating scheme based on Maximum Likelihood (ML) estimation that preserves the performance of the BNN even without access to the aggregated real-valued auxiliary parameters.
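The ML-estimation update rule itself is not reproduced on this page. The toy sketch below only captures the communication pattern (clients upload one bit per parameter); the server-side rule shown is a simple majority-style placeholder, explicitly not the paper's ML estimator.

```python
import numpy as np

def client_upload(latent_weights):
    # Each client uploads only the signs of its latent real-valued weights,
    # i.e. one bit per parameter on the uplink.
    return np.sign(latent_weights).astype(np.int8)

def server_aggregate(binary_uploads):
    # Placeholder aggregation: average the uploaded signs and re-binarize
    # (a majority-style rule). The cited paper instead derives a Maximum
    # Likelihood estimator for this step; this is only a simplified stand-in.
    mean_sign = np.mean(np.stack(binary_uploads, axis=0), axis=0)
    return np.sign(mean_sign).astype(np.int8)

uploads = [client_upload(np.random.randn(8)) for _ in range(5)]
print(server_aggregate(uploads))
```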
Distributed Additive Encryption and Quantization for Privacy Preserving Federated Deep Learning
TLDR
This work develops a practical, computationally efficient encryption-based protocol for federated deep learning, in which the key pairs are collaboratively generated without the help of a third party, combined with quantization of the model parameters on the clients and an approximated aggregation on the server.
Sparsified Secure Aggregation for Privacy-Preserving Federated Learning
TLDR
This work proposes a lightweight gradient sparsification framework for secure aggregation, in which the server learns the aggregate of the sparsified local model updates from a large number of users, but without learning the individual parameters.
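As a toy illustration of the two ingredients the summary mentions, the sketch below combines top-k sparsification with pairwise additive masks that cancel in the sum, so the server sees only the aggregate; the cited protocol is considerably more involved than this.

```python
import numpy as np

def top_k_sparsify(update, k):
    # Keep only the k largest-magnitude entries; everything else becomes zero.
    idx = np.argsort(np.abs(update))[-k:]
    sparse = np.zeros_like(update)
    sparse[idx] = update[idx]
    return sparse

rng = np.random.default_rng(0)
d, k = 10, 3
u1 = top_k_sparsify(rng.normal(size=d), k)
u2 = top_k_sparsify(rng.normal(size=d), k)

# Pairwise additive mask: one client adds r, the other subtracts r, so each
# masked upload looks random to the server while the masks cancel in the sum.
r = rng.normal(size=d)
masked_sum = (u1 + r) + (u2 - r)
assert np.allclose(masked_sum, u1 + u2)
```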
Communication-Efficient Federated Distillation
TLDR
Compressed Federated Distillation (CFD) is investigated from the perspective of communication efficiency by analyzing the effects of active distillation-data curation, soft-label quantization, and delta-coding techniques, and it is demonstrated that this method can reduce the amount of communication needed to reach fixed performance targets by more than two orders of magnitude.
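Of the three ingredients listed, the latter two lend themselves to a short sketch. Assuming soft labels are mapped to a uniform integer grid and only differences between rounds are transmitted:

```python
import numpy as np

def quantize_soft_labels(probs, bits=4):
    # Map class probabilities in [0, 1] onto a uniform integer grid.
    levels = 2 ** bits - 1
    return np.round(probs * levels).astype(np.int16)

def delta_encode(current_q, previous_q):
    # Transmit only the (mostly small or zero) difference from the previous
    # round, which compresses well with any standard entropy coder.
    return current_q - previous_q

prev = quantize_soft_labels(np.full((2, 10), 0.1))
curr = quantize_soft_labels(np.array([[0.05] * 9 + [0.55],
                                      [0.10] * 10]))
print(delta_encode(curr, prev))
```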
From Distributed Machine Learning to Federated Learning: A Survey
TLDR
A functional architecture of federated learning systems and a taxonomy of related techniques are proposed, and the distributed training, data communication, and security aspects of FL systems are presented.
Benchmarking Semi-supervised Federated Learning
TLDR
A novel grouping-based model averaging method is proposed to improve convergence efficiency, and it is shown that this can boost performance by up to 10.79% on EMNIST compared with the non-grouping-based method.
Federated Learning on Non-IID Data: A Survey
TLDR
A detailed analysis of the influence of Non-IID data on both parametric and non-parametric machine learning models in both horizontal and vertical federated learning is provided.
FedSVRG Based Communication Efficient Scheme for Federated Learning in MEC Networks
TLDR
This work proposes a federated stochastic variance-reduced gradient (FedSVRG) based method that decreases the number of communication iterations between the participants and the server from the system perspective while guaranteeing accuracy at the same time.
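The summary points to an SVRG-style variance-reduced update applied in a federated setting. The sketch below shows that generic flavor on a toy least-squares problem; the learning rate, number of local steps, and averaging rule are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Toy least-squares problem split across three clients: minimize ||A w - b||^2.
rng = np.random.default_rng(1)
A = [rng.normal(size=(20, 5)) for _ in range(3)]
b = [rng.normal(size=20) for _ in range(3)]

def grad(Ai, bi, w):
    return 2 * Ai.T @ (Ai @ w - bi) / len(bi)

w_global = np.zeros(5)
for _ in range(50):                                  # communication rounds
    # Full gradient at the current global model, computed once per round.
    full_grad = np.mean([grad(Ai, bi, w_global) for Ai, bi in zip(A, b)], axis=0)
    local_models = []
    for Ai, bi in zip(A, b):                         # local work on each client
        w = w_global.copy()
        for _ in range(10):                          # local steps
            # Variance-reduced step: local gradient, minus its value at the
            # snapshot, plus the full gradient at the snapshot.
            g = grad(Ai, bi, w) - grad(Ai, bi, w_global) + full_grad
            w -= 0.05 * g
        local_models.append(w)
    w_global = np.mean(local_models, axis=0)         # server averages the models
print(w_global)
```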
Reward-Based 1-bit Compressed Federated Distillation on Blockchain
TLDR
The Peer Truth Serum for Crowdsourcing mechanism is adapted to FD to reward honest participation based on peer consistency in an incentive-compatible fashion; the framework is a fully on-blockchain FL system that is feasible on simple smart contracts and therefore blockchain-agnostic.
CFD: Communication-Efficient Federated Distillation via Soft-Label Quantization and Delta Coding
TLDR
This work investigates FD from the perspective of communication efficiency by analyzing the effects of active distillation-data curation, soft-label quantization, and delta-coding techniques, and presents Compressed Federated Distillation (CFD), an efficient federated distillation method.

References

SHOWING 1-10 OF 42 REFERENCES
Robust and Communication-Efficient Federated Learning From Non-i.i.d. Data
TLDR
Sparse ternary compression (STC) is proposed, a new compression framework specifically designed to meet the requirements of the federated learning environment, and a paradigm shift in federated optimization toward high-frequency, low-bitwidth communication is advocated, particularly for bandwidth-constrained learning environments.
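STC combines top-k sparsification with ternarization of the surviving entries. A compact sketch of that recipe, assuming the survivors share one magnitude equal to their mean absolute value:

```python
import numpy as np

def stc_compress(update, sparsity=0.01):
    """Sparse ternary compression sketch: keep the top-k entries by magnitude,
    then replace them with +/- mu, where mu is their mean absolute value.
    The fraction kept (sparsity) is an assumed default, not the paper's value."""
    k = max(1, int(sparsity * update.size))
    idx = np.argsort(np.abs(update))[-k:]
    mu = np.mean(np.abs(update[idx]))
    compressed = np.zeros_like(update)
    compressed[idx] = mu * np.sign(update[idx])
    return compressed

print(np.count_nonzero(stc_compress(np.random.randn(1000))))  # ~10 survivors
```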
Federated Learning with Non-IID Data
TLDR
This work presents a strategy to improve training on non-IID data by creating a small subset of data which is globally shared between all the edge devices, and shows that accuracy can be increased by 30% for the CIFAR-10 dataset with only 5% globally shared data.
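The strategy itself is simple to express: each client appends a small sample of the globally shared subset to its local data before training. A minimal sketch, where the helper name and the 5% default are illustrative:

```python
import numpy as np

def augment_with_shared(client_x, client_y, shared_x, shared_y, alpha=0.05):
    """Mix a small globally shared subset into a client's local (non-IID) data.

    alpha is the amount of shared data added relative to the local set size;
    the 5% figure from the summary motivates the default used here."""
    n = min(len(shared_x), max(1, int(alpha * len(client_x))))
    pick = np.random.choice(len(shared_x), size=n, replace=False)
    x = np.concatenate([client_x, shared_x[pick]])
    y = np.concatenate([client_y, shared_y[pick]])
    return x, y
```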
Federated Learning in Mobile Edge Networks: A Comprehensive Survey
TLDR
In a large-scale and complex mobile edge network, heterogeneous devices with varying constraints are involved, which raises challenges of communication costs, resource allocation, and privacy and security in the implementation of FL at scale.
Federated Optimization: Distributed Machine Learning for On-Device Intelligence
We introduce a new and increasingly relevant setting for distributed optimization in machine learning, where the data defining the optimization are unevenly distributed over an extremely large number of nodes.
Multi-Objective Evolutionary Federated Learning
  • Hangyu Zhu, Yaochu Jin
  • Computer Science, Mathematics
    IEEE Transactions on Neural Networks and Learning Systems
  • 2020
TLDR
Experimental results indicate that the proposed optimization method is able to find optimized neural network models that can not only significantly reduce communication costs but also improve the learning performance of federated learning compared with the standard fully connected neural networks.
Adaptive Federated Learning in Resource Constrained Edge Computing Systems
TLDR
This paper analyzes the convergence bound of distributed gradient descent from a theoretical point of view, and proposes a control algorithm that determines the best tradeoff between local update and global parameter aggregation to minimize the loss function under a given resource budget.
Federated Machine Learning
TLDR
This work introduces a comprehensive secure federated-learning framework, which includes horizontal federated learning, vertical federated learning, and federated transfer learning, and provides a comprehensive survey of existing works on this subject.
A Survey on Federated Learning Systems: Vision, Hype and Reality for Data Privacy and Protection
  • Q. Li, Zeyi Wen, B. He
  • Computer Science, Mathematics
    IEEE Transactions on Knowledge and Data Engineering
  • 2021
TLDR
A comprehensive review of federated learning systems is conducted and a thorough categorization is provided according to six different aspects: data distribution, machine learning model, privacy mechanism, communication architecture, scale of federation, and motivation of federation.
Communication-Efficient Learning of Deep Networks from Decentralized Data
TLDR
This work presents a practical method for the federated learning of deep networks based on iterative model averaging, and conducts an extensive empirical evaluation, considering five different model architectures and four datasets.
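The model-averaging server step in this line of work is a weighted average of client parameters by local sample counts; a minimal sketch:

```python
import numpy as np

def fedavg_aggregate(client_models, client_sizes):
    # Weighted average of client parameter vectors, weighted by the number
    # of local training examples on each client.
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    return sum(w * m for w, m in zip(weights, client_models))

models = [np.random.randn(4) for _ in range(3)]
print(fedavg_aggregate(models, client_sizes=[100, 50, 250]))
```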
Federated Optimization: Distributed Optimization Beyond the Datacenter
We introduce a new and increasingly relevant setting for distributed optimization in machine learning, where the data defining the optimization are distributed (unevenly) over an extremely large number of nodes.