Corpus ID: 220325541

BatchCrypt: Efficient Homomorphic Encryption for Cross-Silo Federated Learning

@inproceedings{Zhang2020BatchCryptEH,
  title={BatchCrypt: Efficient Homomorphic Encryption for Cross-Silo Federated Learning},
  author={Chengliang Zhang and Suyi Li and Junzhe Xia and Wei Wang and Feng Yan and Y. Liu},
  booktitle={USENIX Annual Technical Conference},
  year={2020}
}
Cross-silo federated learning (FL) enables organizations (e.g., financial or medical) to collaboratively train a machine learning model by aggregating local gradient updates from each client without sharing privacy-sensitive data. To ensure no update is revealed during aggregation, industrial FL frameworks allow clients to mask local gradient updates using additively homomorphic encryption (HE). However, this results in significant cost in computation and communication. In our characterization…
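To make the additively homomorphic aggregation described above concrete, the following is a minimal sketch assuming the python-paillier (phe) package. It illustrates the general HE-masking idea rather than BatchCrypt's own batching optimizations, and function names such as aggregate_encrypted are hypothetical.

# Illustrative sketch only: additively homomorphic aggregation of gradient
# updates with Paillier, assuming the python-paillier (phe) package.
# This is NOT the BatchCrypt implementation; function names are hypothetical.
from phe import paillier

# In a real deployment the key pair would be shared among clients and kept
# from the aggregator; a single keypair stands in for that setup here.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

def encrypt_update(update):
    """Encrypt each gradient component so the server sees only ciphertexts."""
    return [public_key.encrypt(g) for g in update]

def aggregate_encrypted(encrypted_updates):
    """Server side: sum ciphertexts componentwise without decrypting them."""
    total = encrypted_updates[0]
    for upd in encrypted_updates[1:]:
        total = [a + b for a, b in zip(total, upd)]
    return total

# Two clients with toy 3-dimensional gradient updates.
client_updates = [[0.1, -0.2, 0.3], [0.05, 0.4, -0.1]]
encrypted = [encrypt_update(u) for u in client_updates]
aggregate = aggregate_encrypted(encrypted)

# Only a key holder can recover the aggregated (not individual) gradients.
print([private_key.decrypt(c) for c in aggregate])  # approx. [0.15, 0.2, 0.2]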
Distributed Additive Encryption and Quantization for Privacy Preserving Federated Deep Learning
TLDR: This work develops a practical, computationally efficient encryption-based protocol for federated deep learning in which the key pairs are collaboratively generated without the help of a third party, combined with quantization of the model parameters on the clients and an approximated aggregation on the server.
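As a rough illustration of the quantization step mentioned in this summary (not the paper's exact scheme), the sketch below maps float parameters to integers so that an additive HE scheme, which operates on integers, can aggregate them. The clipping range and bit width are assumed values.

# Illustrative quantization of model parameters before encryption.
# Clip range and bit width are assumed, not taken from the paper.
import numpy as np

CLIP = 1.0   # assumed clipping range for parameter values
BITS = 16    # assumed quantization bit width
SCALE = (2 ** (BITS - 1) - 1) / CLIP

def quantize(params):
    """Map float parameters to signed integers suitable for additive HE."""
    clipped = np.clip(params, -CLIP, CLIP)
    return np.round(clipped * SCALE).astype(np.int64)

def dequantize(q_sum, num_clients):
    """Undo the scaling after the (encrypted) integer sums are aggregated."""
    return q_sum.astype(np.float64) / SCALE / num_clients

updates = [np.array([0.12, -0.53, 0.9]), np.array([-0.2, 0.31, 0.7])]
q_sum = sum(quantize(u) for u in updates)   # this sum would happen under encryption
print(dequantize(q_sum, len(updates)))      # approximately the averaged update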
Dubhe: Towards Data Unbiasedness with Homomorphic Encryption in Federated Learning Client Selection
  • Shulai Zhang, Zirui Li, Quan Chen, Wenli Zheng, Jingwen Leng, Minyi Guo
  • Computer Science
  • ArXiv
  • 2021
Federated learning (FL) is a distributed machine learning paradigm that allows clients to collaboratively train a model over their own local data. FL promises the privacy of clients and its security…
A Secure Federated Learning framework using Homomorphic Encryption and Verifiable Computing
In this paper, we present the first Federated Learning (FL) framework which is secure against both confidentiality and integrity threats from the aggregation server, in the case where the resulting…
PIVODL: Privacy-preserving vertical federated learning over distributed labels
  • Hangyu Zhu, Rui Wang, Yaochu Jin, K. Liang
  • Computer Science
  • ArXiv
  • 2021
Federated learning (FL) is an emerging privacy preserving machine learning protocol that allows multiple devices to collaboratively train a shared global model without revealing their private local…
RoFL: Attestable Robustness for Secure Federated Learning
TLDR: RoFL is presented, a secure federated learning system that improves robustness against malicious clients through input checks on the encrypted model updates; it extends federated learning’s secure aggregation protocol to allow expressing a variety of properties and constraints on model updates using zero-knowledge proofs.
Secure Neuroimaging Analysis using Federated Learning with Homomorphic Encryption
TLDR: This work proposes a framework for secure FL using fully homomorphic encryption (FHE), built on the CKKS construction, an approximate, floating-point-compatible scheme that benefits from ciphertext packing and rescaling.
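A minimal sketch of CKKS ciphertext packing and approximate arithmetic follows, assuming the TenSEAL library rather than the paper's own framework; the encryption parameters are common example values, not values from the paper.

# Illustrative CKKS example with ciphertext packing, assuming TenSEAL.
import tenseal as ts

# Common example parameters; a whole vector is packed into one ciphertext.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40

enc1 = ts.ckks_vector(context, [0.1, 0.2, 0.3])   # client 1's packed update
enc2 = ts.ckks_vector(context, [0.4, 0.5, 0.6])   # client 2's packed update

enc_sum = enc1 + enc2          # slot-wise addition on ciphertexts
print(enc_sum.decrypt())       # approximately [0.5, 0.7, 0.9]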
Privacy Preserving Machine Learning with Homomorphic Encryption and Federated Learning
TLDR: A multi-party privacy-preserving machine learning framework, named PFMLP, based on partially homomorphic encryption and federated learning, in which all learning parties transmit only gradients encrypted under homomorphic encryption.
SAFELearn: Secure Aggregation for private FEderated Learning
TLDR: This work presents SAFELearn, a generic design for efficient private FL systems that uses secure aggregation to protect against inference attacks that analyze individual clients’ model updates, and implements and benchmarks an instantiation of the generic design with secure two-party computation.
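One way to instantiate the secure two-party aggregation idea mentioned above is additive secret sharing between two non-colluding servers. The sketch below is a simplified illustration under that assumption, not SAFELearn's actual protocol; the modulus and fixed-point scale are assumed values.

# Illustrative secure aggregation via additive secret sharing between two
# non-colluding servers; not SAFELearn's protocol.
import secrets

P = 2**61 - 1        # assumed prime modulus
SCALE = 10**6        # assumed fixed-point scale for float updates

def share(update):
    """Split a fixed-point update into two additive shares modulo P."""
    fixed = [int(round(x * SCALE)) % P for x in update]
    s0 = [secrets.randbelow(P) for _ in fixed]
    s1 = [(v - r) % P for v, r in zip(fixed, s0)]
    return s0, s1

def decode(v):
    """Map back from Z_P to a signed float."""
    return (v - P if v > P // 2 else v) / SCALE

clients = [[0.25, -0.5], [0.1, 0.4]]
shares = [share(u) for u in clients]

# Each server sums only the shares it receives; single shares reveal nothing.
server0 = [sum(s[0][k] for s in shares) % P for k in range(2)]
server1 = [sum(s[1][k] for s in shares) % P for k in range(2)]

# Recombining the two aggregated shares yields only the sum of updates.
aggregate = [decode((a + b) % P) for a, b in zip(server0, server1)]
print(aggregate)  # approximately [0.35, -0.1]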
SAFELearn: Secure Aggregation for private FEderated Learning (Full Version)
Federated learning (FL) is an emerging distributed machine learning paradigm which addresses critical data privacy issues in machine learning by enabling clients, using an aggregation server…
Over 100x Faster Bootstrapping in Fully Homomorphic Encryption through Memory-centric Optimization with GPUs
TLDR: This work demonstrates the first GPU implementation for bootstrapping CKKS, one of the most promising FHE schemes that support arithmetic of approximate numbers, and exploits massive parallelism available in FHE to extensively utilize memory-centric optimizations.

References

Showing 1-10 of 58 references
Private federated learning on vertically partitioned data via entity resolution and additively homomorphic encryption
TLDR: This work describes a three-party end-to-end solution in two phases, privacy-preserving entity resolution and federated logistic regression over messages encrypted with an additively homomorphic scheme, secure against an honest-but-curious adversary.
Privacy-Preserving Deep Learning via Additively Homomorphic Encryption
TLDR: This work revisits the previous work by Shokri and Shmatikov (ACM CCS 2015) and builds an enhanced system with the following properties: no information is leaked to the server, and accuracy is kept intact compared with that of an ordinary deep learning system trained over the combined dataset.
Secure Model Fusion for Distributed Learning Using Partial Homomorphic Encryption
TLDR: This work proposes a secure distributed learning system that utilizes the additive property of partial homomorphic encryption to prevent direct exposure of the computed models to the fusion server, and proposes two optimization mechanisms for applying partial homomorphic encryption to model parameters in order to improve overall efficiency.
(Leveled) fully homomorphic encryption without bootstrapping
TLDR: A novel approach to fully homomorphic encryption (FHE) that dramatically improves performance and bases security on weaker assumptions, using some new techniques recently introduced by Brakerski and Vaikuntanathan (FOCS 2011).
Practical Secure Aggregation for Privacy-Preserving Machine Learning
TLDR: This protocol allows a server to compute the sum of large, user-held data vectors from mobile devices in a secure manner, and can be used, for example, in a federated learning setting to aggregate user-provided model updates for a deep neural network.
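The pairwise-masking intuition behind this secure aggregation protocol can be sketched as follows. Key agreement, dropout handling, and secret sharing from the full protocol are omitted, and the seed derivation is a stand-in for a pairwise-agreed secret.

# Illustrative pairwise masking: every pair of clients derives a shared mask;
# one adds it, the other subtracts it, so the masks cancel in the server's sum.
import numpy as np

DIM = 4
NUM_CLIENTS = 3
updates = [np.random.randn(DIM) for _ in range(NUM_CLIENTS)]

# Stand-in for a Diffie-Hellman-agreed pairwise secret (assumption).
pair_seed = {(i, j): hash((i, j)) % (2**32)
             for i in range(NUM_CLIENTS) for j in range(i + 1, NUM_CLIENTS)}

def masked_update(i):
    """Client i's update plus/minus one mask per peer; signs make masks cancel."""
    masked = updates[i].copy()
    for j in range(NUM_CLIENTS):
        if i == j:
            continue
        seed = pair_seed[(min(i, j), max(i, j))]
        mask = np.random.default_rng(seed).standard_normal(DIM)
        masked += mask if i < j else -mask
    return masked

server_sum = sum(masked_update(i) for i in range(NUM_CLIENTS))
print(np.allclose(server_sum, sum(updates)))  # True: only the sum is revealed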
SecureML: A System for Scalable Privacy-Preserving Machine Learning
TLDR: This paper presents new and efficient protocols for privacy-preserving machine learning for linear regression, logistic regression, and neural network training using the stochastic gradient descent method, and implements the first privacy-preserving system for training neural networks.
SecureBoost: A Lossless Federated Learning Framework
TLDR: This paper theoretically proves that the SecureBoost framework is as accurate as other non-federated gradient tree-boosting algorithms that bring the data into one place, and proves what would be required to make the protocols completely secure.
Secure Federated Transfer Learning
TLDR: A new technique and framework, known as federated transfer learning (FTL), to improve statistical models under a data federation; it requires minimal modifications to the existing model structure and provides the same level of accuracy as the non-privacy-preserving approach.
Gazelle: A Low Latency Framework for Secure Neural Network Inference
TLDR: Gazelle is designed as a scalable and low-latency system for secure neural network inference, using an intricate combination of homomorphic encryption and traditional two-party computation techniques (such as garbled circuits).
Efficient paillier cryptoprocessor for privacy-preserving data mining
TLDR: This study first exploits parallelism among the operations in the cryptosystem and interleaving among independent operations, then develops a hardware realization of the scheme using field-programmable gate arrays and evaluates the proposed cryptoprocessor for a well-known privacy-preserving set intersection protocol.