Corpus ID: 44150796

SecureNN: Efficient and Private Neural Network Training

@article{Wagh2018SecureNNEA,
  title={SecureNN: Efficient and Private Neural Network Training},
  author={Sameer Wagh and Divya Gupta and Nishanth Chandran},
  journal={IACR Cryptol. ePrint Arch.},
  year={2018},
  volume={2018},
  pages={442}
}
Neural Networks (NN) provide a powerful method for machine learning training and prediction. [...] Experimentally, we build a system and train (A) a 3-layer DNN, (B) a 4-layer CNN from MiniONN, and (C) a 4-layer LeNet network. Compared to the state-of-the-art prior work SecureML (Mohassel and Zhang, IEEE S&P 2017), which provided (computationally-secure) protocols only for network A in the 2-party and 3-party settings, we obtain 93X and 8X improvements, respectively. In the WAN setting, these improvements are…
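To make the setting concrete, below is a minimal, illustrative sketch of 2-out-of-2 additive secret sharing over the ring Z_{2^64}, the kind of arithmetic building block that SecureNN-style MPC protocols compute on. The fixed-point precision, function names, and party layout here are assumptions for illustration only, not the paper's actual protocol.

```python
# Illustrative sketch: 2-out-of-2 additive secret sharing over Z_{2^64}.
# The fixed-point precision and helper names are assumed for this example,
# not taken from the SecureNN paper.
import secrets

RING = 1 << 64          # all arithmetic is modulo 2^64
PRECISION = 13          # assumed number of fixed-point fractional bits

def encode(x: float) -> int:
    """Encode a real number as a fixed-point ring element."""
    return int(round(x * (1 << PRECISION))) % RING

def decode(v: int) -> float:
    """Decode a ring element, treating the upper half of the ring as negative."""
    if v >= RING // 2:
        v -= RING
    return v / (1 << PRECISION)

def share(v: int) -> tuple[int, int]:
    """Split a ring element into two additive shares, one per party."""
    s0 = secrets.randbelow(RING)
    s1 = (v - s0) % RING
    return s0, s1

def reconstruct(s0: int, s1: int) -> int:
    """Recombine the two shares into the secret value."""
    return (s0 + s1) % RING

# Secure addition is purely local: each party adds the shares it holds.
x0, x1 = share(encode(1.5))
y0, y1 = share(encode(-0.25))
z0, z1 = (x0 + y0) % RING, (x1 + y1) % RING
assert abs(decode(reconstruct(z0, z1)) - 1.25) < 1e-3
```

Linear layers (matrix multiplications) and non-linear activations require interactive protocols on top of such shares; the efficiency of those protocols is what the comparison against SecureML measures.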
Citations

Privacy-Preserving Deep Learning with SPDZ
Neural Networks (NN) are powerful tools for supervised machine learning. However, extensive data collection from different sources for accurate training risks privacy. Most privacy-preserving…
S++: A Fast and Deployable Secure-Computation Framework for Privacy-Preserving Neural Network Training
Introduces S++, a simple, robust, and deployable framework for training a neural network (NN) on private data from multiple sources using secret-shared secure function evaluation, and argues for extending the mechanism to non-linear functions such as the logistic sigmoid, tanh, and softmax, which are fundamental for expressing outputs as probabilities and for universal approximation.
Banners: Binarized Neural Networks with Replicated Secret Sharing
This work achieves security with abort against one malicious adversary for BNNs by leveraging Replicated Secret Sharing for an honest majority with three computing parties, and attests to the efficiency of Banners as a privacy-preserving inference technique.
Trident: Efficient 4PC Framework for Privacy Preserving Machine Learning
This work proposes an actively secure four-party protocol (4PC) and a framework for PPML, showcasing its applications on four of the most widely known machine learning algorithms: Linear Regression, Logistic Regression, Neural Networks, and Convolutional Neural Networks.
Privacy-Preserving Deep Learning Based on Multiparty Secure Computation: A Survey
Reviews state-of-the-art research in privacy-preserving DL based on multiparty secure computation with data encryption, and classifies the techniques with respect to linear and nonlinear computations, the two basic building blocks in DL.
CrypTen: Secure Multi-Party Computation Meets Machine Learning
Secure multi-party computation (MPC) allows parties to perform computations on data while keeping that data private. This capability has great potential for machine-learning applications: it…
Outsourcing Private Machine Learning via Lightweight Secure Arithmetic Computation
This work proposes an actively secure protocol for outsourcing secure and private machine learning computations and showcases the efficiency of the protocol by applying it to real-world instances of arithmetized neural network computations, including a network trained to perform collaborative disease prediction.
A Hybrid-Domain Framework for Secure Gradient Tree Boosting
Proposes a novel framework for two parties to build secure XGB over vertically partitioned data by associating the Homomorphic Encryption domain with the Secret Sharing domain through two-way transformation primitives.
MOBIUS: Model-Oblivious Binarized Neural Networks
A privacy-preserving framework in which a computational resource provider receives encrypted data from a client and returns prediction results without decrypting the data, i.e., oblivious neural…
Secure Collaborative Deep Learning Against GAN Attacks in the Internet of Things
This article proposes a secure collaborative deep learning model that resists GAN attacks and targets convolutional neural networks, the most popular network type, designing specific algorithms for various functionalities in different layers of the network and making it suitable for deep learning environments.

References

Showing 1–10 of 31 references
SecureML: A System for Scalable Privacy-Preserving Machine Learning
This paper presents new and efficient protocols for privacy-preserving machine learning for linear regression, logistic regression, and neural network training using the stochastic gradient descent method, and implements the first privacy-preserving system for training neural networks.
Privacy-Preserving Classification on Deep Neural Network
This work successfully addresses the open problem of privacy-preserving classification on deeper NNs by combining the original ideas of Cryptonets' solution with the batch normalization principle introduced at ICML 2015 by Ioffe and Szegedy.
CryptoDL: Deep Neural Networks over Encrypted Data
New techniques are developed to adapt deep neural networks to the practical limitations of current homomorphic encryption schemes, showing that CryptoDL provides efficient, accurate, and scalable privacy-preserving predictions.
Gazelle: A Low Latency Framework for Secure Neural Network Inference
Gazelle, a scalable and low-latency system for secure neural network inference, is designed using an intricate combination of homomorphic encryption and traditional two-party computation techniques (such as garbled circuits).
Private Collaborative Neural Network Learning
This work presents a feasible protocol for learning neural networks in a collaborative way while preserving the privacy of each record by combining Differential Privacy and Secure Multi-Party Computation with Machine Learning.
CryptoNets: applying neural networks to encrypted data with high throughput and accuracy
It is shown that the cloud service is capable of applying the neural network to the encrypted data to make encrypted predictions and return them in encrypted form, which allows high-throughput, accurate, and private predictions.
High-performance secure multi-party computation for data mining applications
New protocols in the Sharemind model for secure multiplication, share conversion, equality, bit shift, bit extraction, and division are described and benchmarked, showing that the current approach provides remarkable speed improvements over the previous work.
Oblivious Neural Network Predictions via MiniONN Transformations
MiniONN is presented, the first approach for transforming an existing neural network to an oblivious neural network supporting privacy-preserving predictions with reasonable efficiency, and it is shown that MiniONN outperforms existing work in terms of response latency and message sizes.
Chameleon: A Hybrid Secure Computation Framework for Machine Learning Applications
Chameleon combines the best aspects of generic SFE protocols with the ones that are based upon additive secret sharing, and improves the efficiency of mining and classification of encrypted data for algorithms based upon heavy matrix multiplications.
Machine Learning Classification over Encrypted Data
A new library of building blocks is constructed that enables constructing a wide range of privacy-preserving classifiers, and it is demonstrated how this library can be used to construct classifiers beyond the three mentioned above, such as a multiplexer and a face detection classifier.