Corpus ID: 237635338

MORSE-STF: A Privacy Preserving Computation System

@article{Zhang2021MORSESTFAP,
  title={MORSE-STF: A Privacy Preserving Computation System},
  author={Qizhi Zhang and Yuan Zhao and Lichun Li and Jiaofu Zhang and Qichao Zhang and Yashun Zhou and Dong Yin and Sijun Tan and Shan Yin},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.11726}
}
  • Published 24 September 2021
  • Computer Science, Mathematics
Privacy-preserving machine learning has become a popular area of research due to increasing concern over data privacy. One way to achieve privacy-preserving machine learning is to use secure multi-party computation (MPC), where multiple distrusting parties can perform computations on data without revealing the data itself. We present Secure-TF, a privacy-preserving machine learning framework based on MPC. Our framework is able to support widely used machine learning models such as logistic…
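The core idea the abstract alludes to, computing on data no single party can see, is typically built on additive secret sharing. A minimal sketch in Python over the ring Z_{2^64} (the names `share` and `reconstruct` are illustrative, not Secure-TF's API):

```python
import secrets

MOD = 2 ** 64  # ring Z_{2^64}, a common choice in MPC frameworks

def share(x, n=3):
    """Split x into n additive shares that sum to x mod 2^64."""
    shares = [secrets.randbelow(MOD) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % MOD)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares mod 2^64."""
    return sum(shares) % MOD

# Each share alone is uniformly random and reveals nothing about x;
# only the sum of all shares recovers the secret.
s = share(42)
assert reconstruct(s) == 42
```

Addition of two secrets is then local (each party adds its shares), while multiplication needs interaction, which is where the protocols in the referenced systems differ.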


References

Showing 1-10 of 12 references
SecureML: A System for Scalable Privacy-Preserving Machine Learning
This paper presents new and efficient protocols for privacy-preserving machine learning for linear regression, logistic regression, and neural network training using stochastic gradient descent, and implements the first privacy-preserving system for training neural networks.
SecureNN: 3-Party Secure Computation for Neural Network Training
This work provides novel three-party secure computation protocols for neural network building blocks such as matrix multiplication, convolutions, Rectified Linear Units, Maxpool, and normalization; it is the first system to provide any security against malicious adversaries for the secure computation of complex algorithms such as neural network inference and training.
ABY3: A Mixed Protocol Framework for Machine Learning
A general framework for privacy-preserving machine learning is designed and implemented, and used to obtain new solutions for training linear regression, logistic regression, and neural network models, as well as to design variants of each building block that are secure against malicious adversaries who deviate arbitrarily.
CrypTen: Secure Multi-Party Computation Meets Machine Learning
Secure multi-party computation (MPC) allows parties to perform computations on data while keeping that data private. This capability has great potential for machine-learning applications: it…
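Multiplying two additively shared values, the interactive step such MPC frameworks must implement, is classically done with Beaver triples. A minimal two-party simulation in Python, assuming a trusted dealer and a 64-bit ring (all names are illustrative, not CrypTen's API):

```python
import secrets

MOD = 2 ** 64  # ring Z_{2^64}

def share(x, n=2):
    """Split x into n additive shares mod 2^64."""
    shares = [secrets.randbelow(MOD) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % MOD)
    return shares

def reconstruct(sh):
    return sum(sh) % MOD

def beaver_mul(x_sh, y_sh):
    """Multiply two secret-shared values using a Beaver triple (a, b, c = a*b)."""
    # A trusted dealer generates and shares the triple.
    a, b = secrets.randbelow(MOD), secrets.randbelow(MOD)
    c = (a * b) % MOD
    a_sh, b_sh, c_sh = share(a), share(b), share(c)
    # Parties open d = x - a and e = y - b; since a, b are uniformly
    # random one-time masks, d and e leak nothing about x and y.
    # (Simulated here by reconstructing; in a real protocol each party
    # opens only its own share of x - a and y - b.)
    d = (reconstruct(x_sh) - a) % MOD
    e = (reconstruct(y_sh) - b) % MOD
    # xy = c + d*b + e*a + d*e, computed locally on shares;
    # the public d*e term is added by one designated party.
    z_sh = [(c_sh[i] + d * b_sh[i] + e * a_sh[i]) % MOD for i in range(2)]
    z_sh[0] = (z_sh[0] + d * e) % MOD
    return z_sh

x_sh, y_sh = share(6), share(7)
assert reconstruct(beaver_mul(x_sh, y_sh)) == 42
```

The identity behind the local step is xy = (d + a)(e + b) = de + db + ea + ab, which is why only the cheap opening of d and e requires communication.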
Secure Computation for G-Module and its Applications
This work presents several secure computation protocols for G-module operations in the online/offline mode, and shows how to instantiate those protocols to implement many widely used secure computation primitives in privacy-preserving machine learning and data mining.
CrypTFlow2: Practical 2-Party Secure Inference
Using CrypTFlow2, the first secure inference over ImageNet-scale DNNs such as ResNet50 and DenseNet121 is presented, at least an order of magnitude larger than those considered in prior work on 2-party DNN inference.
ABY2.0: Improved Mixed-Protocol Secure Two-Party Computation
This work improves semi-honest secure two-party computation (2PC) over rings, with a focus on the efficiency of the online phase, and proposes an efficient mixed-protocol framework that outperforms the state-of-the-art 2PC framework ABY.
CrypTFlow: Secure TensorFlow Inference
CrypTFlow is a first-of-its-kind system that converts TensorFlow inference code into secure multi-party computation (MPC) protocols at the push of a button, and outperforms prior work in the area of secure inference.
Private Machine Learning in TensorFlow using Secure Computation
This work presents a framework for experimenting with secure multi-party computation directly in TensorFlow, gives an open-source implementation of a state-of-the-art protocol, and reports concrete benchmarks using typical models from private machine learning.
Falcon: Honest-Majority Maliciously Secure Framework for Private Deep Learning
Experiments in the WAN setting show that over large networks and datasets, compute operations, rather than communication, dominate the overall latency of MPC.