# On Polynomial Approximations for Privacy-Preserving and Verifiable ReLU Networks

@article{Ali2020OnPA,
  title={On Polynomial Approximations for Privacy-Preserving and Verifiable ReLU Networks},
  author={Ramy E. Ali and Jinhyun So and Amir Salman Avestimehr},
  journal={ArXiv},
  year={2020},
  volume={abs/2011.05530}
}

Outsourcing neural network inference tasks to an untrusted cloud raises data privacy and integrity concerns. In order to address these challenges, several privacy-preserving and verifiable inference techniques have been proposed based on replacing the non-polynomial activation functions such as the rectified linear unit (ReLU) function with polynomial activation functions. Such techniques usually require the polynomial coefficients to be in a finite field. Motivated by such requirements…
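To illustrate the kind of substitution the abstract describes, the sketch below fits a low-degree polynomial to ReLU over a bounded interval with NumPy. This is only a minimal least-squares example; the degree, interval, and fitting method are illustrative assumptions, not the construction used in the paper (which further requires coefficients in a finite field).

```python
import numpy as np

# Sample ReLU on a bounded interval; the interval [-1, 1] and degree 2
# are illustrative choices, not taken from the paper.
xs = np.linspace(-1.0, 1.0, 1001)
relu = np.maximum(xs, 0.0)

# Least-squares fit of a degree-2 polynomial (coefficients highest power first).
coeffs = np.polyfit(xs, relu, deg=2)
poly = np.polyval(coeffs, xs)

# Worst-case approximation error on the sampled interval.
max_err = np.max(np.abs(poly - relu))
print(f"degree-2 max error on [-1, 1]: {max_err:.4f}")
```

Increasing the degree shrinks the error on the interval, but at the cost of deeper multiplication circuits, which is the central trade-off in the cryptographic settings discussed here.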

## 8 Citations

On Regularizing Coordinate-MLPs

- Computer Science, ArXiv
- 2022

It is found that as the bandwidth of a coordinate-MLP is enhanced, lower frequencies tend to get suppressed unless a suitable prior is provided explicitly, which can be incorporated into existing networks without any architectural modifications.

ApproxIFER: A Model-Agnostic Approach to Resilient and Robust Prediction Serving Systems

- Computer Science, AAAI
- 2022

Approximate Coded Inference (ApproxIFER) is proposed, a different approach that does not require training any parity models, hence it is agnostic to the model hosted by the cloud and can be readily applied to different data domains and model architectures.

Private Retrieval, Computing, and Learning: Recent Progress and Future Challenges

- Computer Science, IEEE Journal on Selected Areas in Communications
- 2022

The article motivates each privacy setting, describes the problem formulation, summarizes breakthrough results in the history of each problem, presents recent results, and discusses some of the major ideas that have emerged in each field.

Fully Homomorphically Encrypted Deep Learning as a Service

- Computer Science, Machine Learning and Knowledge Extraction
- 2021

This project investigates how FHE with deep learning can be used at scale toward accurate sequence prediction, with a relatively low time complexity, the problems that such a system incurs, and mitigations/solutions for such problems.

Adaptive Verifiable Coded Computing: Towards Fast, Secure and Private Distributed Machine Learning

- Computer Science
- 2021

This paper proposes the Adaptive Verifiable Coded Computing (AVCC) framework, which decouples the Byzantine node detection challenge from straggler tolerance and uses an orthogonal approach that leverages verifiable computing to mitigate Byzantine workers.

A Vertical Federated Learning Framework for Graph Convolutional Network

- Computer Science, ArXiv
- 2021

FedVGCN is proposed, a federated GCN learning paradigm for the privacy-preserving node classification task under the vertically partitioned data setting, which can be generalized to existing GCN models.

Stability and Generalization of Bilevel Programming in Hyperparameter Optimization

- Computer Science, NeurIPS
- 2021

This paper presents an expectation bound w.r.t. the validation set based on uniform stability for the classical cross-validation algorithm, and proves that regularization terms in both the outer and inner levels can relieve the overfitting problem in gradient-based algorithms.

VeriML: Enabling Integrity Assurances and Fair Payments for Machine Learning as a Service

- Computer Science, IEEE Transactions on Parallel and Distributed Systems
- 2021

VeriML is a novel and efficient framework that brings integrity assurances and fair payments to MLaaS: clients can be assured that ML tasks are correctly executed on an untrusted server and that the resource consumption claimed by the service provider equals the actual workload.

## References

Showing 1–10 of 35 references

Delphi: A Cryptographic Inference Service for Neural Networks

- Computer Science, IACR Cryptol. ePrint Arch.
- 2020

This work designs, implements, and evaluates DELPHI, a secure prediction system that allows two parties to execute neural network inference without revealing either party’s data, and develops a hybrid cryptographic protocol that improves upon the communication and computation costs over prior work.

VeriML: Enabling Integrity Assurances and Fair Payments for Machine Learning as a Service

- Computer Science, IEEE Transactions on Parallel and Distributed Systems
- 2021

VeriML is a novel and efficient framework that brings integrity assurances and fair payments to MLaaS: clients can be assured that ML tasks are correctly executed on an untrusted server and that the resource consumption claimed by the service provider equals the actual workload.

On the numerical determination of the best approximations in the Chebyshev sense

- Computer Science
- 1960

CryptoNets: applying neural networks to encrypted data with high throughput and accuracy

- Computer Science, ICML
- 2016

It is shown that the cloud service is capable of applying the neural network to the encrypted data to make encrypted predictions and return them in encrypted form, which allows high-throughput, accurate, and private predictions.

Low Latency Privacy Preserving Inference

- Computer Science, ICML
- 2019

This work applies transfer learning to provide private inference services using deep networks with a latency of ∼0.16 seconds, achieving more than a 10× improvement in latency and enabling inference on wider networks compared to prior attempts with the same level of security.

SecureNets: Secure Inference of Deep Neural Networks on an Untrusted Cloud

- Computer Science, ACML
- 2018

A secure outsourcing framework for deep neural network inference called SecureNets is proposed, which can preserve both a user’s data privacy and his/her neural network model privacy, and also verify the computation results returned by the cloud.

TAPAS: Tricks to Accelerate (encrypted) Prediction As a Service

- Computer Science, ICML
- 2018

This work combines ideas from the machine learning literature, particularly work on binarization and sparsification of neural networks, with algorithmic tools to speed up and parallelize computation on encrypted data.

Eigenvalues of covariance matrices: Application to neural-network learning.

- Mathematics, Physical Review Letters
- 1991

The learning time of a simple neural-network model is obtained through an analytic computation of the eigenvalue spectrum of the Hessian matrix, which describes the second-order properties of the…