Corpus ID: 237563268

Reinforcement Learning on Encrypted Data

Alberto Jesu, Victor-Alexandru Darvariu, Alessandro Staffolani, Rebecca Montanari and Mirco Musolesi
The growing number of applications of Reinforcement Learning (RL) in real-world domains has led to the development of privacy-preserving techniques due to the inherently sensitive nature of data. Most existing works focus on differential privacy, in which information is revealed in the clear to an agent whose learned model should be robust against information leakage to malicious third parties. Motivated by use cases in which only encrypted data might be shared, such as information from… 
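The enabler for learning on encrypted data is a homomorphic encryption scheme, under which arithmetic on ciphertexts corresponds to arithmetic on the underlying plaintexts. A minimal toy illustration of additive homomorphism, using a Paillier-style scheme with deliberately tiny, insecure demo parameters (for illustration only; this is not the scheme used in the paper, which builds on the FHE literature surveyed in the works listed here):

```python
import math
import random

# Toy Paillier-style additively homomorphic encryption.
# The primes below are demo values -- NOT secure at any level.
p, q = 61, 53
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

# Precomputed decryption constant: inverse of L(g^lam mod n^2) mod n.
mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    """Encrypt plaintext m (0 <= m < n) with fresh randomness r."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

def add_encrypted(c1, c2):
    # Multiplying ciphertexts adds the plaintexts: E(a) * E(b) = E(a + b mod n).
    return (c1 * c2) % n2
```

Adding two ciphertexts this way yields a ciphertext of the sum without ever decrypting, which is the property that lets an agent accumulate rewards or losses over encrypted observations.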

ML Confidential: Machine Learning on Encrypted Data

Proposes a new class of machine learning algorithms whose predictions can be expressed as polynomials of bounded degree, together with confidential algorithms for binary classification based on polynomial approximations to least-squares solutions obtained by a small number of gradient-descent steps.
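The bounded-degree requirement arises because homomorphic encryption schemes support only additions and multiplications, so a nonlinearity such as the sigmoid must be replaced by a low-degree polynomial. A sketch of one common construction (illustrative, not this paper's exact method): fit a cubic to the sigmoid by least squares over the expected input range.

```python
import numpy as np

# Fit a degree-3 polynomial to the sigmoid on [-4, 4] by least squares.
# A bounded-degree polynomial like this can be evaluated on ciphertexts,
# since HE schemes provide homomorphic addition and multiplication.
x = np.linspace(-4, 4, 401)
sigmoid = 1.0 / (1.0 + np.exp(-x))
coeffs = np.polyfit(x, sigmoid, deg=3)   # coefficients, highest degree first
approx = np.polyval(coeffs, x)
max_err = np.max(np.abs(approx - sigmoid))
```

The interval and degree trade accuracy against the multiplicative depth the encryption scheme can afford; a wider range or deeper network generally forces a higher degree.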

How You Act Tells a Lot: Privacy-Leakage Attack on Deep Reinforcement Learning

This is the first work to investigate privacy leakage in DRL settings; it shows that DRL-based agents can leak privacy-sensitive information through their trained policies.

Privacy-Preserving Classification on Deep Neural Network

This work addresses the problem of privacy-preserving classification on deeper NNs by combining the original ideas of the CryptoNets solution with the batch-normalization principle introduced at ICML 2015 by Ioffe and Szegedy.

Privacy-preserving Q-Learning with Functional Noise in Continuous State Spaces

This work considers differentially private algorithms for reinforcement learning in continuous spaces, such that neighboring reward functions are indistinguishable, and shows rigorous privacy guarantees by a series of analyses on the kernel of the noise space, the probabilistic bound of such noise samples, and the composition over the iterations.
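The functional-noise idea can be caricatured on a discrete state space: draw one noise value per state, lazily, and reuse it on every visit, so the perturbation behaves like a fixed random function rather than fresh per-step noise. This is an illustrative sketch, not the paper's continuous-state construction with rigorous guarantees; the function names and noise scale are assumptions.

```python
import numpy as np

# Illustrative only: per-state cached Gaussian noise, mimicking (very
# loosely) a sample from a random function over the state space.
rng = np.random.default_rng(0)
_noise_cache = {}

def noisy_reward(state, reward, sigma=0.5):
    """Perturb the reward with a noise value fixed per state.

    Repeated visits to the same state see the same perturbation, so the
    agent effectively learns against a single noisy reward function.
    """
    if state not in _noise_cache:
        _noise_cache[state] = rng.normal(0.0, sigma)
    return reward + _noise_cache[state]
```

A Q-learning loop would simply call `noisy_reward(s, r)` in place of the raw reward; the paper's contribution is doing this over continuous spaces with noise drawn from a function space whose kernel yields formal differential-privacy bounds.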

Pyfhel: PYthon For Homomorphic Encryption Libraries

This paper presents Pyfhel, which wraps existing FHE implementations in Python, providing one-click installation, added convenience, and a significantly higher-level API, and highlights how its unique support for accessing low-level features through a high-level API makes it an ideal teaching tool for lectures on FHE.

CryptoDL: Deep Neural Networks over Encrypted Data

New techniques are developed to adapt deep neural networks to the practical limitations of current homomorphic encryption schemes, showing that CryptoDL provides efficient, accurate and scalable privacy-preserving predictions.

Calibrating Noise to Sensitivity in Private Data Analysis

The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise to the sensitivity of f, that is, the maximum amount by which a single argument to f can change its output.

(Leveled) fully homomorphic encryption without bootstrapping

A novel approach to fully homomorphic encryption (FHE) that dramatically improves performance and bases security on weaker assumptions, using some new techniques recently introduced by Brakerski and Vaikuntanathan (FOCS 2011).

Fully Homomorphic Encryption from Ring-LWE and Security for Key Dependent Messages

A somewhat homomorphic encryption scheme that is both very simple to describe and analyze, and whose security reduces to the worst-case hardness of problems on ideal lattices via the RLWE assumption, allowing the lattice interpretation to be abstracted away entirely.

Better Bootstrapping in Fully Homomorphic Encryption

A simpler approach that bypasses the homomorphic modular-reduction bottleneck to some extent by working with a modulus very close to a power of two, and that allows the encryption of the secret key to be stored as a single ciphertext, thus reducing the size of the public key.