# On significance of the least significant bits for differential privacy

@article{Mironov2012OnSO, title={On significance of the least significant bits for differential privacy}, author={Ilya Mironov}, journal={Proceedings of the 2012 ACM conference on Computer and communications security}, year={2012} }

We describe a new type of vulnerability present in many implementations of differentially private mechanisms. In particular, all four publicly available general purpose systems for differentially private computations are susceptible to our attack. The vulnerability is based on irregularities of floating-point implementations of the privacy-preserving Laplacian mechanism. Unlike its mathematical abstraction, the textbook sampling procedure results in a porous distribution over double-precision…
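The porousness the abstract refers to can be seen directly. Below is a minimal sketch (not the paper's code; the scale `b` and variable names are illustrative): textbook inverse transform sampling draws a uniform double from the grid of multiples of 2^-53 and returns -b*ln(u), so near zero the achievable outputs land on a coarse grid of spacing roughly b*2^-53, even though double precision can represent enormously many more values there.

```python
import math

# Illustrative sketch of the textbook Laplace noise magnitude: draw u
# uniformly from the grid {k * 2^-53, 1 <= k < 2^53} and return -b*ln(u).
# For u close to 1 (i.e., small outputs), -ln(u) ~ 1 - u, so the outputs
# land almost exactly on the coarse 2^-53 grid.

b = 1.0  # Laplace scale parameter (hypothetical choice)

for k in range(1, 1000):
    u = 1.0 - k * 2.0**-53          # a uniform grid point very close to 1
    x = -b * math.log(u)            # textbook sample, approximately k * 2^-53
    # The scaled output is (almost) an integer: the sample sits on the grid.
    assert abs(x * 2.0**53 - round(x * 2.0**53)) < 1e-3

# In the interval [2^-45, 2^-44] there are about 2^52 representable
# doubles, but only about 2^9 of these grid points; almost every double
# in that range can never be output. These gaps in the noise support are
# what makes the distribution "porous" and hence attackable.
```

The attack in the paper exploits exactly these unreachable values: observing an output that could only have arisen from certain noise values leaks information about the true query answer.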

## 145 Citations

### Secure Random Sampling in Differential Privacy

- Computer Science, Mathematics; ESORICS
- 2021

This paper presents a practical solution to the finite-precision floating-point vulnerability in which inverse transform sampling of the Laplace distribution can itself be inverted, enabling an attack that retrieves the original value with non-negligible advantage.

### Implementing the Exponential Mechanism with Base-2 Differential Privacy

- Computer Science; CCS
- 2020

This work examines the practicalities of implementing the exponential mechanism of McSherry and Talwar and shows that the mechanism can be implemented exactly for a rich set of utility functions and values of the privacy parameter epsilon with limited practical overhead in running time and minimal code complexity.

### Precision-based attacks and interval refining: how to break, then fix, differential privacy on finite computers

- Computer Science; ArXiv
- 2022

This paper highlights a new class of vulnerabilities, called precision-based attacks, which affect several open-source libraries, and proposes a novel technique, called interval refining, that has minimal error, provable privacy, and broad applicability.

### Differential Privacy on Finite Computers

- Computer Science; ITCS
- 2018

Strictly polynomial-time discrete algorithms for approximate histograms are provided, whose simultaneous accuracy matches that of the Laplace mechanism up to constant factors while retaining the same (pure) differential privacy guarantee.

### Are We There Yet? Timing and Floating-Point Attacks on Differential Privacy Systems

- Computer Science; IEEE Symposium on Security and Privacy
- 2022

This paper examines the Gaussian mechanism’s susceptibility to a floating-point representation attack, and demonstrates that several commonly used, state-of-the-art implementations of differential privacy are susceptible to these attacks.

### Preserving differential privacy under finite-precision semantics

- Computer Science; Theor. Comput. Sci.
- 2016

It is shown that in general the differential privacy property is violated under finite-precision semantics, and conditions are identified under which a limited (but arguably acceptable) variant of the property can still be guaranteed with only a minor degradation of the privacy level.

### Towards Verifiable Differentially-Private Polling

- Computer Science; ARES
- 2022

This paper follows an approach based on zero-knowledge proofs, specifically succinct non-interactive arguments of knowledge, as a verifiable-computation technique to prove the correctness of a differentially private query output, and ensures that the guarantees of differential privacy hold despite the limitations of ZKPs, which operate over finite fields and have limited branching capabilities.

### An Algorithmic Framework For Differentially Private Data Analysis on Trusted Processors

- Computer Science, Mathematics; NeurIPS
- 2019

This work proposes a framework based on trusted processors and a new definition of differential privacy called {\em Oblivious Differential Privacy}, which combines the best of both local and global models.

### Differentially private data aggregation with optimal utility

- Computer Science; ACSAC
- 2014

PrivaDA is presented, a novel design architecture for distributed differential privacy that leverages recent advances in secure multiparty computations on fixed and floating point arithmetics to overcome the previously mentioned limitations.

### The Discrete Gaussian for Differential Privacy

- Computer Science; NeurIPS
- 2020

This work shows, theoretically and experimentally, that adding discrete Gaussian noise provides essentially the same privacy and accuracy guarantees as adding continuous Gaussian noise, and presents a simple and efficient algorithm for exact sampling from this distribution.
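A rough sketch of the rejection-sampling approach in the spirit of this work (illustrative only: it uses floating-point `exp()` and `log()` for clarity, whereas the cited paper's contribution is performing the equivalent steps with exact arithmetic so no floating-point artifacts leak):

```python
import math
import random

def discrete_gaussian(sigma, rng=random):
    """Sample an integer approximately proportional to exp(-k^2 / (2*sigma^2)).

    Proposal: a two-sided geometric (discrete Laplace) with scale t,
    thinned by the appropriate acceptance ratio. Floating-point is used
    here for readability; an exact sampler would avoid it entirely.
    """
    t = int(sigma) + 1                       # discrete-Laplace proposal scale
    p = 1.0 - math.exp(-1.0 / t)             # geometric success probability
    while True:
        # Discrete Laplace(t) as a difference of two geometric variables.
        g1 = int(math.log(1.0 - rng.random()) / math.log(1.0 - p))
        g2 = int(math.log(1.0 - rng.random()) / math.log(1.0 - p))
        y = g1 - g2
        # Accept with probability exp(-(|y| - sigma^2/t)^2 / (2*sigma^2)).
        if rng.random() < math.exp(-((abs(y) - sigma**2 / t) ** 2) / (2 * sigma**2)):
            return y
```

Because the output is an integer, the porous-support problem of floating-point Laplace samplers does not arise: every integer in the support is genuinely reachable.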

## References

Showing 1-10 of 48 references.

### Differential Privacy Under Fire

- Computer Science; USENIX Security Symposium
- 2011

This work presents a detailed design for one specific solution, based on a new primitive the authors call predictable transactions and a simple differentially private programming language, that is effective against remotely exploitable covert channels, at the expense of a higher query completion time.

### Noiseless Database Privacy

- Computer Science, Mathematics; ASIACRYPT
- 2011

This work proposes a new notion called Noiseless Privacy that provides exact answers to queries, without adding any noise whatsoever, and derives simple rules for composition under models of dynamically changing data.

### Distance makes the types grow stronger: a calculus for differential privacy

- Computer Science; ICFP '10
- 2010

Rather than proving algorithms differentially private one at a time, this work proposes a functional language whose type system automatically guarantees differential privacy, allowing the programmer to write complex privacy-safe query programs in a flexible and compositional way.

### Our Data, Ourselves: Privacy Via Distributed Noise Generation

- Computer Science; EUROCRYPT
- 2006

This work provides efficient distributed protocols for generating shares of random noise, secure against malicious participants, and introduces a technique for distributing shares of many unbiased coins with fewer executions of verifiable secret sharing than would be needed using previous approaches.

### Differential Privacy with Imperfect Randomness

- Computer Science, Mathematics; IACR Cryptol. ePrint Arch.
- 2012

A differentially private mechanism is given for approximating arbitrary "low-sensitivity" functions that works even with randomness coming from a $\gamma$-Santha-Vazirani source, for any $\gamma < 1$, which provides a somewhat surprising "separation" between traditional privacy and differential privacy with respect to imperfect randomness.
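For reference, a $\gamma$-Santha-Vazirani source (the imperfect-randomness model invoked here; this is the standard definition, whose notation may differ cosmetically from the paper's) is a sequence of bits $b_1, b_2, \dots$ in which each bit is only mildly biased, even conditioned on all previous bits:

```latex
\frac{1-\gamma}{2} \;\le\; \Pr\!\left[\, b_i = 1 \;\middle|\; b_1, \dots, b_{i-1} \,\right] \;\le\; \frac{1+\gamma}{2}
```

Here $\gamma = 0$ corresponds to perfect coin flips, while $\gamma$ close to $1$ lets an adversary bias each bit almost arbitrarily; the separation result above holds for every $\gamma < 1$.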

### On the Impossibility of Private Key Cryptography with Weakly Random Keys

- Computer Science, Mathematics; CRYPTO
- 1990

The present work begins to answer this question by establishing that a single weakly random source of either model cannot be used to obtain a secure "one-time-pad" type of cryptosystem.

### GUPT: privacy preserving data analysis made easy

- Computer Science; SIGMOD Conference
- 2012

The design and evaluation of a new system, GUPT, that guarantees differential privacy to programs not developed with privacy in mind, makes no trust assumptions about the analysis program, and is secure to all known classes of side-channel attacks.

### On the (im)possibility of cryptography with imperfect randomness

- Computer Science, Mathematics; 45th Annual IEEE Symposium on Foundations of Computer Science
- 2004

It is shown that certain cryptographic tasks like bit commitment, encryption, secret sharing, zero-knowledge, non-interactive zero-knowledge, and secure two-party computation for any non-trivial function are impossible to realize if parties have access to entropy sources with slightly less-than-perfect entropy, i.e., sources with imperfect randomness.

### Differentially-private network trace analysis

- Computer Science; SIGCOMM '10
- 2010

It is concluded that differential privacy shows promise for a broad class of network analyses, though for some of them an approximate expression is required to keep the error level low.

### Differential Privacy

- Computer Science; Encyclopedia of Cryptography and Security
- 2006

A general impossibility result is given showing that a formalization of Dalenius' goal along the lines of semantic security cannot be achieved, which suggests a new measure, differential privacy, which, intuitively, captures the increased risk to one's privacy incurred by participating in a database.