Widespread Underestimation of Sensitivity in Differentially Private Libraries and How to Fix It

@inproceedings{Casacuberta2022WidespreadUO,
  title={Widespread Underestimation of Sensitivity in Differentially Private Libraries and How to Fix It},
  author={S{\'i}lvia Casacuberta and Michael Shoemate and Salil P. Vadhan and Connor Wagaman},
  booktitle={Proceedings of the 2022 ACM SIGSAC Conference on Computer and Communications Security},
  year={2022}
}
We identify a new class of vulnerabilities in implementations of differential privacy. Specifically, they arise when computing basic statistics such as sums, due to discrepancies between the implemented arithmetic using finite data types (namely, ints or floats) and idealized arithmetic over the reals or integers. These discrepancies cause the sensitivity of the implemented statistics (i.e., how much one individual's data can affect the result) to be much larger than the sensitivity we…
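A minimal sketch of one such discrepancy, illustrating the class of issues the abstract describes (a toy example using NumPy's fixed-width integers, not code from the paper or from any of the audited libraries): when a clipped sum is accumulated in a finite integer type, silent modular wraparound lets a single added record move the result far more than the idealized sensitivity bound suggests.

```python
import numpy as np

# Hypothetical setup: each record is clipped to [0, 1], so the idealized
# sensitivity of the sum is 1. With an int8 accumulator, however, adding
# one record can wrap the sum around the type's range.
C = 1                                  # per-record clipping bound
D1 = np.ones(127, dtype=np.int8)       # dataset of 127 clipped records
D2 = np.ones(128, dtype=np.int8)       # neighboring dataset: one more record

s1 = D1.sum(dtype=np.int8)             # 127
s2 = D2.sum(dtype=np.int8)             # -128 (silent modular wraparound)

# The implemented sensitivity here is 255, far above the idealized bound C = 1,
# so noise calibrated to C would badly underprotect this query.
print(s1, s2, abs(int(s1) - int(s2)))
```

Analogous gaps arise with floats, where rounding error rather than wraparound drives the implemented sum away from its idealized counterpart.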
