Calibrating Noise to Sensitivity in Private Data Analysis
The study of privacy-preserving data analysis is extended to general functions f, showing that privacy can be preserved by calibrating the standard deviation of the noise to the sensitivity of f, that is, the maximum amount by which a change to any single argument of f can change its output.
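The calibration described above is realized by adding Laplace noise scaled to the sensitivity. The sketch below is a minimal illustration of that idea; the function name and the sampling approach are illustrative choices, not taken from the paper.

```python
import random

def laplace_mechanism(true_answer, sensitivity, eps):
    """Release true_answer with Laplace(sensitivity/eps) noise.

    If changing one person's data shifts the query answer by at most
    `sensitivity`, the noisy release satisfies eps-differential privacy.
    """
    scale = sensitivity / eps
    # A Laplace variate is the difference of two i.i.d. exponentials.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_answer + noise
```

For a counting query (sensitivity 1) the noise scale is 1/eps, so tighter privacy (smaller eps) means more noise.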
Fuzzy Extractors: How to Generate Strong Keys from Biometrics and Other Noisy Data
- Y. Dodis, R. Ostrovsky, L. Reyzin, Adam D. Smith
- Computer Science, Mathematics · SIAM J. Comput.
- 2 May 2004
We provide formal definitions and efficient secure techniques for turning biometric information into keys usable for any cryptographic application, and reliably and securely authenticating…
Smooth sensitivity and sampling in private data analysis
This work gives the first formal analysis of the effect of instance-based noise in the context of data privacy, and shows how to add such noise efficiently for several functions, including the median and the cost of the minimum spanning tree.
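The paper's smooth sensitivity of the median can be sketched directly from its formula: data sorted in [0, lam], out-of-range indices padded with the endpoints, and the local sensitivity at distance k discounted by e^(-beta*k). The function and variable names below are mine, and the loop bounds are a simplification of the paper's presentation.

```python
import math

def smooth_sensitivity_median(data, beta, lam):
    """Smooth sensitivity of the median for data in [0, lam].

    Takes the max over distances k of exp(-beta*k) times the local
    sensitivity at distance k, following the paper's median formula.
    """
    x = sorted(data)
    n = len(x)
    m = (n + 1) // 2  # 1-indexed median position (n odd)

    def xi(i):  # 1-indexed access, padded with 0 on the left, lam on the right
        if i < 1:
            return 0.0
        if i > n:
            return float(lam)
        return x[i - 1]

    best = 0.0
    for k in range(n + 1):
        # local sensitivity at distance k: widest gap a k-point change exposes
        local_k = max(xi(m + t) - xi(m + t - k - 1) for t in range(k + 2))
        best = max(best, math.exp(-beta * k) * local_k)
    return best
```

Unlike the global sensitivity of the median (which is the full range lam), this quantity is small when the data are dense around the median.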
Private Empirical Risk Minimization: Efficient Algorithms and Tight Error Bounds
- Raef Bassily, Adam D. Smith, Abhradeep Thakurta
- Computer Science · IEEE 55th Annual Symposium on Foundations of…
- 27 May 2014
This work provides new algorithms and matching lower bounds for differentially private convex empirical risk minimization assuming only that each data point's contribution to the loss function is Lipschitz and that the domain of optimization is bounded.
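Gradient perturbation is one algorithmic template in this line of work: run gradient descent on the empirical loss, adding Gaussian noise to each gradient step. The toy sketch below uses a 1-D logistic loss; the noise scale `sigma` is a heuristic stand-in, not the paper's calibrated constants, and all names are illustrative.

```python
import math
import random

def noisy_gradient_descent(data, eps_total, T=50, lr=0.1, L=1.0):
    """Toy gradient-perturbation ERM on 1-D logistic loss.

    Assumes each example's gradient is bounded by L (true here when
    |x| <= 1); sigma is a heuristic privacy-noise scale, not a tight bound.
    """
    theta = 0.0
    n = len(data)
    sigma = L * math.sqrt(T) / (n * eps_total)  # heuristic scale (assumption)
    for _ in range(T):
        grad = 0.0
        for x, y in data:
            # gradient of log(1 + exp(-y * theta * x)) w.r.t. theta
            grad += -y * x / (1.0 + math.exp(y * theta * x))
        grad /= n
        theta -= lr * (grad + random.gauss(0.0, sigma))
    return theta
```

The Lipschitz assumption in the abstract is exactly what bounds each example's gradient contribution, which is what the noise scale must be calibrated to.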
What Can We Learn Privately?
- S. Kasiviswanathan, Homin K. Lee, Kobbi Nissim, Sofya Raskhodnikova, Adam D. Smith
- Computer Science · 49th Annual IEEE Symposium on Foundations of…
- 6 March 2008
This work investigates learning algorithms that satisfy differential privacy, a notion that provides strong confidentiality guarantees in contexts where aggregate information is released about a database containing sensitive information about individuals.
Local, Private, Efficient Protocols for Succinct Histograms
Efficient protocols and matching accuracy lower bounds are given for frequency estimation in the local model of differential privacy, and it is shown that each user need only send one bit to the server in a model with public coins.
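A one-bit local protocol can be illustrated with plain randomized response for a single binary attribute; the functions and the unbiasing formula below are the standard textbook construction, shown as a sketch rather than the paper's succinct-histogram protocol itself.

```python
import math
import random

def randomize(bit, eps):
    """Each user sends one bit, kept truthful with probability
    e^eps / (e^eps + 1); this satisfies eps-local differential privacy."""
    p = math.exp(eps) / (math.exp(eps) + 1.0)
    return bit if random.random() < p else 1 - bit

def estimate_frequency(reports, eps):
    """Unbiased estimate of the true fraction of 1s from noisy reports:
    E[report] = f*(2p - 1) + (1 - p), solved for f."""
    p = math.exp(eps) / (math.exp(eps) + 1.0)
    mean = sum(reports) / len(reports)
    return (mean - (1.0 - p)) / (2.0 * p - 1.0)
```

The server never sees raw bits, only the randomized reports, yet the aggregate frequency can still be recovered to within sampling error.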
Distributed Differential Privacy via Shuffling
- Albert Cheu, Adam D. Smith, Jonathan Ullman, David Zeber, M. Zhilyaev
- Computer Science, Mathematics · IACR Cryptol. ePrint Arch.
- 4 August 2018
Evidence is given that the power of the shuffled model lies strictly between those of the central and local models: for a natural restriction of the model, shuffled protocols for a widely studied selection problem are shown to require exponentially higher sample complexity than central-model protocols.
Privacy-preserving statistical estimation with optimal convergence rates
- Adam D. Smith
- Mathematics, Computer Science · STOC '11
- 6 June 2011
It is shown that for a large class of statistical estimators T and input distributions P, there is a differentially private estimator A_T with the same asymptotic distribution as T; this implies that A_T(X) is essentially as good as the original statistic T(X) for statistical inference, given sufficiently large samples.
Analyzing Graphs with Node Differential Privacy
A generic, efficient reduction is derived that allows any differentially private algorithm for bounded-degree graphs to be applied to an arbitrary graph; it is based on analyzing the smooth sensitivity of the 'naive' truncation that simply discards nodes of high degree.
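The 'naive' truncation mentioned above is easy to state in code: drop every node whose degree exceeds a threshold D, along with its incident edges. This sketch (names mine) shows only the truncation map; the paper's contribution is the smooth-sensitivity analysis of this map, which the sketch does not attempt.

```python
def truncate_graph(adj, D):
    """Naive truncation: discard nodes of degree > D (and incident edges),
    leaving a graph whose maximum degree is at most D."""
    keep = {v for v, nbrs in adj.items() if len(nbrs) <= D}
    return {v: [u for u in adj[v] if u in keep] for v in keep}
```

The resulting bounded-degree graph is exactly the input format that degree-restricted private algorithms expect.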
Private Convex Empirical Risk Minimization and High-dimensional Regression
This work significantly extends the analysis of the “objective perturbation” algorithm of Chaudhuri et al. (2011) for convex ERM problems, and gives the best known algorithms for differentially private linear regression.