Calibrating Noise to Sensitivity in Private Data Analysis
- C. Dwork, Frank McSherry, Kobbi Nissim, Adam D. Smith
- Computer Science · Theory of Cryptography Conference
- 4 March 2006
The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
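The core idea — add noise whose scale matches the query's sensitivity — can be sketched in a few lines. This is a minimal illustration of the Laplace mechanism the paper introduces, not the authors' code; `private_count` and its parameters are illustrative names.

```python
import math
import random

def laplace_sample(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one record
    changes the count by at most 1), so Laplace noise with scale
    sensitivity / epsilon = 1 / epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_sample(1.0 / epsilon)
```

For a function with larger sensitivity Δf, the same scheme applies with noise scale Δf/ε: the noisier the release, the less any single record can shift the output distribution.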
Mechanism Design via Differential Privacy
- Frank McSherry, Kunal Talwar
- Economics · IEEE Annual Symposium on Foundations of Computer…
- 21 October 2007
It is shown that the recent notion of differential privacy, in addition to its own intrinsic virtue, can ensure that participants have limited effect on the outcome of the mechanism, and as a consequence have limited incentive to lie.
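The selection rule behind this result, the exponential mechanism, can be sketched as follows — a simplified illustration under the assumption that no single participant changes any candidate's quality score by more than `sensitivity`, not the paper's own implementation:

```python
import math
import random

def exponential_mechanism(candidates, quality, epsilon, sensitivity=1.0):
    """Select a candidate with probability proportional to
    exp(epsilon * quality / (2 * sensitivity)).

    Since one participant moves each quality score by at most
    `sensitivity`, they shift every selection probability by at most a
    factor of exp(epsilon): limited influence on the outcome, hence
    limited incentive to misreport.
    """
    scores = [epsilon * quality(c) / (2.0 * sensitivity) for c in candidates]
    m = max(scores)  # subtract the max for numerical stability
    weights = [math.exp(s - m) for s in scores]
    return random.choices(candidates, weights=weights, k=1)[0]
```

Higher-quality candidates are exponentially favored, yet every candidate retains positive probability, which is what yields the privacy and truthfulness properties.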
Our Data, Ourselves: Privacy Via Distributed Noise Generation
- C. Dwork, K. Kenthapadi, Frank McSherry, Ilya Mironov, M. Naor
- Computer Science · International Conference on the Theory and…
- 28 May 2006
This work provides efficient distributed protocols for generating shares of random noise, secure against malicious participants, and introduces a technique for distributing shares of many unbiased coins with fewer executions of verifiable secret sharing than would be needed using previous approaches.
Privacy integrated queries: an extensible platform for privacy-preserving data analysis
- Frank McSherry
- Computer Science · SIGMOD Conference
- 29 June 2009
PINQ's unconditional structural guarantees require no trust placed in the expertise or diligence of the analysts, substantially broadening the scope for design and deployment of privacy-preserving data analysis, especially by non-experts.
Naiad: a timely dataflow system
- D. Murray, Frank McSherry, R. Isaacs, M. Isard, P. Barham, M. Abadi
- Computer Science · Symposium on Operating Systems Principles
- 3 November 2013
It is shown that many powerful high-level programming models can be built on Naiad's low-level primitives, enabling such diverse tasks as streaming data analysis, iterative machine learning, and interactive graph mining.
Practical privacy: the SuLQ framework
- A. Blum, C. Dwork, Frank McSherry, Kobbi Nissim
- Computer Science · ACM SIGACT-SIGMOD-SIGART Symposium on Principles…
- 13 June 2005
This work considers a statistical database in which a trusted administrator introduces noise to the query responses with the goal of maintaining privacy of individual database entries, and modifies the privacy analysis to handle real-valued functions f and arbitrary row types, greatly improving the bounds on the noise required for privacy.
Spectral partitioning of random graphs
- Frank McSherry
- Computer Science, Mathematics · Proceedings IEEE International Conference on…
- 14 October 2001
This paper shows that a simple spectral algorithm can solve all three problems above in the average case, as well as a more general problem of partitioning graphs based on edge density.
A Simple and Practical Algorithm for Differentially Private Data Release
A new algorithm for differentially private data release, based on a simple combination of the Multiplicative Weights update rule with the Exponential Mechanism, achieves the best known and nearly optimal theoretical guarantees while being simple to implement and experimentally more accurate on actual data sets than existing techniques.
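The combination described above (commonly known as MWEM) can be sketched under simplifying assumptions — histogram data over a small domain and 0/1 linear counting queries. This is an illustrative sketch, not the authors' implementation:

```python
import math
import random

def mwem(true_hist, queries, epsilon, rounds):
    """MWEM sketch: Multiplicative Weights + Exponential Mechanism.

    true_hist: true histogram over a small domain (list of counts).
    queries:   list of 0/1 vectors; a query's answer is dot(query, hist).
    Returns a synthetic histogram approximating the true query answers.
    """
    n = sum(true_hist)
    d = len(true_hist)
    synth = [n / d] * d                 # start from the uniform distribution
    eps_round = epsilon / (2 * rounds)  # split the budget: select + measure

    def answer(q, h):
        return sum(qi * hi for qi, hi in zip(q, h))

    def laplace(scale):
        u = random.random() - 0.5
        return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

    for _ in range(rounds):
        # Exponential Mechanism: favor a query with large current error.
        errs = [abs(answer(q, true_hist) - answer(q, synth)) for q in queries]
        m = max(eps_round * e / 2.0 for e in errs)
        w = [math.exp(eps_round * e / 2.0 - m) for e in errs]
        q = queries[random.choices(range(len(queries)), weights=w, k=1)[0]]
        # Laplace mechanism: take a noisy measurement of the chosen query.
        noisy = answer(q, true_hist) + laplace(1.0 / eps_round)
        # Multiplicative Weights: nudge the synthetic histogram toward it.
        err = noisy - answer(q, synth)
        synth = [h * math.exp(qi * err / (2.0 * n)) for qi, h in zip(q, synth)]
        total = sum(synth)
        synth = [h * n / total for h in synth]  # renormalize to total mass n
    return synth
```

Each round spends part of the privacy budget privately selecting a badly-approximated query and part measuring it, then updates the synthetic data multiplicatively, which is why the overall algorithm is both simple and accurate in practice.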
On profit-maximizing envy-free pricing
- V. Guruswami, Jason D. Hartline, Anna R. Karlin, D. Kempe, C. Mathieu, Frank McSherry
- Economics · ACM-SIAM Symposium on Discrete Algorithms
- 23 January 2005
It is shown that computing envy-free prices to maximize the seller's revenue is APX-hard in both of these cases; the corresponding mechanism design problem, in which the consumers' preferences are private values, is investigated, and a log-competitive truthful mechanism is given.
Differentially private recommender systems: building privacy into the net
This work considers the problem of producing recommendations from collective user behavior while simultaneously providing guarantees of privacy for these users, and finds that several of the leading approaches in the Netflix Prize competition can be adapted to provide differential privacy, without significantly degrading their accuracy.