Deep Learning with Differential Privacy
- Martín Abadi, Andy Chu, Li Zhang
- Computer Science · Conference on Computer and Communications…
- 1 July 2016
This work develops new algorithmic techniques for learning and a refined analysis of privacy costs within the framework of differential privacy, and demonstrates that deep neural networks can be trained with non-convex objectives, under a modest privacy budget, and at a manageable cost in software complexity, training efficiency, and model quality.
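The core recipe this summary describes, clipping each example's gradient and adding calibrated Gaussian noise before the update, can be sketched in a few lines. The function name, constants, and plain-NumPy setting below are illustrative, not the paper's implementation:

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, clip_norm=1.0,
                noise_multiplier=1.1, lr=0.1, rng=None):
    """One DP-SGD step: clip each example's gradient to clip_norm,
    sum, add Gaussian noise scaled to the clip norm, then average."""
    rng = rng or np.random.default_rng()
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=params.shape)
    noisy_mean = (np.sum(clipped, axis=0) + noise) / len(per_example_grads)
    return params - lr * noisy_mean

params = np.zeros(3)
grads = [np.array([3.0, 0.0, 0.0]), np.array([0.0, 0.2, 0.0])]
params = dp_sgd_step(params, grads, rng=np.random.default_rng(0))
```

Because each example's influence on the noisy sum is bounded by the clip norm, the per-step privacy cost can be tracked by the moments-style accounting the paper develops.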
Our Data, Ourselves: Privacy Via Distributed Noise Generation
- C. Dwork, K. Kenthapadi, Frank McSherry, Ilya Mironov, M. Naor
- Computer Science · International Conference on the Theory and…
- 28 May 2006
This work provides efficient distributed protocols for generating shares of random noise, secure against malicious participants, and introduces a technique for distributing shares of many unbiased coins with fewer executions of verifiable secret sharing than would be needed using previous approaches.
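The unbiased-coin idea can be illustrated without any of the paper's cryptographic machinery: the XOR of one private bit per party is unbiased as long as at least one party's bit is truly random, and a centered sum of such coins yields binomial noise that approximates a Gaussian. This is a sketch of the statistics only; the paper's contribution is doing this with verifiable secret sharing against malicious parties, which is omitted here:

```python
import secrets

def xor_coin(num_parties=5):
    """XOR of one private bit per party; unbiased if any single
    party's bit is uniformly random."""
    bits = [secrets.randbits(1) for _ in range(num_parties)]
    return sum(bits) % 2

def binomial_noise(num_coins=64):
    """Centered sum of unbiased coins: approximately Gaussian
    for large num_coins."""
    return sum(xor_coin() for _ in range(num_coins)) - num_coins / 2

noise = binomial_noise()
```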
Rényi Differential Privacy
- Ilya Mironov
- Computer Science · IEEE Computer Security Foundations Symposium
- 24 February 2017
This work argues that Rényi divergence, a useful analytical tool, can be used as a privacy definition, compactly and accurately representing guarantees on the tails of the privacy loss, and demonstrates that the new definition shares many important properties with the standard definition of differential privacy.
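For the Gaussian mechanism the Rényi-DP bookkeeping is a one-liner, and the standard conversion back to an (ε, δ) guarantee minimizes over orders. The σ and δ values below are illustrative:

```python
import math

def gaussian_rdp(alpha, sigma):
    """Renyi DP of order alpha for the Gaussian mechanism
    with sensitivity 1."""
    return alpha / (2.0 * sigma ** 2)

def rdp_to_dp(sigma, delta, orders=range(2, 129)):
    """Standard conversion: eps = min over alpha of
    RDP(alpha) + log(1/delta) / (alpha - 1)."""
    return min(gaussian_rdp(a, sigma) + math.log(1.0 / delta) / (a - 1)
               for a in orders)

eps = rdp_to_dp(sigma=4.0, delta=1e-5)
```

Under composition, the per-step RDP values simply add, which is why this accounting is compact: compose in RDP, convert to (ε, δ) once at the end.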
Scalable Private Learning with PATE
- Nicolas Papernot, Shuang Song, Ilya Mironov, A. Raghunathan, Kunal Talwar, Ú. Erlingsson
- Computer Science · International Conference on Learning…
- 15 February 2018
This work shows how PATE can scale to learning tasks with large numbers of output classes and uncurated, imbalanced training data with errors, introduces new noisy aggregation mechanisms for teacher ensembles that are more selective and add less noise, and proves tighter differential-privacy guarantees for them.
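The basic aggregation step behind PATE, a noisy plurality vote over the teacher ensemble, is easy to sketch. The teacher counts, σ, and class labels below are made up, and the paper's selective variants (answering only on confident queries) are omitted:

```python
import numpy as np

def noisy_aggregate(teacher_votes, num_classes, sigma, rng):
    """Add Gaussian noise to per-class vote counts and release
    only the argmax label."""
    counts = np.bincount(teacher_votes, minlength=num_classes).astype(float)
    counts += rng.normal(0.0, sigma, size=num_classes)
    return int(np.argmax(counts))

votes = np.array([3] * 90 + [1] * 10)   # 100 teachers, strong consensus
label = noisy_aggregate(votes, num_classes=5, sigma=5.0,
                        rng=np.random.default_rng(7))
```

When teacher consensus is strong, as above, the noise almost never changes the answer, which is exactly the regime where the privacy cost of a query is low.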
Differentially private recommender systems: building privacy into the net
This work considers the problem of producing recommendations from collective user behavior while simultaneously providing guarantees of privacy for these users, and finds that several of the leading approaches in the Netflix Prize competition can be adapted to provide differential privacy, without significantly degrading their accuracy.
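The basic mechanism behind such adaptations is adding calibrated Laplace noise to aggregate statistics before any recommendation model sees them. The per-item count statistic and sensitivity of 1 below are simplifying assumptions for illustration, not the paper's covariance-based construction:

```python
import numpy as np

def private_item_counts(counts, epsilon, rng):
    """Laplace mechanism on per-item rating counts; assumes each
    user contributes at most one rating per item (sensitivity 1)."""
    return counts + rng.laplace(0.0, 1.0 / epsilon, size=counts.shape)

counts = np.array([120.0, 45.0, 3.0])
noisy = private_item_counts(counts, epsilon=0.5,
                            rng=np.random.default_rng(3))
```

The accuracy claim in the summary follows from the same intuition: popular items have large counts, so noise of fixed scale perturbs them only slightly in relative terms.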
Frodo: Take off the Ring! Practical, Quantum-Secure Key Exchange from LWE
- Joppe W. Bos, Craig Costello, D. Stebila
- Computer Science, Mathematics · IACR Cryptology ePrint Archive
- 24 October 2016
Despite conventional wisdom that generic lattices might be too slow and unwieldy, it is demonstrated that LWE-based key exchange is quite practical: the authors' constant-time implementation requires around 1.3 ms of computation per party; compared to the recent NewHope R-LWE scheme, communication sizes increase by a factor of 4.7x but remain under 12 KiB in each direction.
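The core agreement step of LWE-based key exchange can be seen in a toy, completely insecure example: both parties compute s_B·A·s_A plus small error terms, so their values agree up to noise far below q. Real Frodo parameters use dimensions in the hundreds plus a reconciliation step to extract identical key bits; everything below (q, n, error ranges) is illustrative only:

```python
import numpy as np

q, n = 2 ** 15, 8
rng = np.random.default_rng(42)

def small(size):
    """Small secret/error entries in {-2, ..., 2} (toy choice)."""
    return rng.integers(-2, 3, size=size)

A = rng.integers(0, q, size=(n, n))      # public matrix
s_a, e_a = small(n), small(n)            # Alice's secret and error
s_b, e_b = small(n), small(n)            # Bob's secret and error

b_a = (A @ s_a + e_a) % q                # Alice -> Bob
b_b = (s_b @ A + e_b) % q                # Bob -> Alice

k_a = int(b_b @ s_a) % q                 # Alice: s_b.A.s_a + e_b.s_a
k_b = int(s_b @ b_a) % q                 # Bob:   s_b.A.s_a + s_b.e_a
diff = (k_a - k_b + q // 2) % q - q // 2 # centered difference: small
```

Because the two shared values differ only by inner products of small vectors, they land in the same coarse "bucket" of Z_q with high probability, which is what reconciliation exploits.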
Prochlo: Strong Privacy for Analytics in the Crowd
- Andrea Bittau, Ú. Erlingsson, B. Seefeld
- Computer Science · Symposium on Operating Systems Principles
- 2 October 2017
This work presents a principled systems architecture, Encode-Shuffle-Analyze (ESA), which extends existing best-practice methods for sensitive-data analytics by using cryptography and statistical techniques to make explicit how data is elided and reduced in precision, how only common-enough, anonymous data is analyzed, and how this is done for specific, permitted purposes.
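A stripped-down version of the Encode-Shuffle-Analyze pipeline can be sketched with randomized response standing in for the encoder and a plain permutation standing in for the cryptographic shuffler; all parameters below are illustrative:

```python
import random

FLIP_P = 0.25                        # illustrative randomized-response rate

def encode(bit, rng):
    """Client-side encoder: flip the report with probability FLIP_P."""
    return bit if rng.random() > FLIP_P else 1 - bit

def shuffle(reports, rng):
    """Shuffler: discard ordering, breaking the user-report link."""
    reports = list(reports)
    rng.shuffle(reports)
    return reports

def analyze(reports):
    """Analyzer: debias the flipped reports to estimate the true mean."""
    observed = sum(reports) / len(reports)
    return (observed - FLIP_P) / (1 - 2 * FLIP_P)

rng = random.Random(0)
true_bits = [1] * 700 + [0] * 300
estimate = analyze(shuffle([encode(b, rng) for b in true_bits], rng))
```

The division of labor mirrors the architecture: the encoder limits what any single report reveals, the shuffler removes attribution, and the analyzer sees only an anonymous, noisy aggregate.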
Cache-Collision Timing Attacks Against AES
- Joseph Bonneau, Ilya Mironov
- Computer Science · Workshop on Cryptographic Hardware and Embedded…
- 10 October 2006
The most powerful attack has been shown under optimal conditions to reliably recover a full 128-bit AES key with 2^13 timing samples, an improvement of almost four orders of magnitude over the best previously published attacks of this type.
Amplification by Shuffling: From Local to Central Differential Privacy via Anonymity
- Ú. Erlingsson, V. Feldman, Ilya Mironov, A. Raghunathan, Kunal Talwar, Abhradeep Thakurta
- Computer Science · ACM-SIAM Symposium on Discrete Algorithms
- 29 November 2018
It is shown, via a new and general privacy amplification technique, that any permutation-invariant algorithm satisfying ε-local differential privacy will satisfy (O(ε·√(log(1/δ)/n)), δ)-central differential privacy.
Uncheatable Distributed Computations
This paper proposes security schemes that defend against cheating by ensuring that it does not pay off, while stronger schemes let participants prove that they have done most of the work they were assigned with high probability.
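The "cheating does not pay" argument reduces to simple probability in the spirit of the paper's ringer technique: seed the workload with k tasks whose answers the supervisor already knows, and a participant who skips a fraction f of the work is caught unless every seeded task happens to land in the part done honestly. The independence model below is a simplification for illustration:

```python
def detection_probability(cheat_fraction, num_ringers):
    """P(at least one known-answer task falls in the skipped fraction),
    assuming seeded tasks are placed independently and uniformly."""
    return 1.0 - (1.0 - cheat_fraction) ** num_ringers

p = detection_probability(cheat_fraction=0.2, num_ringers=20)
```

Even modest seeding makes large-scale cheating a losing bet: skipping 20% of the work with 20 seeded tasks is detected with probability above 98%, so the expected payoff of cheating goes negative once a penalty is attached.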