• Corpus ID: 233025377

Rejoinder: Gaussian Differential Privacy

Jinshuo Dong, Aaron Roth, and Weijie J. Su
We warmly thank Editor Paul Smith for selecting our paper for discussion and are extremely grateful to all the discussants for taking their valuable time to provide engaging and stimulating feedback on our work. These insights situate our work in context and provide promising directions for future research. We are excited to see that thoughts about theoretical complements and new applications are already emerging. A general view, shared by all discussants, is that privacy is a first-order… 


Deep Learning with Differential Privacy
This work develops new algorithmic techniques for learning and a refined analysis of privacy costs within the framework of differential privacy, and demonstrates that deep neural networks can be trained with non-convex objectives, under a modest privacy budget, and at a manageable cost in software complexity, training efficiency, and model quality.
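The core training step behind this line of work — clip each per-example gradient to a fixed norm, average, and add Gaussian noise — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function name and parameters are invented for this example.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, clip_norm, noise_multiplier, lr, rng):
    """One illustrative DP-SGD step.

    Each per-example gradient is clipped to L2 norm at most `clip_norm`,
    the clipped gradients are summed, Gaussian noise with standard
    deviation noise_multiplier * clip_norm is added, and the noisy sum
    is averaged over the batch before a plain gradient-descent update.
    """
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down (never up) so the gradient's norm is at most clip_norm.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    batch = len(clipped)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=params.shape)
    noisy_mean = (np.sum(clipped, axis=0) + noise) / batch
    return params - lr * noisy_mean
```

With `noise_multiplier=0` this reduces to ordinary clipped SGD, which makes the clipping behavior easy to check in isolation.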
Differentially private inference via noisy optimization
This work shows that robust statistics can be used in conjunction with noisy gradient descent or noisy Newton methods in order to obtain optimal private estimators with global linear or quadratic convergence, respectively, and establishes local and global convergence guarantees.
A Central Limit Theorem for Differentially Private Query Answering
It is proved that a mechanism is approximately Gaussian Differentially Private [DRS21] if the added noise satisfies certain conditions, and that the Gaussian mechanism achieves the optimal privacy-accuracy trade-off, sharp up to constants, among all such noise distributions.
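For context, the Gaussian mechanism referred to here adds Gaussian noise calibrated to a query's sensitivity; under Gaussian differential privacy, noise with standard deviation sensitivity/mu yields a mu-GDP release. The sketch below assumes this calibration; the function name is illustrative.

```python
import numpy as np

def gaussian_mechanism(value, sensitivity, mu, rng):
    """Release value + N(0, (sensitivity/mu)^2).

    Under Gaussian differential privacy, calibrating the noise scale to
    sensitivity / mu makes this release satisfy mu-GDP.
    """
    sigma = sensitivity / mu
    return value + rng.normal(0.0, sigma)
```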
Differentially Private Query Release Through Adaptive Projection
A new algorithm for releasing answers to very large numbers of statistical queries like k-way marginals, subject to differential privacy, makes adaptive use of a continuous relaxation of the Projection Mechanism, and outperforms existing algorithms on large query classes.
Sharp Composition Bounds for Gaussian Differential Privacy via Edgeworth Expansion
This work introduces a family of analytical and sharp privacy bounds under composition using the Edgeworth expansion in the framework of the recently proposed f-differential privacy to address a fundamental question in differential privacy regarding how the overall privacy bound degrades under composition.
New Oracle-Efficient Algorithms for Private Synthetic Data Release
Three new algorithms are presented for constructing differentially private synthetic data---a sanitized version of a sensitive dataset that approximately preserves the answers to a large collection of statistical queries. The algorithms are computationally efficient when given access to an optimization oracle.
Deep Learning with Gaussian Differential Privacy
This paper derives analytically tractable expressions for the privacy guarantees of both stochastic gradient descent and Adam as used in training deep neural networks, without the need to develop sophisticated techniques as in [3].
Gaussian differential privacy
The privacy guarantees of any hypothesis-testing-based definition of privacy (including the original differential privacy definition) converge to GDP in the limit under composition, and a Berry–Esseen style version of the central limit theorem is proved, which gives a computationally inexpensive tool for tractably analysing the exact composition of private algorithms.
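Two quantities from the GDP framework summarized above are simple enough to compute directly: the trade-off function G_mu(alpha) = Phi(Phi^{-1}(1 - alpha) - mu), and the composition rule stating that composing mu_i-GDP mechanisms yields sqrt(sum mu_i^2)-GDP. A minimal sketch, with illustrative function names:

```python
from statistics import NormalDist

def gdp_tradeoff(mu, alpha):
    """Trade-off function G_mu(alpha) = Phi(Phi^{-1}(1 - alpha) - mu).

    Gives the minimal type II error of any test distinguishing
    neighboring datasets at type I error level alpha, under mu-GDP.
    mu = 0 recovers the identity line 1 - alpha (perfect privacy).
    """
    nd = NormalDist()
    return nd.cdf(nd.inv_cdf(1 - alpha) - mu)

def compose_gdp(mus):
    """Composition: running mu_i-GDP mechanisms in sequence is
    sqrt(sum of mu_i^2)-GDP."""
    return sum(m * m for m in mus) ** 0.5
```

For example, composing a 3-GDP and a 4-GDP mechanism yields a 5-GDP guarantee, and any positive mu pushes the trade-off curve strictly below the identity line.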
PATE-GAN: Generating Synthetic Data with Differential Privacy Guarantees
This paper investigates a method for ensuring (differential) privacy of the generator of the Generative Adversarial Nets (GAN) framework, and modifies the Private Aggregation of Teacher Ensembles (PATE) framework and applies it to GANs.
Privacy-Preserving Generative Deep Neural Networks Support Clinical Data Sharing
Deep neural networks that generate synthetic participants facilitate secondary analyses and reproducible investigation of clinical datasets by enhancing data sharing while preserving participant privacy.