# Privately Learning Mixtures of Axis-Aligned Gaussians

```bibtex
@inproceedings{AdenAli2021PrivatelyLM,
  title     = {Privately Learning Mixtures of Axis-Aligned Gaussians},
  author    = {Ishaq Aden-Ali and Hassan Ashtiani and Christopher Liaw},
  booktitle = {NeurIPS},
  year      = {2021}
}
```

We consider the problem of learning mixtures of Gaussians under the constraint of approximate differential privacy. We prove that Õ(kd log(1/δ)/αε) samples are sufficient to learn a mixture of k axis-aligned Gaussians in R^d to within total variation distance α while satisfying (ε, δ)-differential privacy. This is the first result for privately learning mixtures of unbounded axis-aligned (or even unbounded univariate) Gaussians. If the covariance matrix of each of the Gaussians is the identity…
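For reference, the privacy notion used throughout is standard approximate differential privacy. A randomized algorithm $M$ is $(\varepsilon, \delta)$-differentially private if, for all pairs of datasets $S, S'$ differing in a single record and all measurable events $E$,

```latex
\Pr[M(S) \in E] \;\le\; e^{\varepsilon}\,\Pr[M(S') \in E] + \delta .
```

Pure differential privacy is the special case $\delta = 0$; the $\log(1/\delta)$ factor in the sample complexity above reflects the relaxation to $\delta > 0$.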

## 4 Citations

### Private and polynomial time algorithms for learning Gaussians and beyond

- Computer Science, COLT
- 2022

A polynomial-time (ε, δ)-DP algorithm for learning (unrestricted) Gaussian distributions in R^d is given; its sample complexity for learning the Gaussian up to total variation distance α matches, up to logarithmic factors, the best known sample-complexity upper bound.

### New Lower Bounds for Private Estimation and a Generalized Fingerprinting Lemma

- Computer Science, Mathematics, arXiv
- 2022

New lower bounds for statistical estimation tasks under the constraint of (ε, δ)-differential privacy are proved, and a tight Ω(d/(α²ε)) lower bound for estimating the mean of a distribution with bounded covariance to α-error in ℓ₂-distance is shown.

### Efficient mean estimation with pure differential privacy via a sum-of-squares exponential mechanism

- Computer Science, STOC
- 2022

This work gives the first polynomial-time algorithm to estimate the mean of a d-variate probability distribution with bounded covariance from Õ(d) independent samples subject to pure differential privacy, and proves a meta-theorem capturing this phenomenon.

### A Private and Computationally-Efficient Estimator for Unbounded Gaussians

- Computer Science, COLT
- 2022

The primary new technical tool in the algorithm is a new differentially private preconditioner that takes samples from an arbitrary Gaussian N(0, Σ) and returns a matrix A such that AΣAᵀ has constant condition number.
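The role of such a preconditioner can be illustrated non-privately: if Σ were known exactly, choosing A = Σ^(−1/2) would make AΣAᵀ the identity (condition number 1), whereas the private preconditioner above only needs to guarantee a constant condition number. A minimal NumPy sketch of the non-private ideal case, with an illustrative ill-conditioned Σ:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an ill-conditioned covariance matrix (condition number ~1e6).
d = 5
eigvals = np.logspace(0, 6, d)                      # eigenvalues from 1 to 1e6
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))    # random orthogonal basis
Sigma = Q @ np.diag(eigvals) @ Q.T

# Ideal (non-private) preconditioner: A = Sigma^{-1/2}.
w, V = np.linalg.eigh(Sigma)
A = V @ np.diag(w ** -0.5) @ V.T

M = A @ Sigma @ A.T                                 # ~ identity matrix
cond_before = np.linalg.cond(Sigma)
cond_after = np.linalg.cond(M)
print(cond_before, cond_after)                      # condition number drops from ~1e6 to ~1
```

The private setting is harder precisely because Σ is unknown and A must be computed from samples without leaking too much about any individual one; the sketch only shows what a perfect preconditioner achieves.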

## References

Showing 1–10 of 80 references.

### List-decodable robust mean estimation and learning mixtures of spherical gaussians

- Computer Science, STOC
- 2018

The problem of list-decodable (robust) Gaussian mean estimation and the related problem of learning mixtures of separated spherical Gaussians are studied, and a set of techniques that yields new efficient algorithms with significantly improved guarantees is developed.

### Differentially Private Algorithms for Learning Mixtures of Separated Gaussians

- Computer Science, 2020 Information Theory and Applications Workshop (ITA)
- 2020

This work gives a differentially private analogue of the algorithm of Achlioptas and McSherry, which has two key properties not achieved by prior work: the algorithm’s sample complexity matches that of the corresponding non-private algorithm up to lower order terms in a wide range of parameters.

### On the Sample Complexity of Privately Learning Unbounded High-Dimensional Gaussians

- Mathematics, Computer Science, ALT
- 2021

These are the first finite sample upper bounds for general Gaussians which do not impose restrictions on the parameters of the distribution and are near-optimal in the case when the covariance is known to be the identity.

### Privately Learning Markov Random Fields

- Computer Science, ICML
- 2020

It is shown that only structure learning under approximate differential privacy maintains the non-private logarithmic dependence on the dimensionality of the data, while a change in either the learning goal or the privacy notion would necessitate a polynomial dependence.

### Near-Optimal-Sample Estimators for Spherical Gaussian Mixtures

- Computer Science, Mathematics, NIPS
- 2014

The first sample-efficient polynomial-time estimator for high-dimensional spherical Gaussian mixtures is derived, and it is shown that any estimator requires Ω(dk/ε²) samples; hence the algorithm's sample complexity is nearly optimal in the dimension.

### Locally Private Gaussian Estimation

- Computer Science, Mathematics, NeurIPS
- 2019

This work provides both adaptive two-round and nonadaptive one-round solutions for locally private Gaussian estimation and partially matches these upper bounds with an information-theoretic lower bound.

### Between Pure and Approximate Differential Privacy

- Computer Science, J. Priv. Confidentiality
- 2016

New pure and approximate differentially private algorithms are given for answering arbitrary statistical queries; they improve on the sample complexity of the standard Laplace and Gaussian mechanisms for achieving worst-case accuracy guarantees by a logarithmic factor.

### List Decodable Mean Estimation in Nearly Linear Time

- Computer Science, 2020 IEEE 61st Annual Symposium on Foundations of Computer Science (FOCS)
- 2020

This paper considers robust statistics in the presence of overwhelming outliers, where the majority of the dataset is introduced adversarially. It develops an algorithm for list-decodable mean estimation in this setting that achieves the information-theoretically optimal recovery guarantee up to constants, optimal sample complexity, and a nearly linear running time, up to polylogarithmic factors in the dimension.

### On the geometry of differential privacy

- Computer Science, Mathematics, STOC '10
- 2010

The lower bound is strong enough to separate the concept of differential privacy from the notion of approximate differential privacy, where an upper bound of O(√d/ε) can be achieved.

### Simultaneous Private Learning of Multiple Concepts

- Computer Science, ITCS
- 2016

Lower bounds are given showing that, even for very simple concept classes, the sample cost of private multi-learning must grow polynomially in k; multi-learners that require fewer samples than the basic strategy are also given.