• Corpus ID: 235683128

Estimating Gaussian mixtures using sparse polynomial moment systems

  author: Julia Lindberg, Carlos Améndola, Jose Israel Rodriguez
The method of moments is a statistical technique for density estimation that solves a system of moment equations to estimate the parameters of an unknown distribution. A fundamental question critical to understanding identifiability asks how many moment equations are needed to get finitely many solutions and how many solutions there are. We answer this question for classes of Gaussian mixture models using the tools of polyhedral geometry. Using these results, we present an algorithm that… 
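To illustrate the method of moments described above, here is a minimal sketch (not the paper's algorithm) for a hypothetical two-component univariate Gaussian mixture with equal weights and unit variances, where the unknown means are recovered as the roots of a polynomial built from the first two sample moments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: equal-weight mixture of N(mu1, 1) and N(mu2, 1);
# the unknowns are the two means.
mu1_true, mu2_true = -2.0, 3.0
n = 200_000
comp = rng.integers(0, 2, size=n)
x = np.where(comp == 0,
             rng.normal(mu1_true, 1.0, n),
             rng.normal(mu2_true, 1.0, n))

# Moment equations (a small polynomial system in mu1, mu2):
#   E[X]   = (mu1 + mu2) / 2
#   E[X^2] = (mu1^2 + mu2^2) / 2 + 1
m1_hat = x.mean()        # sample first moment
m2_hat = (x**2).mean()   # sample second moment

# Eliminating: mu1 + mu2 = s and mu1*mu2 = p, so the means are the
# roots of t^2 - s*t + p = 0.
s = 2.0 * m1_hat
p = (s**2 - 2.0 * (m2_hat - 1.0)) / 2.0
roots = np.roots([1.0, -s, p])
print(np.sort(np.real(roots)))  # close to the true means (-2, 3)
```

This is the simplest instance of solving a moment system; the paper's contribution concerns how many such equations are needed, and how many solutions they have, for richer Gaussian mixture classes.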


Tensor Moments of Gaussian Mixture Models: Theory and Applications
This work develops theory and numerical methods for implicit computations with moment tensors of GMMs, reducing the computational and storage costs to O(n) and O(n³), respectively, for general covariance matrices, and for diagonal ones.


Optimal estimation of Gaussian mixtures via denoised method of moments
By proving new moment comparison theorems in the Wasserstein distance via polynomial interpolation and majorization techniques, this paper establishes the statistical guarantees and adaptive optimality of the proposed procedure, as well as an oracle inequality in misspecified models.
Efficiently learning mixtures of two Gaussians
This work provides a polynomial-time algorithm for this problem for the case of two Gaussians in $n$ dimensions (even if they overlap), with provably minimal assumptions on the Gaussians and polynomial data requirements, and efficiently performs near-optimal clustering.
Maximum Likelihood Estimates for Gaussian Mixtures Are Transcendental
This work examines Gaussian mixture models through the lens of algebraic statistics, and finds that the critical points of the likelihood function are transcendental, and there is no bound on their number, even for mixtures of two univariate Gaussians.
Settling the Polynomial Learnability of Mixtures of Gaussians
  • Ankur Moitra, G. Valiant
  • Computer Science
    2010 IEEE 51st Annual Symposium on Foundations of Computer Science
  • 2010
This paper gives the first polynomial time algorithm for proper density estimation for mixtures of k Gaussians that needs no assumptions on the mixture, and proves that such a dependence is necessary.
Moment Varieties of Gaussian Mixtures
The points of a moment variety are the vectors of all moments up to some order of a family of probability distributions. We study this variety for mixtures of Gaussians. Following up on Pearson's
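As background for this entry (standard facts about Gaussian moments, not taken from the snippet), the moments that such a variety parametrizes begin, for a single univariate Gaussian $N(\mu,\sigma^2)$:

```latex
m_1 = \mu, \qquad
m_2 = \mu^2 + \sigma^2, \qquad
m_3 = \mu^3 + 3\mu\sigma^2, \qquad
m_4 = \mu^4 + 6\mu^2\sigma^2 + 3\sigma^4.
```

Mixture moments are convex combinations of such expressions, which is what makes these varieties polynomial objects amenable to algebraic study.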
The proposed algorithms are more robust than the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), two popular methods in the literature, when the Gaussian assumption is violated.
Polynomial Learning of Distribution Families
It is shown that parameters of a Gaussian mixture distribution with fixed number of components can be learned using a sample whose size is polynomial in dimension and all other parameters.
Moment-Based Learning of Mixture Distributions (SPUR Final Paper, Summer 2016)
We study the problem of learning the parameters of a mixture of members of a given distribution family. To do this, we apply the method of moments, dating to Pearson in the late 1800s: we directly
Algebraic Identifiability of Gaussian Mixtures
We prove that all moment varieties of univariate Gaussian mixtures have the expected dimension. Our approach rests on intersection theory and Terracini's classification of defective surfaces. The
Settling the robust learnability of mixtures of Gaussians
This work gives the first provably robust algorithm for learning mixtures of any constant number of Gaussians, based on a new method for proving dimension-independent polynomial identifiability that applies a carefully chosen sequence of differential operations to certain generating functions.