• Corpus ID: 235683128

Estimating Gaussian mixtures using sparse polynomial moment systems

@inproceedings{Lindberg2021EstimatingGM,
  title={Estimating Gaussian mixtures using sparse polynomial moment systems},
  author={Julia Lindberg and Carlos Améndola and Jose Israel Rodriguez},
  year={2021}
}
The method of moments is a statistical technique for density estimation that solves a system of moment equations to estimate the parameters of an unknown distribution. A fundamental question critical to understanding identifiability asks how many moment equations are needed to get finitely many solutions and how many solutions there are. We answer this question for classes of Gaussian mixture models using the tools of polyhedral geometry. Using these results, we present an algorithm that… 
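To make the moment-equation setup concrete, here is a minimal sketch (not the paper's algorithm) of the method of moments for a mixture of two univariate Gaussians: the first five sample moments are matched to the model moments and the resulting polynomial system is solved numerically. The simulated data, the starting point, and the use of scipy.optimize.fsolve are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import fsolve

def gaussian_moments(mu, var, order):
    """Non-central moments E[X^r] for r = 0..order of N(mu, var), using the
    recursion E[X^r] = mu*E[X^(r-1)] + (r-1)*var*E[X^(r-2)]."""
    m = [1.0, mu]
    for r in range(2, order + 1):
        m.append(mu * m[r - 1] + (r - 1) * var * m[r - 2])
    return m

def mixture_moments(params, order):
    """Moments m_1..m_order of the mixture lam*N(mu1, v1) + (1-lam)*N(mu2, v2)."""
    lam, mu1, v1, mu2, v2 = params
    g1 = gaussian_moments(mu1, v1, order)
    g2 = gaussian_moments(mu2, v2, order)
    return [lam * g1[r] + (1 - lam) * g2[r] for r in range(1, order + 1)]

# Simulated data from a known mixture: weight 0.3 on N(-1, 0.5^2), weight 0.7 on N(2, 1).
rng = np.random.default_rng(0)
labels = rng.random(200_000) < 0.3
data = np.where(labels, rng.normal(-1.0, 0.5, labels.size),
                rng.normal(2.0, 1.0, labels.size))
sample_moments = np.array([np.mean(data ** r) for r in range(1, 6)])

# Five moment equations in the five unknowns (lam, mu1, v1, mu2, v2).
def moment_equations(params):
    return np.array(mixture_moments(params, 5)) - sample_moments

# Note: like any local solver, fsolve may return a label-swapped or spurious
# root from a different starting point.
estimate = fsolve(moment_equations, x0=[0.4, -0.5, 0.4, 1.5, 1.2])
print("estimated (lam, mu1, v1, mu2, v2):", np.round(estimate, 3))
```

Pearson's original treatment of this k = 2 univariate case reduces the same five equations to a single polynomial of degree nine in one variable; a generic numerical root-finder is used above only to keep the sketch short.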

Tensor Moments of Gaussian Mixture Models: Theory and Applications

This work develops theory and numerical methods for implicit computations with moment tensors of GMMs, reducing the computational and storage costs to O(n) and O(n^3), respectively, for general covariance matrices, and for diagonal ones.
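As background (a standard identity, not a claim about that paper's implicit algorithms), the order-three moment tensor of a mixture of Gaussians with weights $\lambda_j$, means $\mu_j$, and covariances $\Sigma_j$ can be written entrywise as

```latex
% Third-order moment tensor of the mixture \sum_j \lambda_j N(\mu_j, \Sigma_j).
M_3 = \mathbb{E}[x \otimes x \otimes x], \qquad
(M_3)_{abc} = \sum_j \lambda_j \bigl( \mu_{j,a}\mu_{j,b}\mu_{j,c}
  + \mu_{j,a}(\Sigma_j)_{bc} + \mu_{j,b}(\Sigma_j)_{ac} + \mu_{j,c}(\Sigma_j)_{ab} \bigr)
```

which makes clear why forming such tensors explicitly becomes expensive as the order and the dimension n grow.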

Certifying zeros of polynomial systems using interval arithmetic

The software HomotopyContinuation.jl now has a built-in function certify, which proves the correctness of an isolated solution to a square system of polynomial equations and dramatically outperforms earlier approaches to certification.
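To make the idea concrete, here is a minimal sketch of interval Newton certification for one polynomial in one variable. It is not the certify routine itself (that is a Julia function in HomotopyContinuation.jl); the toy Interval class (which skips directed rounding), the cubic, and the candidate interval are all illustrative assumptions.

```python
class Interval:
    """Closed interval [lo, hi].  Outward rounding is omitted, so this
    illustrates the certification test rather than giving a rigorous proof."""

    def __init__(self, lo, hi=None):
        self.lo = float(lo)
        self.hi = float(lo if hi is None else hi)

    def __add__(self, other):
        o = _iv(other)
        return Interval(self.lo + o.lo, self.hi + o.hi)
    __radd__ = __add__

    def __sub__(self, other):
        o = _iv(other)
        return Interval(self.lo - o.hi, self.hi - o.lo)

    def __rsub__(self, other):
        return _iv(other) - self

    def __mul__(self, other):
        o = _iv(other)
        p = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return Interval(min(p), max(p))
    __rmul__ = __mul__

    def __truediv__(self, other):
        o = _iv(other)
        assert not (o.lo <= 0.0 <= o.hi), "divisor interval contains zero"
        return self * Interval(1.0 / o.hi, 1.0 / o.lo)

    def __rtruediv__(self, other):
        return _iv(other) / self

    def mid(self):
        return 0.5 * (self.lo + self.hi)

    def strictly_contains(self, other):
        return self.lo < other.lo and other.hi < self.hi


def _iv(x):
    return x if isinstance(x, Interval) else Interval(x)


def f(x):       # the polynomial whose root we certify (an illustrative choice)
    return x * x * x - 2 * x - 5

def fprime(x):  # its derivative, evaluated in interval arithmetic when x is an Interval
    return 3 * x * x - 2

X = Interval(2.0, 2.2)                 # candidate box around a numerically found root
N = X.mid() - f(X.mid()) / fprime(X)   # interval Newton operator N(X)
print("N(X) = [%.6f, %.6f]" % (N.lo, N.hi))
print("unique zero certified in X:", X.strictly_contains(N))
```

When N(X) lands strictly inside X, interval arithmetic with outward rounding would constitute a proof that X contains exactly one zero of f; certify carries out a related multivariate interval test with rigorously rounded arithmetic for square systems.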

References

Showing 1-10 of 66 references

Efficiently learning mixtures of two Gaussians

This work provides a polynomial-time algorithm for this problem for the case of two Gaussians in $n$ dimensions (even if they overlap), with provably minimal assumptions on the Gaussians and polynomial data requirements, and it efficiently performs near-optimal clustering.

Maximum Likelihood Estimates for Gaussian Mixtures Are Transcendental

This work examines Gaussian mixture models through the lens of algebraic statistics and finds that the critical points of the likelihood function are transcendental and that there is no bound on their number, even for mixtures of two univariate Gaussians.

Settling the Polynomial Learnability of Mixtures of Gaussians

  • Ankur Moitra, G. Valiant
  • Computer Science
    2010 IEEE 51st Annual Symposium on Foundations of Computer Science
  • 2010
This paper gives the first polynomial time algorithm for proper density estimation for mixtures of k Gaussians that needs no assumptions on the mixture; the running time is exponential in k, and the paper proves that such a dependence is necessary.

Moment Varieties of Gaussian Mixtures

The points of a moment variety are the vectors of all moments up to some order of a family of probability distributions. We study this variety for mixtures of Gaussians. Following up on Pearson's …
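For concreteness (standard formulas stated here as background, not quoted from that abstract), the low-order non-central moments of a single univariate Gaussian $N(\mu, \sigma^2)$, whose image in moment coordinates gives the simplest of these moment varieties, are

```latex
% Non-central moments m_r = E[X^r] of X ~ N(\mu, \sigma^2).
m_1 = \mu, \qquad
m_2 = \mu^2 + \sigma^2, \qquad
m_3 = \mu^3 + 3\mu\sigma^2, \qquad
m_4 = \mu^4 + 6\mu^2\sigma^2 + 3\sigma^4
```

and the moments of a mixture are the corresponding weighted sums over its components.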

Identifying the Number of Components in Gaussian Mixture Models Using Numerical Algebraic Geometry

The proposed algorithms are compared with the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), popular methods in the literature, and are more robust when the Gaussian assumption is violated.

Ten Steps of EM Suffice for Mixtures of Two Gaussians

This work shows that the population version of EM, where the algorithm is given access to infinitely many samples from the mixture, converges geometrically to the correct mean vectors, and provides simple, closed-form expressions for the convergence rate.
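For reference, a minimal finite-sample EM sketch for a two-component univariate Gaussian mixture is shown below; the data, the initialization, and the ten-iteration budget are illustrative assumptions, and the cited result concerns the population (infinite-sample) version of these updates.

```python
import numpy as np

def em_two_gaussians(x, lam, mu, var, steps=10):
    """Run `steps` EM iterations for a two-component univariate Gaussian mixture.
    lam: mixture weights (2,), mu: means (2,), var: variances (2,)."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        # E-step: posterior responsibility of each component for each point.
        dens = np.stack([
            lam[j] * np.exp(-0.5 * (x - mu[j]) ** 2 / var[j]) / np.sqrt(2 * np.pi * var[j])
            for j in range(2)
        ])
        resp = dens / dens.sum(axis=0)
        # M-step: reweighted means, variances, and mixing proportions.
        nk = resp.sum(axis=1)
        mu = (resp * x).sum(axis=1) / nk
        var = (resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk
        lam = nk / x.size
    return lam, mu, var

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 1, 5000), rng.normal(2, 1, 5000)])
print(em_two_gaussians(data, lam=np.array([0.5, 0.5]),
                       mu=np.array([-1.0, 1.0]), var=np.array([1.0, 1.0])))
```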

Polynomial Learning of Distribution Families

It is shown that the parameters of a Gaussian mixture distribution with a fixed number of components can be learned using a sample whose size is polynomial in the dimension and in all other parameters.

Moment-Based Learning of Mixture Distributions (SPUR Final Paper, Summer 2016)

We study the problem of learning the parameters of a mixture of members of a given distribution family. To do this, we apply the method of moments, dating to Pearson in the late 1800s: we directly …

Settling the robust learnability of mixtures of Gaussians

This work gives the first provably robust algorithm for learning mixtures of any constant number of Gaussians, based on a new method for proving dimension-independent polynomial identifiability that applies a carefully chosen sequence of differential operations to certain generating functions.

Learning mixtures of arbitrary Gaussians

This paper presents the first algorithm that provably learns the component Gaussians in time that is polynomial in the dimension.
...