A Riemannian Newton Trust-Region Method for Fitting Gaussian Mixture Models

  • Lena Sembach, Jan Pablo Burgard, Volker Schulz
  • Statistics and Computing
Gaussian Mixture Models are a powerful tool in data science and statistics, used mainly for clustering and density approximation. The task of estimating the model parameters is, in practice, often solved by the expectation-maximization (EM) algorithm, which has the benefits of simplicity and low per-iteration cost. However, EM converges slowly when there is a large share of hidden information or when clusters overlap. Recent advances in manifold optimization for Gaussian Mixture… 
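The EM iteration the abstract refers to alternates closed-form E- and M-steps. A minimal sketch for a one-dimensional Gaussian mixture (illustrative only; the function name and initialization strategy are assumptions, not taken from the paper):

```python
import numpy as np

def em_gmm_1d(x, k, iters=50, seed=0):
    """Minimal EM for a 1-D Gaussian mixture (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialize mixture weights, means, and variances.
    w = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, np.var(x))
    for _ in range(iters):
        # E-step: posterior responsibilities r[i, j] = P(component j | x_i).
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: closed-form weighted updates of the parameters.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```

The slow-convergence issue arises because these updates shrink the objective only modestly per step when responsibilities are diffuse, which is the setting where second-order Riemannian methods can help.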

Matrix Manifold Optimization for Gaussian Mixtures

This work advances Riemannian manifold optimization (on the manifold of positive definite matrices) as a potential replacement for expectation maximization (EM) and develops a well-tuned Riemannian LBFGS method that proves superior to known competing methods (e.g., Riemannian conjugate gradient).

Optimization Algorithms on Matrix Manifolds

Optimization Algorithms on Matrix Manifolds offers techniques with broad applications in linear algebra, signal processing, data mining, computer vision, and statistical analysis and will be of interest to applied mathematicians, engineers, and computer scientists.

Learning mixtures of Gaussians

  • S. Dasgupta
  • Computer Science
    40th Annual Symposium on Foundations of Computer Science (Cat. No.99CB37039)
  • 1999
This work presents the first provably correct algorithm for learning a mixture of Gaussians, which returns the true centers of the Gaussians to within the precision specified by the user, with high probability.

Automatic Preconditioning by Limited Memory Quasi-Newton Updating

A preconditioner for the conjugate gradient method is proposed that is designed for solving systems of equations Ax = b_i with different right-hand-side vectors, or for solving a sequence of slowly varying systems A_k x = b_k.

Predicting CO and NOx emissions from gas turbines: novel data and a benchmark PEMS

A novel PEMS dataset collected over five years from a gas turbine for the predictive modeling of the CO and NOx emissions is introduced and a benchmark experimental procedure for comparability of future works on the data is presented.

k-means++: the advantages of careful seeding

By augmenting k-means with a very simple, randomized seeding technique, this work obtains an algorithm that is Θ(log k)-competitive with the optimal clustering.
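The seeding technique described above chooses each new center with probability proportional to its squared distance from the nearest center already chosen. A minimal sketch (the function name is an assumption for illustration):

```python
import numpy as np

def kmeanspp_seeds(X, k, seed=0):
    """k-means++ seeding: sample each new center with probability
    proportional to the squared distance to the nearest chosen center."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # First center: uniform at random.
    centers = [X[rng.integers(n)]]
    for _ in range(k - 1):
        # Squared distance from each point to its nearest chosen center.
        d2 = np.min([np.sum((X - c) ** 2, axis=1) for c in centers], axis=0)
        # D^2-weighted sampling of the next center.
        centers.append(X[rng.choice(n, p=d2 / d2.sum())])
    return np.array(centers)
```

These seeds then initialize ordinary Lloyd iterations; the same D^2-weighted initialization is also a common way to start EM for Gaussian mixtures.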

UCI machine learning repository

  • 2017

Estimation and computations for Gaussian mixtures with uniform noise under separation constraints

  • Pietro Coretto
  • Computer Science, Mathematics
    Statistical Methods & Applications
  • 2021
It is shown how the methods developed in this paper are useful for several fundamental data analysis tasks: outlier identification, robust location-scale estimation, clustering, and density estimation.