Adaptive Seeding for Gaussian Mixture Models

@inproceedings{Blmer2016AdaptiveSF,
  title={Adaptive Seeding for Gaussian Mixture Models},
  author={J. Bl{\"o}mer and Kathrin Bujna},
  booktitle={PAKDD},
  year={2016}
}
We present new initialization methods for the expectation-maximization algorithm for multivariate Gaussian mixture models. Our methods are adaptations of the well-known K-means++ initialization and the Gonzalez algorithm. We thereby aim to close the gap between simple random methods (e.g., uniform seeding) and complex methods that crucially depend on the right choice of hyperparameters. Our extensive experiments indicate the usefulness of our methods compared to common techniques and methods, which e.g. apply…
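The K-means++ idea the abstract builds on can be sketched as follows. This is the classic seeding scheme of Arthur and Vassilvitskii, not the paper's specific GMM adaptation (which modifies the weighting for mixture models); the seeds can then serve as initial component means for EM, with covariances and weights initialized in some default way.

```python
import numpy as np

def kmeans_pp_seeds(X, k, rng=None):
    """Pick k initial centers from the rows of X via K-means++ seeding.

    Illustrative sketch only: the first center is drawn uniformly, each
    further center with probability proportional to its squared distance
    to the nearest center chosen so far.
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    # First center: uniform at random from the data.
    centers = [X[rng.integers(n)]]
    for _ in range(k - 1):
        C = np.asarray(centers)
        # Squared distance of each point to its nearest chosen center.
        d2 = np.min(((X[:, None, :] - C[None, :, :]) ** 2).sum(-1), axis=1)
        # Sample the next center with probability proportional to d2.
        centers.append(X[rng.choice(n, p=d2 / d2.sum())])
    return np.asarray(centers)
```

As a GMM initializer, one would typically set the component means to these seeds, the covariances to (scaled) identity matrices, and the mixing weights to 1/k before running EM.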
