The Infinite Gaussian Mixture Model
@inproceedings{Rasmussen1999TheIG,
  title     = {The Infinite Gaussian Mixture Model},
  author    = {Carl E. Rasmussen},
  booktitle = {NIPS},
  year      = {1999}
}
In a Bayesian mixture model it is not necessary a priori to limit the number of components to be finite. In this paper an infinite Gaussian mixture model is presented which neatly sidesteps the difficult problem of finding the "right" number of mixture components. Inference in the model is done using an efficient parameter-free Markov chain that relies entirely on Gibbs sampling.
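The Gibbs-sampling inference the abstract describes can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes a 1-D mixture with a known, fixed component variance `sigma2` and a conjugate normal prior `N(mu0, tau02)` on component means, so the means can be integrated out analytically and only the cluster assignments are resampled under a Chinese restaurant process prior with concentration `alpha`. (The paper instead places priors on all component parameters, including per-component precisions, and samples the hyperparameters as well.) The function name `crp_gibbs` and all hyperparameter values are illustrative choices, not from the source.

```python
import numpy as np

def normal_pdf(x, mean, var):
    """Density of a univariate normal N(mean, var) at x."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

def crp_gibbs(x, alpha=1.0, sigma2=1.0, mu0=0.0, tau02=25.0,
              n_iters=50, seed=0):
    """Collapsed Gibbs sampling for a 1-D Dirichlet-process mixture of
    Gaussians with known component variance sigma2 and conjugate
    N(mu0, tau02) prior on the means (means integrated out)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    z = np.zeros(n, dtype=int)           # start with all points in one cluster
    counts = {0: n}                      # cluster -> number of members
    sums = {0: float(np.sum(x))}         # cluster -> sum of member values
    for _ in range(n_iters):
        for i in range(n):
            # Remove point i from its current cluster.
            k = z[i]
            counts[k] -= 1
            sums[k] -= x[i]
            if counts[k] == 0:
                del counts[k], sums[k]
            labels = list(counts)
            probs = []
            for kk in labels:
                # Posterior predictive of x[i] under existing cluster kk,
                # weighted by the CRP probability counts[kk] / (n - 1 + alpha).
                prec = 1.0 / tau02 + counts[kk] / sigma2
                post_var = 1.0 / prec
                post_mean = post_var * (mu0 / tau02 + sums[kk] / sigma2)
                probs.append(counts[kk] *
                             normal_pdf(x[i], post_mean, post_var + sigma2))
            # Predictive under a brand-new cluster, weighted by alpha.
            probs.append(alpha * normal_pdf(x[i], mu0, tau02 + sigma2))
            probs = np.array(probs)
            probs /= probs.sum()
            choice = rng.choice(len(probs), p=probs)
            if choice == len(labels):    # open a new cluster
                k = max(counts, default=-1) + 1
                counts[k] = 0
                sums[k] = 0.0
            else:
                k = labels[choice]
            z[i] = k
            counts[k] += 1
            sums[k] += x[i]
    return z
```

Because the number of occupied clusters grows and shrinks as points are reassigned, the sampler never needs the "right" number of components to be fixed in advance; on well-separated data it concentrates on a small number of occupied clusters.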
Citations
Publications citing this paper (showing 1–10 of 718 citations).
- General Bayesian inference schemes in infinite mixture models (cites methods & background; highly influenced)
- Bayesian mechanisms in spatial cognition: towards real-world capable computational cognitive models of spatial memory (cites background & methods; highly influenced)
- Infant directed speech is consistent with teaching (cites methods & background; highly influenced)
- A random finite set model for data clustering (cites background & methods; highly influenced)
- Uncertainty quantification and integration in engineering systems (cites background; highly influenced)
- Hierarchical Dirichlet Processes (cites methods & background; highly influenced)
- Infinite Generalized Gaussian Mixture Modeling and Applications (cites methods; highly influenced)
- Likelihood-based representation of epistemic uncertainty due to sparse point data and/or interval data (cites background; highly influenced)
- A Learning-Based Two-Stage Spectrum Sharing Strategy With Multiple Primary Transmit Power Levels (cites background; highly influenced)
- Online Fault Diagnosis in Industrial Processes Using Multimodel Exponential Discriminant Analysis Algorithm (cites background; highly influenced)
Citation statistics: 122 highly influenced citations; an average of 54 citations per year from 2017 through 2019.
References
Publications referenced by this paper (showing 1–6 of 6).
- Antoniak, C. E. (1974). Mixtures of Dirichlet Processes with Applications to Bayesian Nonparametric Problems. (highly influential)
- Neal, R. M. (1996). Bayesian Learning for Neural Networks.
- Gilks, W. R. and Wild, P. (1992). Adaptive Rejection Sampling for Gibbs Sampling.