Mixture Models With a Prior on the Number of Components

@article{Miller2015MixtureMW,
  title={Mixture Models With a Prior on the Number of Components},
  author={Jeffrey W. Miller and Matthew T. Harrison},
  journal={Journal of the American Statistical Association},
  year={2018},
  volume={113},
  pages={340--356}
}
Abstract: A natural Bayesian approach for mixture models with an unknown number of components is to take the usual finite mixture model with symmetric Dirichlet weights, and put a prior on the number of components—that is, to use a mixture of finite mixtures (MFM). The most commonly used method of inference for MFMs is reversible jump Markov chain Monte Carlo, but it can be nontrivial to design good reversible jump moves, especially in high-dimensional spaces. Meanwhile, there are samplers for…
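Stated as a hierarchical model (the symbols $p_K$, $\gamma$, $H$, and $f_\theta$ are notation introduced here, not taken from the abstract), the MFM described above is

$$K \sim p_K, \qquad (\pi_1, \dots, \pi_K) \mid K \sim \mathrm{Dirichlet}_K(\gamma, \dots, \gamma),$$
$$\theta_1, \dots, \theta_K \mid K \overset{\text{iid}}{\sim} H, \qquad Z_i \mid \pi \sim \pi, \qquad X_i \mid Z_i, \theta \sim f_{\theta_{Z_i}},$$

where $p_K$ is the prior on the number of components, $H$ is the prior on component parameters, and $f_\theta$ is the component density.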

Finite mixture models are typically inconsistent for the number of components

Rigor is added to data-analysis folk wisdom by proving that, under even the slightest model misspecification, the FMM posterior on the number of components is severely inconsistent: for any finite $k \in \mathbb{N}$, the posterior probability that the number of components is $k$ converges to 0 in the limit of infinite data.
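In symbols (with $\Pi$ the FMM posterior and $X_{1:n}$ the data, notation introduced here for concreteness), the claim is that under misspecification

$$\Pi(K = k \mid X_{1:n}) \longrightarrow 0 \quad \text{as } n \to \infty, \quad \text{for every fixed } k \in \mathbb{N},$$

so no finite number of components retains posterior mass in the large-sample limit.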

Bayesian mixture models (in)consistency for the number of clusters

It is shown that a post-processing algorithm introduced by Guha et al. (2021) for the Dirichlet process extends to more general models and provides a consistent method to estimate the number of components; possible solutions to the inconsistency are also discussed.

IS INFINITY THAT FAR? A BAYESIAN NONPARAMETRIC PERSPECTIVE OF FINITE MIXTURE MODELS

A new class of priors is introduced, the Normalized Independent Point Process, together with an auxiliary-variable MCMC scheme that handles the otherwise intractable posterior distribution and overcomes the challenges associated with the Reversible Jump algorithm.

Gibbs sampling for mixtures in order of appearance: the ordered allocation sampler

This work derives a sampler that is straightforward to implement for mixing distributions with tractable size-biased ordered weights and mitigates the label-switching problem in infinite mixtures.

MCMC Computations for Bayesian Mixture Models Using Repulsive Point Processes

A general framework is presented for mixture models in which the prior on the “cluster centers” is a finite repulsive point process depending on a hyperparameter, specified by a density that may involve an intractable normalizing constant.

Variance matrix priors for Dirichlet process mixture models with Gaussian kernels

The results show that the choice of prior is critical for deriving reliable posterior inferences in problems of higher dimensionality; the DPMM methodology, used here for clustering, is also applicable to density estimation.

Clustering consistency with Dirichlet process mixtures

This work focuses on consistency for the unknown number of clusters when the observed data are generated from a finite mixture, and considers the situation where a prior is placed on the concentration parameter of the underlying Dirichlet process.

The Pitman–Yor multinomial process for mixture modelling

Discrete nonparametric priors play a central role in a variety of Bayesian procedures, most notably when used to model latent features as in clustering, mixtures and curve fitting. They are effective…

On posterior contraction of parameters and interpretability in Bayesian mixture modeling

It will be shown that the modeling choice of kernel density functions plays perhaps the most impactful role in determining the posterior contraction rates in misspecified situations.

General Bayesian inference schemes in infinite mixture models

This thesis shows how to overcome certain intractabilities in order to obtain analogous compact representations for the class of Poisson-Kingman priors, which includes the Dirichlet and Pitman-Yor processes.
...

References

Showing 1-10 of 136 references.

Inconsistency of Pitman-Yor process mixtures for the number of components

It is shown that, for data from a finite mixture, the posterior does not concentrate at the true number of components; this result applies to a large class of nonparametric mixtures, including DPMs and PYMs, over a wide variety of families of component distributions.

A simple example of Dirichlet process mixture inconsistency for the number of components

An elementary proof of this inconsistency is given in what is perhaps the simplest possible setting: a DPM with normal components of unit variance, applied to data from a "mixture" with one standard normal component.
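A minimal simulation sketch can make this simplest setting concrete. This is not the paper's code: the base measure $N(0, 1)$ on cluster means, the concentration $\alpha = 1$, and the use of Neal's Algorithm 3 (collapsed Gibbs) are all assumptions made here for illustration.

import numpy as np

# Sketch of the paper's setting: a DPM with unit-variance normal components
# fit to data from a single standard normal component, tracking the number
# of occupied clusters. Base measure N(0, 1) on cluster means, alpha = 1,
# and Neal's Algorithm 3 (collapsed Gibbs) are assumptions made here.
rng = np.random.default_rng(0)
n, alpha, s0sq = 200, 1.0, 1.0   # sample size, DP concentration, prior variance of means
x = rng.standard_normal(n)       # data: one standard normal "mixture" component

z = np.zeros(n, dtype=int)       # cluster labels; start with one big cluster
counts = {0: n}                  # cluster label -> number of points
sums = {0: float(x.sum())}       # cluster label -> sum of its points

def predictive(xi, cnt, s):
    # Posterior predictive density at xi for a unit-variance normal cluster
    # with a N(0, s0sq) prior on its mean, given cnt points summing to s.
    prec = cnt + 1.0 / s0sq
    m, v = s / prec, 1.0 / prec
    return np.exp(-0.5 * (xi - m) ** 2 / (v + 1.0)) / np.sqrt(2 * np.pi * (v + 1.0))

k_trace = []
for sweep in range(500):
    for i in range(n):
        c = z[i]                 # remove x[i] from its current cluster
        counts[c] -= 1
        sums[c] -= x[i]
        if counts[c] == 0:
            del counts[c], sums[c]
        labels = list(counts)
        w = [counts[c2] * predictive(x[i], counts[c2], sums[c2]) for c2 in labels]
        w.append(alpha * predictive(x[i], 0, 0.0))   # weight for a brand-new cluster
        w = np.asarray(w)
        j = rng.choice(len(w), p=w / w.sum())
        c = labels[j] if j < len(labels) else max(counts, default=-1) + 1
        z[i] = c
        counts[c] = counts.get(c, 0) + 1
        sums[c] = sums.get(c, 0.0) + x[i]
    k_trace.append(len(counts))

# Histogram of posterior draws of the number of clusters K: the mass on
# K = 1 does not approach 1, even though the data have a single component.
print(np.bincount(k_trace[250:]))

Tracking the number of occupied clusters across sweeps shows the posterior mass on a single cluster failing to approach 1 even as $n$ grows, which is the inconsistency the paper proves.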

Variational inference for Dirichlet process mixtures

A variational inference algorithm for DP mixtures is presented, along with experiments comparing the algorithm to Gibbs sampling algorithms for DP mixtures of Gaussians and an application to a large-scale image analysis problem.

Bayesian Model Selection in Finite Mixtures by Marginal Density Decompositions

A weighted Bayes factor method is developed for consistently estimating the number of components d; it can be implemented by an iid generalized weighted Chinese restaurant (GWCR) Monte Carlo algorithm, and the performance of the new GWCR model selection procedure is compared with that of the Akaike information criterion and the Bayesian information criterion implemented through an EM algorithm.

Hierarchical Mixture Modeling With Normalized Inverse-Gaussian Priors

In recent years the Dirichlet process prior has experienced a great success in the context of Bayesian mixture modeling. The idea of overcoming discreteness of its realizations by exploiting it in…

Slice sampling mixture models

A more efficient version of the slice sampler for Dirichlet process mixture models described by Walker allows for the fitting of infinite mixture models with a wide range of prior specifications and considers priors defined through infinite sequences of independent positive random variables.
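For context, the augmentation underlying Walker's slice sampler (notation introduced here, not taken from the snippet) pairs each observation $x_i$ with a uniform latent variable $u_i$ so that

$$p(x_i, u_i \mid w, \theta) = \sum_{k=1}^{\infty} \mathbb{1}\{u_i < w_k\}\, f_{\theta_k}(x_i);$$

integrating out $u_i$ recovers the mixture $\sum_k w_k f_{\theta_k}(x_i)$, while conditional on $u_i$ only the finitely many components with $w_k > u_i$ are active, which is what makes an infinite mixture fittable by Gibbs sampling.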

On Bayesian Analysis of Mixtures with an Unknown Number of Components (with discussion)

New methodology for fully Bayesian mixture analysis is developed, making use of reversible jump Markov chain Monte Carlo methods that are capable of jumping between the parameter subspaces corresponding to different numbers of components in the mixture.

Bayesian analysis of finite mixture distributions using the allocation sampler

Finite mixture distributions are receiving more and more attention from statisticians in many different fields of research because they are a very flexible class of models. They are typically used…

Generalized weighted Chinese restaurant processes for species sampling mixture models

The class of species sampling mixture models is introduced as an extension of semiparametric models based on the Dirichlet process to models based on the general class of species sampling priors…

Hyperparameter estimation in Dirichlet process mixture models

In Bayesian density estimation and prediction using Dirichlet process mixtures of standard, exponential family distributions, the precision or total mass parameter of the mixing Dirichlet process is…
...