Markov chain Monte Carlo in approximate Dirichlet and beta two-parameter process hierarchical models

Hemant Ishwaran and Mahmoud Zarepour
SUMMARY We present some easy-to-construct random probability measures which approximate the Dirichlet process and an extension which we will call the beta two-parameter process. The nature of these constructions makes it simple to implement Markov chain Monte Carlo algorithms for fitting nonparametric hierarchical models and mixtures of nonparametric hierarchical models. For the Dirichlet process, we consider a truncation approximation as well as a weak limit approximation based on a mixture of… 
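To illustrate the flavor of the truncation approximation, an N-atom stick-breaking random measure P_N = Σ p_k δ_{Z_k} approximating DP(αH) can be sampled directly. The sketch below is a minimal illustration in plain Python (function names are my own, not from the paper); the last stick-breaking weight is set to one so that the weights sum exactly to one, as in standard finite stick-breaking truncations.

```python
import random

def truncated_dirichlet_process(alpha, N, base_draw, rng=random):
    """Draw one realization of the N-atom stick-breaking truncation
    P_N = sum_{k=1}^N p_k * delta_{Z_k} of a Dirichlet process DP(alpha, H).

    Weights follow the stick-breaking construction
    V_k ~ Beta(1, alpha), p_k = V_k * prod_{j<k} (1 - V_j),
    with V_N fixed at 1 so the weights sum exactly to one.
    """
    weights, atoms = [], []
    remaining = 1.0  # length of stick not yet broken off
    for k in range(1, N + 1):
        v = 1.0 if k == N else rng.betavariate(1.0, alpha)
        weights.append(remaining * v)
        remaining *= 1.0 - v
        atoms.append(base_draw())  # i.i.d. draw from the base measure H
    return weights, atoms

# Example: base measure H = N(0, 1)
random.seed(1)
w, z = truncated_dirichlet_process(alpha=2.0, N=50,
                                   base_draw=lambda: random.gauss(0.0, 1.0))
```

Larger α spreads mass over more atoms; small α concentrates nearly all mass on a few atoms, mirroring the clustering behavior of the Dirichlet process itself.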


An adaptive truncation method for inference in Bayesian nonparametric models

This paper describes an adaptive truncation method which allows the level of the truncation to be decided by the algorithm and so can avoid large errors in approximating the posterior.

Approximate Dirichlet Process Computing in Finite Normal Mixtures

A rich nonparametric analysis of the finite normal mixture model is obtained by working with a precise truncation approximation of the Dirichlet process. Model fitting is carried out by a simple Gibbs sampling algorithm.

Gibbs Sampling Methods for Stick-Breaking Priors

Two general types of Gibbs samplers for fitting posteriors of Bayesian hierarchical models based on stick-breaking priors are presented, including the blocked Gibbs sampler, which takes an entirely different approach and works by directly sampling values from the posterior of the random measure.

Slice sampling mixture models

A more efficient version of the slice sampler for Dirichlet process mixture models described by Walker is presented; it allows the fitting of infinite mixture models with a wide range of prior specifications and considers priors defined through infinite sequences of independent positive random variables.

Markov switching Dirichlet process mixture regression

Markov switching models can be used to study heterogeneous populations that are observed over time. This paper explores modeling the group characteristics nonparametrically.

Bayesian analysis of random partition models with Laplace distribution

A random partition procedure based on a Dirichlet process prior with a Laplace distribution is proposed; unlike its counterparts, it provides simultaneous partitioning and parameter estimation together with the computation of classification probabilities.


Generalized linear mixed models are widely used for describing overdispersed and correlated data; such data arise frequently in studies involving clustered and hierarchical designs.

Bayesian mixture models (in)consistency for the number of clusters

It is shown that a post-processing algorithm introduced by Guha et al. (2021) for the Dirichlet process extends to more general models and provides a consistent method for estimating the number of components; possible solutions to the inconsistency are also discussed.

Mixture Models With a Prior on the Number of Components

It turns out that many of the essential properties of Dirichlet process mixtures (DPMs) are also exhibited by mixtures of finite mixtures (MFMs), and the MFM analogues are simple enough that they can be used much like the corresponding DPM properties; this simplifies the implementation of MFMs and can substantially improve mixing.

Bayesian Model Selection in Finite Mixtures by Marginal Density Decompositions

A weighted Bayes factor method for consistently estimating the number of components d is developed; it can be implemented by an i.i.d. generalized weighted Chinese restaurant (GWCR) Monte Carlo algorithm, and the performance of the new GWCR model selection procedure is compared with that of the Akaike and Bayes information criteria implemented through an EM algorithm.



Estimating Mixture of Dirichlet Process Models

A conceptual framework for computational strategies is proposed that provides a perspective on current methods, facilitates comparisons between them, and leads to several new methods that expand the scope of MDP models to nonconjugate situations.

Computing Nonparametric Hierarchical Models

This paper illustrates how the strict parametric assumptions common to most standard Bayesian hierarchical models can be relaxed to incorporate uncertainty about functional forms using Dirichlet process components, an approach enabled in part by MCMC computation.

Mixtures of Dirichlet Processes with Applications to Bayesian Nonparametric Problems

This paper extends Ferguson's result to cases where the random measure is a mixing distribution for a parameter which determines the distribution from which observations are made.

Estimating Normal Means With a Conjugate Style Dirichlet Process Prior

The problem of estimating many normal means is approached by means of a hierarchical model. The hierarchical model is the standard conjugate model with one exception: the normal distribution at one stage is replaced by a Dirichlet process prior.

The two-parameter Poisson-Dirichlet distribution derived from a stable subordinator

The two-parameter Poisson-Dirichlet distribution, denoted PD(α,θ), is a probability distribution on the set of decreasing positive sequences with sum 1. The usual Poisson-Dirichlet distribution with a single parameter θ corresponds to PD(0,θ).
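The size-biased (stick-breaking) form of this construction is straightforward to sample: the stick-breaking fractions are V_k ~ Beta(1 − α, θ + kα), and sorting the resulting weights in decreasing order recovers PD(α,θ). The sketch below is a plain-Python illustration (function name is my own), valid for 0 ≤ α < 1 and θ > −α.

```python
import random

def two_parameter_weights(alpha, theta, N, rng=random):
    """First N stick-breaking weights of the two-parameter
    (Pitman-Yor) process: V_k ~ Beta(1 - alpha, theta + k*alpha),
    p_k = V_k * prod_{j<k} (1 - V_j).

    Setting alpha = 0 recovers the one-parameter case PD(0, theta),
    i.e. ordinary Dirichlet process stick-breaking with V_k ~ Beta(1, theta).
    Requires 0 <= alpha < 1 and theta > -alpha.
    """
    weights, remaining = [], 1.0
    for k in range(1, N + 1):
        v = rng.betavariate(1.0 - alpha, theta + k * alpha)
        weights.append(remaining * v)
        remaining *= 1.0 - v
    return weights

random.seed(0)
w = two_parameter_weights(alpha=0.5, theta=1.0, N=200)
```

The partial sums approach 1 only polynomially fast for α > 0, reflecting the heavier tail of the two-parameter process relative to the Dirichlet process.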

Bayesian Density Estimation and Inference Using Mixtures

Abstract: We describe and illustrate Bayesian inference in models for density estimation using mixtures of Dirichlet processes. These models provide natural settings for density estimation.

Applications of Hybrid Monte Carlo to Bayesian Generalized Linear Models: Quasicomplete Separation and Neural Networks

The “leapfrog” hybrid Monte Carlo algorithm is a simple and effective MCMC method for fitting Bayesian generalized linear models with canonical link, having superior performance over conventional methods in difficult problems like logistic regression with quasicomplete separation.
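As a sketch of the leapfrog scheme that underlies hybrid (Hamiltonian) Monte Carlo, the minimal sampler below targets a toy one-dimensional density; the function names and the standard-normal target are my own illustration, not the paper's Bayesian GLM application.

```python
import math
import random

def leapfrog(q, p, grad_U, eps, L):
    """Run one leapfrog trajectory of L steps of size eps for
    position q and momentum p under potential energy U."""
    p -= 0.5 * eps * grad_U(q)          # initial half-step for momentum
    for _ in range(L - 1):
        q += eps * p                    # full position step
        p -= eps * grad_U(q)            # full momentum step
    q += eps * p
    p -= 0.5 * eps * grad_U(q)          # final half-step for momentum
    return q, -p                        # negate momentum for reversibility

def hmc_sample(U, grad_U, q0, n, eps=0.1, L=20, rng=random):
    """Minimal 1-D hybrid Monte Carlo sampler with a Metropolis
    accept/reject step on the total energy H = U(q) + p^2/2."""
    q, samples = q0, []
    for _ in range(n):
        p = rng.gauss(0.0, 1.0)         # resample momentum
        q_new, p_new = leapfrog(q, p, grad_U, eps, L)
        dH = (U(q_new) + 0.5 * p_new ** 2) - (U(q) + 0.5 * p ** 2)
        if math.log(rng.random() + 1e-300) < -dH:
            q = q_new                   # accept the proposed state
        samples.append(q)
    return samples

# Toy target: standard normal, U(q) = q^2/2, grad U(q) = q
random.seed(2)
draws = hmc_sample(lambda q: 0.5 * q * q, lambda q: q, q0=0.0, n=2000)
```

Because the leapfrog integrator is volume-preserving and reversible, the accept/reject step needs only the change in total energy, which is what makes the method simple to apply to models with tractable gradients such as canonical-link GLMs.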


Abstract: The parameter in a Bayesian nonparametric problem is the unknown distribution P of the observation X. A Bayesian uses a prior distribution for P and, after observing X, solves the resulting statistical decision problem.

Bayesian curve fitting using multivariate normal mixtures

SUMMARY Problems of regression smoothing and curve fitting are addressed via predictive inference in a flexible class of mixture models, with multidimensional density estimation carried out using Dirichlet mixtures of multivariate normals.

Computing Posterior Distributions for Covariance Matrices

Difficulties in computing the posterior distribution of a covariance matrix when using nonconjugate priors have been discussed by several authors.