A Survey of Non-Exchangeable Priors for Bayesian Nonparametric Models

@article{Foti2015ASO,
  title={A Survey of Non-Exchangeable Priors for Bayesian Nonparametric Models},
  author={Nicholas J. Foti and Sinead Williamson},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2015},
  volume={37},
  pages={359--371}
}
  • Published 20 November 2012
  • Mathematics, Computer Science
Dependent nonparametric processes extend distributions over measures, such as the Dirichlet process and the beta process, to give distributions over collections of measures, typically indexed by values in some covariate space. Such models are appropriate priors when exchangeability assumptions do not hold, and instead we want our model to vary fluidly with some set of covariates. Since the concept of dependent nonparametric processes was formalized by MacEachern, there have been a number of… 

Citations

Distribution theory for hierarchical processes

TLDR
This paper establishes a distribution theory for hierarchical random measures that are generated via normalization, thus encompassing both the hierarchical Dirichlet and hierarchical Pitman–Yor processes, and provides a probabilistic characterization of the induced (partially exchangeable) partition structure.

Importance conditional sampling for Pitman-Yor mixtures

TLDR
This work proposes a new sampling strategy, named importance conditional sampling (ICS), which combines appealing properties of existing methods, including easy interpretability and a within-iteration parallelizable structure, and shows stable performances for different specifications of the parameters characterizing the Pitman–Yor process.

Posterior sampling from ε-approximation of normalized completely random measure mixtures

TLDR
A Bayesian nonparametric mixture model where the mixing distribution belongs to the wide class of normalized homogeneous completely random measures is adopted, and a truncation method is proposed by discarding the weights of the unnormalized measure smaller than a threshold.
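The ε-truncation idea in the entry above can be illustrated on a list of unnormalized jumps (the values here are hypothetical; the actual construction samples the jumps from a completely random measure):

```python
def epsilon_truncate_and_normalize(jumps, eps):
    """Discard jumps of the unnormalized measure below eps, then
    renormalize the survivors into mixture weights."""
    kept = [j for j in jumps if j >= eps]
    total = sum(kept)
    if total == 0.0:
        raise ValueError("eps discarded every jump; lower the threshold")
    return [j / total for j in kept]

weights = epsilon_truncate_and_normalize([0.8, 0.15, 0.04, 0.009, 0.001], eps=0.01)
# Three jumps survive and are rescaled to sum to one.
```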

Importance conditional sampling for Bayesian nonparametric mixtures

TLDR
A new sampling strategy for nonparametric mixture models based on the Pitman-Yor process, named importance conditional sampling (ICS), which combines appealing properties of existing methods, including easy interpretability and straightforward quantification of posterior uncertainty, is proposed.

Discrete Neural Processes

TLDR
Methods for efficient amortized approximate Bayesian inference over discrete combinatorial spaces, with applications to random permutations, probabilistic clustering and random communities, are developed.

Flexible clustering via hidden hierarchical Dirichlet priors

TLDR
This work investigates a nonparametric prior that arises as the composition of two different discrete random structures and derives a closed-form expression for the induced distribution of the random partition, the fundamental tool regulating the clustering behavior of the model.

Flexible online multivariate regression with variational Bayes and the matrix-variate Dirichlet process

Flexible regression methods where interest centres on the way that the whole distribution of a response vector changes with covariates are very useful in some applications. A recently developed…

Posterior Asymptotics for Boosted Hierarchical Dirichlet Process Mixtures

TLDR
By extending Schwartz’s theory to partially exchangeable sequences, it is shown that posterior contraction rates are crucially affected by the relationship between the sample sizes corresponding to the different groups, which varies according to the smoothness level of the true data distributions.

A priori truncation method for posterior sampling from homogeneous normalized completely random measure mixture models

This paper adopts a Bayesian nonparametric mixture model where the mixing distribution belongs to the wide class of normalized homogeneous completely random measures. We propose a truncation method…

Separate Exchangeability as Modeling Principle in Bayesian Nonparametrics

TLDR
It is argued for the use of separate exchangeability as a modeling principle in Bayesian inference, especially for nonparametric Bayesian models, and that inference under such models in some cases ignores important features of the experimental setup.

References

Showing 1-10 of 88 references

Order-Based Dependent Dirichlet Processes

TLDR
This article allows the nonparametric distribution to depend on covariates through ordering the random variables building the weights in the stick-breaking representation and derives the correlation between distributions at different covariate values.
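The ordering mechanism described above can be sketched as follows: the same stick-breaking variables are reused at every covariate value, but consumed in an order determined by proximity to the query point (the distance-based ordering and the atom-specific covariate locations below are simplifications for illustration):

```python
import random

def order_based_weights(x, atom_locs, sticks):
    """Stick-breaking weights at covariate x: the same stick variables are
    reused everywhere but consumed in order of increasing distance
    |loc - x| (a simplified stand-in for the paper's ordering mechanism;
    `atom_locs` are hypothetical atom-specific covariate locations)."""
    order = sorted(range(len(sticks)), key=lambda k: abs(atom_locs[k] - x))
    weights, remaining = [0.0] * len(sticks), 1.0
    for k in order:
        weights[k] = sticks[k] * remaining
        remaining *= 1.0 - sticks[k]
    return weights

rng = random.Random(1)
locs = [rng.uniform(0.0, 1.0) for _ in range(10)]
sticks = [rng.betavariate(1.0, 2.0) for _ in range(10)]
w_near = order_based_weights(0.10, locs, sticks)
w_far = order_based_weights(0.90, locs, sticks)
# Nearby covariate values consume the sticks in similar orders, so the
# induced random distributions are correlated.
```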

Spatial Normalized Gamma Processes

TLDR
A simple and general framework to construct dependent DPs by marginalizing and normalizing a single gamma process over an extended space is proposed and an empirical study of convergence on a synthetic dataset is reported.
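A crude sketch of the marginalize-and-normalize construction above: gamma-process jumps live on an extended space (represented here by an auxiliary interval per jump, a simplification of the paper's construction), and the random measure at a covariate value normalizes the jumps whose interval covers it:

```python
import random

def weights_at(t, jumps):
    """jumps: (mass, lo, hi) triples from one simulated gamma process on an
    extended space. The measure at covariate t keeps the jumps whose
    auxiliary interval [lo, hi] covers t, then normalizes; jumps shared
    across nearby t values induce dependence between the measures."""
    active = [m for (m, lo, hi) in jumps if lo <= t <= hi]
    total = sum(active)
    return [m / total for m in active] if total > 0.0 else []

rng = random.Random(2)
jumps = []
for _ in range(50):
    lo = rng.uniform(0.0, 1.0)
    jumps.append((rng.gammavariate(0.5, 1.0), lo, lo + rng.uniform(0.0, 1.0)))

w = weights_at(0.5, jumps)
```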

A method for combining inference across related nonparametric Bayesian models

TLDR
A Markov chain Monte Carlo scheme is developed to allow efficient implementation of full posterior inference in the given model and includes a regression at the level of the nonparametric model.

Models Beyond the Dirichlet Process

Bayesian nonparametric inference is a relatively young area of research and it has recently undergone a strong development. Most of its success can be explained by the considerable degree of…

A Constructive Definition of Dirichlet Priors

Abstract: The parameter in a Bayesian nonparametric problem is the unknown distribution P of the observation X. A Bayesian uses a prior distribution for P, and after observing X, solves the…

Bayesian density regression

TLDR
The paper considers Bayesian methods for density regression, allowing a random probability distribution to change flexibly with multiple predictors, and proposes a kernel‐based weighting scheme that incorporates weights that are dependent on the distance between subjects’ predictor values.
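The kernel-based weighting idea in the entry above can be sketched directly: each subject receives a weight at predictor value x that decays with the distance to that subject's own predictor value (the Gaussian kernel and bandwidth below are illustrative choices, not the paper's exact specification):

```python
import math

def kernel_weights(x, subject_xs, bandwidth=0.25):
    """Normalized weights favoring subjects whose predictor values lie
    near x; a Gaussian kernel is one common (illustrative) choice."""
    raw = [math.exp(-0.5 * ((x - xi) / bandwidth) ** 2) for xi in subject_xs]
    total = sum(raw)
    return [r / total for r in raw]

w = kernel_weights(0.0, [-0.1, 0.05, 1.5])
# The two nearby subjects dominate; the distant subject's weight is tiny.
```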

Bayesian inference with dependent normalized completely random measures

TLDR
This paper introduces a flexible class of dependent nonparametric priors, investigates their properties and derives a suitable sampling scheme which allows their concrete implementation, and develops a Markov Chain Monte Carlo algorithm for drawing posterior inferences.

Bayesian Nonparametrics: Hierarchical Bayesian nonparametric models with applications

TLDR
The role of hierarchical modeling in Bayesian nonparametrics is discussed, focusing on models in which the infinite-dimensional parameters are treated hierarchically, and the value of these hierarchical constructions is demonstrated in a wide range of practical applications.

Construction of Dependent Dirichlet Processes based on Poisson Processes

TLDR
A novel method that exploits the intrinsic relationship between Dirichlet and Poisson processes in order to create a Markov chain of Dirichlets suitable for use as a prior over evolving mixture models is presented.
...