Sampling Truncated Normal, Beta, and Gamma Densities

@article{Damien2001SamplingTN,
  title={Sampling Truncated Normal, Beta, and Gamma Densities},
  author={Paul Damien and Stephen G. Walker},
  journal={Journal of Computational and Graphical Statistics},
  year={2001},
  volume={10},
  pages={206--215}
}
  • P. Damien, S. Walker
  • Published 1 June 2001
  • Computer Science, Mathematics
  • Journal of Computational and Graphical Statistics
We consider the Bayesian analysis of constrained parameter and truncated data problems within a Gibbs sampling framework and concentrate on sampling truncated densities that arise as full conditional densities within the context of the Gibbs sampler. In particular, we restrict attention to the normal, beta, and gamma densities. We demonstrate that, in many instances, it is possible to introduce a latent variable which facilitates an easy solution to the problem. We also discuss a novel approach… 
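
The latent-variable device mentioned in the abstract can be illustrated with a small sketch. The Python code below is a minimal illustration in the spirit of that idea, not necessarily the paper's exact construction (the function name and the uniform slice update are mine): it samples a standard normal truncated to a finite interval (a, b) by introducing an auxiliary variable u whose joint density with x is proportional to I(0 < u < exp(-x^2/2)) I(a < x < b), so that both full conditionals in the Gibbs sweep are uniform draws.

```python
import numpy as np

def truncnorm_latent_gibbs(a, b, n_iter=10_000, rng=None):
    """Gibbs sampler for a standard normal truncated to a finite interval (a, b),
    using one auxiliary (latent) variable so that every update is a uniform draw.

    Joint density: f(x, u) proportional to I(0 < u < exp(-x^2/2)) * I(a < x < b);
    marginalising over u recovers the truncated normal for x.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = 0.5 * (a + b)                 # any starting point inside (a, b) works
    out = np.empty(n_iter)
    for i in range(n_iter):
        # u | x  ~  Uniform(0, exp(-x^2 / 2))
        u = rng.uniform(0.0, np.exp(-0.5 * x * x))
        # x | u  ~  Uniform on {x : x^2 < -2 log u} intersected with (a, b)
        r = np.sqrt(-2.0 * np.log(u))
        x = rng.uniform(max(a, -r), min(b, r))
        out[i] = x
    return out

# Example: standard normal truncated to (1, 2)
samples = truncnorm_latent_gibbs(1.0, 2.0)
```

Because every conditional is a uniform draw, no rejection step or special-function evaluation is needed, which is the appeal of the latent-variable route.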

Sampling from a multivariate Gaussian distribution truncated on a simplex: A review

This paper reviews recent Monte Carlo methods for sampling from multivariate Gaussian distributions restricted to the standard simplex and describes and analyzes two Hamiltonian Monte Carlo methods.

The Soft Multivariate Truncated Normal Distribution with Applications to Bayesian Constrained Estimation

The soft tMVN distribution can be used to approximate simulations from a multivariate truncated normal distribution with linear constraints, or itself as a prior in shape-constrained problems.
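
Roughly (the notation and the exact form of the smoothing function are my reading of this description, not a quotation of the paper), the "soft" construction replaces the hard indicator of the linear constraint region by a steep sigmoid factor:

```latex
% Hard truncation to {x : a_j^T x >= b_j, j = 1,...,m} vs. its soft relaxation
p_{\mathrm{hard}}(x) \;\propto\; \mathcal{N}(x \mid \mu, \Sigma)\,\prod_{j=1}^{m} \mathbf{1}\{a_j^{\top}x \ge b_j\},
\qquad
p_{\mathrm{soft}}(x) \;\propto\; \mathcal{N}(x \mid \mu, \Sigma)\,\prod_{j=1}^{m} \sigma\!\left(\eta\,(a_j^{\top}x - b_j)\right),
\qquad \sigma(t) = \frac{1}{1 + e^{-t}}.
```

Here eta > 0 controls the sharpness of the approximation; as eta grows, the soft density concentrates on the constrained region.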

Optimal direction Gibbs sampler for truncated multivariate normal distributions

The authors consider optimal directions to be those that minimize the Kullback–Leibler divergence between two Markov chain Monte Carlo steps, and propose two distributions over directions for a multivariate normal objective function.

Gibbs sampling method for the Bayesian adaptive elastic net

A Gibbs sampling algorithm for the adaptive elastic net estimator for regularized mean regression is developed from a Bayesian perspective, and it is shown that the mixture representation yields a Gibbs sampler that can be implemented by sampling from either a truncated normal or a truncated gamma distribution.
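
For reference, univariate truncated normal and truncated gamma draws of the kind required by such samplers can be generated by inversion whenever an accurate CDF/quantile pair is available. The sketch below uses scipy and is a generic illustration, not the algorithm of the paper:

```python
import numpy as np
from scipy import stats

def sample_truncated(dist, lo, hi, size=1, rng=None):
    """Inverse-CDF sampling from a frozen scipy.stats distribution `dist`
    truncated to the interval (lo, hi)."""
    rng = np.random.default_rng() if rng is None else rng
    p_lo, p_hi = dist.cdf(lo), dist.cdf(hi)
    u = rng.uniform(p_lo, p_hi, size=size)  # uniform on the CDF image of (lo, hi)
    return dist.ppf(u)                      # map back through the quantile function

# A truncated standard normal on (0, 1) and a truncated gamma(shape=2) on (3, inf)
x = sample_truncated(stats.norm(), 0.0, 1.0, size=5)
y = sample_truncated(stats.gamma(a=2.0), 3.0, np.inf, size=5)
```

Inversion loses accuracy far in the tails, where rejection-type schemes such as those discussed elsewhere on this page are preferable.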

Efficient sampling methods for truncated multivariate normal and Student-t distributions subject to linear inequality constraints

Sampling from a truncated multivariate distribution subject to multiple linear inequality constraints is a recurring problem in many areas of statistics and econometrics, such as the order-restricted…
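
A standard building block for this setting is coordinate-wise Gibbs sampling, in which each component is drawn from its univariate truncated normal full conditional. The sketch below handles box constraints, a special case of linear inequality constraints, and illustrates the general idea rather than the specific method proposed in the paper:

```python
import numpy as np
from scipy import stats

def gibbs_truncated_mvn(mu, Sigma, lower, upper, n_iter=5_000, rng=None):
    """Coordinate-wise Gibbs sampling of N(mu, Sigma) truncated to the box
    lower <= x <= upper, drawing each coordinate from its univariate
    truncated-normal full conditional."""
    rng = np.random.default_rng() if rng is None else rng
    d = len(mu)
    P = np.linalg.inv(Sigma)                               # precision matrix
    x = np.clip(np.array(mu, dtype=float), lower, upper)   # start inside the box
    out = np.empty((n_iter, d))
    for it in range(n_iter):
        for i in range(d):
            others = np.delete(np.arange(d), i)
            var_i = 1.0 / P[i, i]                          # conditional variance
            mean_i = mu[i] - var_i * P[i, others] @ (x[others] - mu[others])
            sd_i = np.sqrt(var_i)
            a = (lower[i] - mean_i) / sd_i                 # standardised bounds
            b = (upper[i] - mean_i) / sd_i
            x[i] = stats.truncnorm.rvs(a, b, loc=mean_i, scale=sd_i,
                                       random_state=rng)
        out[it] = x
    return out

# Bivariate example: a correlated normal truncated to the unit square
mu = np.array([0.0, 0.0])
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])
draws = gibbs_truncated_mvn(mu, Sigma, np.zeros(2), np.ones(2))
```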

Sampling Some Truncated Distributions Via Rejection Algorithms

Rejection sampling algorithms to sample from some truncated and tail distributions, including multivariate normal distributions truncated to certain sets, are developed.
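
In its most naive form (a generic sketch, not one of the tailored algorithms developed in the paper), rejection simply draws from the untruncated distribution and keeps the draws that fall in the truncation set, which is workable only when that set has non-negligible probability:

```python
import numpy as np

def rejection_truncated(sampler, in_region, size, rng=None, max_tries=1_000_000):
    """Naive rejection sampling: draw from the untruncated `sampler` and keep
    the draws for which `in_region` is True. Efficiency equals P(in_region)."""
    rng = np.random.default_rng() if rng is None else rng
    kept = []
    for _ in range(max_tries):
        x = sampler(rng)
        if in_region(x):
            kept.append(x)
            if len(kept) == size:
                return np.array(kept)
    raise RuntimeError("acceptance rate too low; use a tailored proposal instead")

# Bivariate standard normal truncated to the positive orthant
draws = rejection_truncated(
    sampler=lambda rng: rng.multivariate_normal([0.0, 0.0], np.eye(2)),
    in_region=lambda x: bool(np.all(x >= 0.0)),
    size=1000,
)
```

When the acceptance probability collapses, as it does for tail regions, purpose-built proposals are needed; one such scheme is sketched at the end of the reference list.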

Univariate Bayesian nonparametric mixture modeling with unimodal kernels

The intention is to replace the normal with a family of univariate distribution functions for which the only constraint is unimodality, and to devise a new family of nonparametric unimodal distributions with large support over the space of unidimensional unimodal distributions.

Efficient Bayesian shape-restricted function estimation with constrained Gaussian process priors

This article revisits the problem of Bayesian shape-restricted inference in the light of a recently developed approximate Gaussian process that admits an equivalent formulation of the shape…

...

References

Showing 1-10 of 14 references

Gibbs sampling for Bayesian non‐conjugate and hierarchical models by using auxiliary variables

The aim of the paper is to provide an alternative sampling algorithm to rejection‐based methods and other sampling approaches such as the Metropolis–Hastings algorithm.

Monte Carlo methods for approximating a posterior hazard rate process

Here it is shown how a full Bayesian posterior computation is made possible by novel Monte Carlo methods that approximate random increments of the posterior process.

Bayesian Analysis of Constrained Parameter and Truncated Data Problems

This paper illustrates how the Gibbs sampler approach to Bayesian calculation avoids these difficulties and leads to straightforwardly implemented procedures, even for apparently very complicated model forms.

Markov Chains for Exploring Posterior Distributions

Several Markov chain methods are available for sampling from a posterior distribution. Two important examples are the Gibbs sampler and the Metropolis algorithm. In addition, several strategies are…

Bayesian Analysis of Linear and Non‐Linear Population Models by Using the Gibbs Sampler

A fully Bayesian analysis of linear and nonlinear population models has previously been unavailable, as a consequence of the seeming impossibility of performing the necessary numerical…

Auxiliary Variable Methods for Markov Chain Monte Carlo with Applications

Two applications in Bayesian image analysis are considered: a binary classification problem in which partial decoupling outperforms Swendsen-Wang and single-site Metropolis methods, and a positron emission tomography reconstruction that uses the gray level prior of Geman and McClure.

Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods (with discussion)

The use of the Gibbs sampler for Bayesian computation is reviewed and illustrated in the context of some canonical examples. Other Markov chain Monte Carlo simulation methods are also briefly…

Convergence of Markov chain Monte Carlo algorithms with applications to image restoration

Markov chain Monte Carlo algorithms, such as the Gibbs sampler and Metropolis-Hastings algorithm, are widely used in statistics, computer science, chemistry and physics for exploring complicated…

Non-Uniform Random Variate Generation

This chapter reviews the main methods for generating random variables, vectors and processes in non-uniform random variate generation, and provides information on the expected time complexity of various algorithms before addressing modern topics such as indirectly specified distributions, random processes, and Markov chain methods.

Simulation of truncated normal variables

We provide simulation algorithms for one-sided and two-sided truncated normal distributions. These algorithms are then used to simulate multivariate normal variables with convex restricted parameter…
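
As an illustration of the one-sided case, the following sketch uses a translated-exponential proposal for the upper tail of a standard normal, in the spirit of the algorithm in this reference (the proposal rate below is the usual near-optimal choice; consult the paper for its exact statement):

```python
import numpy as np

def truncnorm_upper_tail(a, size=1, rng=None):
    """Rejection sampler for a standard normal truncated to [a, inf), a > 0,
    using a translated-exponential proposal."""
    rng = np.random.default_rng() if rng is None else rng
    alpha = 0.5 * (a + np.sqrt(a * a + 4.0))      # near-optimal proposal rate
    out = np.empty(size)
    n = 0
    while n < size:
        z = a + rng.exponential(1.0 / alpha)      # proposal: a + Exp(alpha)
        if rng.uniform() <= np.exp(-0.5 * (z - alpha) ** 2):   # accept / reject
            out[n] = z
            n += 1
    return out

# Draws from N(0, 1) truncated to [3, inf)
tail_draws = truncnorm_upper_tail(3.0, size=1000)
```

The acceptance rate of this proposal stays high even far in the tail, unlike naive rejection against the untruncated normal.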