Adaptive Rejection Metropolis Sampling Within Gibbs Sampling

@article{Gilks1995AdaptiveRM,
  title={Adaptive Rejection Metropolis Sampling Within Gibbs Sampling},
  author={Walter R. Gilks and Nicky Best and K. K. C. Tan},
  journal={Journal of the Royal Statistical Society: Series C (Applied Statistics)},
  year={1995},
  volume={44},
  pages={455-472}
}
  • W. Gilks, N. Best, K. Tan
  • Published 1 December 1995
  • Computer Science
  • Journal of the Royal Statistical Society: Series C (Applied Statistics)
Gibbs sampling is a powerful technique for statistical inference. It involves little more than sampling from full conditional distributions, which can be both complex and computationally expensive to evaluate. Gilks and Wild have shown that in practice full conditionals are often log‐concave, and they proposed a method of adaptive rejection sampling for efficiently sampling from univariate log‐concave distributions. In this paper, to deal with non‐log‐concave full conditional distributions, we… 
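
As a rough illustration of where the Metropolis correction enters, the sketch below implements the accept/reject structure of an ARMS-style update for a single coordinate, using a fixed Gaussian proposal as a stand-in for the adaptively built piecewise envelope. Everything here (function names, the toy bimodal conditional, the envelope choice) is an illustrative assumption rather than the authors' construction; in the actual method the envelope is refined from rejected points, and the Metropolis step only bites where the envelope falls below the target.

```python
import numpy as np

rng = np.random.default_rng(0)

def arms_style_step(x_cur, log_f, log_h, sample_h, max_tries=100):
    """One hypothetical ARMS-style update for a single coordinate.

    log_f    : log of the (unnormalized) full conditional
    log_h    : log of the envelope; fixed here, whereas ARMS builds it
               adaptively from a piecewise-linear hull of log_f
    sample_h : draws a candidate from the envelope distribution
    """
    x_new = x_cur
    for _ in range(max_tries):
        x_new = sample_h()
        # Rejection step against the envelope (squeezing test omitted).
        if np.log(rng.uniform()) < min(0.0, log_f(x_new) - log_h(x_new)):
            break
    # Metropolis correction for regions where the envelope dips below log_f.
    num = log_f(x_new) + min(log_f(x_cur), log_h(x_cur))
    den = log_f(x_cur) + min(log_f(x_new), log_h(x_new))
    return x_new if np.log(rng.uniform()) < min(0.0, num - den) else x_cur

# Toy non-log-concave "full conditional": a two-component normal mixture,
# with a wide N(0, 3^2) envelope standing in for the adaptive one.
log_f = lambda x: np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)
log_h = lambda x: -0.5 * (x / 3.0) ** 2
sample_h = lambda: rng.normal(0.0, 3.0)

x, draws = 0.0, []
for _ in range(5000):
    x = arms_style_step(x, log_f, log_h, sample_h)
    draws.append(x)
```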

Independent Doubly Adaptive Rejection Metropolis Sampling Within Gibbs Sampling

An alternative adaptive MCMC algorithm (IA2RMS) is proposed that overcomes an important drawback of the Adaptive Rejection Metropolis Sampling technique: it speeds up the convergence of the chain to the target and simplifies the construction of the sequence of proposals, thus reducing the computational cost of the entire algorithm.

The Recycling Gibbs sampler for efficient learning

Markov chain Monte Carlo methods in biostatistics

Concerns with implementation should not deter the biostatistician from using MCMC methods, but rather help to ensure wise use of these powerful techniques.

A method for efficiently sampling from distributions with correlated dimensions.

It is demonstrated in a simulation study that the performance of the DE-MCMC algorithm is unaffected by the correlation of the target distribution, whereas conventional MCMC performs substantially worse as the correlation increases, and it is shown that the DE-MCMC algorithm can be used to efficiently fit a hierarchical version of the linear ballistic accumulator model to response time data.
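
The mechanism behind DE-MCMC is easy to sketch: several chains run in parallel, and each chain proposes a move along the difference of two other randomly chosen chains, so the proposal automatically picks up the scale and correlation of the target. The code below is a minimal generic version; the names, tuning constants, and toy target are assumptions, not the hierarchical LBA application summarized above.

```python
import numpy as np

rng = np.random.default_rng(1)

def de_mcmc(log_target, n_chains=10, dim=2, n_iter=5000, gamma=None, eps=1e-4):
    """Differential Evolution MCMC sketch: each chain jumps along the
    difference of two other chains, adapting to scale and correlation."""
    gamma = gamma if gamma is not None else 2.38 / np.sqrt(2 * dim)
    x = rng.normal(size=(n_chains, dim))               # initial population
    logp = np.array([log_target(xi) for xi in x])
    samples = []
    for _ in range(n_iter):
        for i in range(n_chains):
            r1, r2 = rng.choice([j for j in range(n_chains) if j != i],
                                size=2, replace=False)
            prop = x[i] + gamma * (x[r1] - x[r2]) + eps * rng.normal(size=dim)
            logp_prop = log_target(prop)
            if np.log(rng.uniform()) < logp_prop - logp[i]:   # Metropolis accept
                x[i], logp[i] = prop, logp_prop
        samples.append(x.copy())
    return np.concatenate(samples)

# Toy target: a strongly correlated bivariate normal.
cov = np.array([[1.0, 0.95], [0.95, 1.0]])
prec = np.linalg.inv(cov)
log_target = lambda z: -0.5 * z @ prec @ z

draws = de_mcmc(log_target)
```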

A Metropolis within Gibbs sampling in relative survival

Relative survival analysis is a method which provides an estimate of the effect on survival corrected for the effect of other independent causes of death, using the natural mortality in the …

Estimation of a Generalized Linear Mixed‐Effects Model with a Finite‐Support Random‐Effects Distribution via Gibbs Sampling

We discuss a Bayesian hierarchical generalized linear mixed-effects model with a finite-support random-effects distribution and show how Gibbs sampling can be used for estimating the posterior …

A fast universal self-tuned sampler within Gibbs sampling

The Hastings algorithm at fifty

The majority of algorithms used in practice today involve the Hastings algorithm, which generalizes the Metropolis algorithm to allow a much broader class of proposal distributions instead of just symmetric cases.
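
Concretely, the generalization is the proposal-density correction in the acceptance ratio, which cancels for symmetric proposals and recovers the original Metropolis rule. The sketch below shows a single Metropolis-Hastings update with an asymmetric log-normal random-walk proposal; the names and the toy exponential target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def mh_step(x, log_target, propose, log_q):
    """One Metropolis-Hastings update.  log_q(a, b) is the log density of
    proposing a from state b; the two log_q terms are the Hastings
    correction and cancel when the proposal is symmetric."""
    y = propose(x)
    log_alpha = (log_target(y) - log_target(x)
                 + log_q(x, y) - log_q(y, x))      # q(x | y) / q(y | x)
    return y if np.log(rng.uniform()) < min(0.0, log_alpha) else x

# Toy example: a log-normal random-walk proposal (asymmetric) targeting an
# Exponential(1) distribution on x > 0.
log_target = lambda x: -x
propose = lambda x: x * np.exp(0.5 * rng.normal())
log_q = lambda a, b: -np.log(a) - (np.log(a) - np.log(b)) ** 2 / (2 * 0.5 ** 2)

x, chain = 1.0, []
for _ in range(10000):
    x = mh_step(x, log_target, propose, log_q)
    chain.append(x)
```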

The Recycling Gibbs sampler for efficient and fast learning

This work points out the relationship between the Gibbs sampler and the chain rule used for sampling purposes, and shows that auxiliary samples can be employed within Gibbs estimators, improving their efficiency at no extra cost.
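
As one reading of that idea, the sketch below runs a plain two-coordinate Gibbs sampler for a correlated bivariate normal and keeps the state after every coordinate update, not just after full sweeps; the extra "recycled" points are already produced by the sampler, so the enlarged estimator costs nothing more to generate. This is only a minimal illustration under that interpretation, not the authors' general estimator.

```python
import numpy as np

rng = np.random.default_rng(3)

# Bivariate normal with correlation rho: both full conditionals are
# univariate normals, so exact Gibbs updates are available.
rho = 0.8

def gibbs_recycling(n_sweeps=5000):
    x1, x2 = 0.0, 0.0
    standard, recycled = [], []
    for _ in range(n_sweeps):
        # Update coordinate 1 and keep ("recycle") the intermediate state.
        x1 = rng.normal(rho * x2, np.sqrt(1 - rho ** 2))
        recycled.append((x1, x2))
        # Update coordinate 2; this completes the sweep.
        x2 = rng.normal(rho * x1, np.sqrt(1 - rho ** 2))
        recycled.append((x1, x2))
        standard.append((x1, x2))      # the usual estimator keeps only this
    return np.array(standard), np.array(recycled)

standard, recycled = gibbs_recycling()
# Both estimators target E[x1] = 0; the recycled one averages twice as many
# (correlated) points per sweep at no extra sampling cost.
print(standard[:, 0].mean(), recycled[:, 0].mean())
```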
...

References

SHOWING 1-10 OF 23 REFERENCES

Inference from Iterative Simulation Using Multiple Sequences

The focus is on applied inference for Bayesian posterior distributions in real problems, which often tend toward normality after transformations and marginalization, and the results are derived as normal-theory approximations to exact Bayesian inference, conditional on the observed simulations.
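
The multiple-sequence diagnostic described here compares between-chain and within-chain variability. A minimal version of the potential scale reduction factor, without later refinements such as chain splitting or rank normalization, might look like the following; function and variable names are illustrative.

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat for one scalar quantity.

    chains : array of shape (m, n) holding m parallel sequences of length n,
             ideally started from over-dispersed initial values.
    """
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)              # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()        # within-chain variance
    var_hat = (n - 1) / n * W + B / n            # pooled variance estimate
    return np.sqrt(var_hat / W)                  # ~1 once the chains have mixed

# Example: four well-mixed chains should give R-hat close to 1.
rng = np.random.default_rng(4)
print(gelman_rubin(rng.normal(size=(4, 2000))))
```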

Monte Carlo Sampling Methods Using Markov Chains and Their Applications

SUMMARY A generalization of the sampling method introduced by Metropolis et al. (1953) is presented along with an exposition of the relevant theory, techniques of application and methods and …

A maximum likelihood estimation method for random coefficient regression models

SUMMARY A method for estimating the distribution of the parameters of a random coefficient regression model is proposed. This distribution, accounting for interindividual variability, is assumed to …

Adaptive Rejection Sampling for Gibbs Sampling

SUMMARY We propose a method for rejection sampling from any univariate log-concave probability density function. The method is adaptive: as sampling proceeds, the rejection envelope and the squeezing …
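
The geometry behind the method is that, for a log-concave density, tangent lines at a set of abscissae bound log f from above (the rejection envelope) while chords between adjacent abscissae bound it from below (the squeezing function). The sketch below only evaluates these two bounds for a toy normal density; sampling from the piecewise-exponential envelope and the adaptive refinement of the abscissae are omitted, and all names are illustrative.

```python
import numpy as np

def upper_hull(x, points, logf, dlogf):
    """Tangent-line upper bound on log f(x) for a log-concave f:
    the minimum over tangents at the current abscissae."""
    return np.min([logf(p) + dlogf(p) * (x - p) for p in points])

def squeeze(x, points, logf):
    """Chord lower bound (the 'squeezing' function) between adjacent
    abscissae; -inf outside the span of the abscissae."""
    points = np.sort(points)
    for a, b in zip(points[:-1], points[1:]):
        if a <= x <= b:
            t = (x - a) / (b - a)
            return (1 - t) * logf(a) + t * logf(b)
    return -np.inf

# Example with a standard normal log density (log-concave).
logf = lambda x: -0.5 * x ** 2
dlogf = lambda x: -x
points = [-2.0, 0.5, 1.5]          # current abscissae

x = 0.9
assert squeeze(x, points, logf) <= logf(x) <= upper_hull(x, points, logf, dlogf)
```

In the full algorithm, a candidate drawn from the envelope is first tested against the cheap squeeze before log f is evaluated, and any point requiring a log f evaluation is added to the abscissae, tightening both bounds as sampling proceeds.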

Spatial Statistics and Bayesian Computation

The early development of MCMC in Bayesian inference is traced, some recent computational progress in statistical physics, based on the introduction of auxiliary variables, is reviewed, and its current and future relevance in Bayesian applications is discussed.

Population pharmacokinetic data and parameter estimation based on their first two statistical moments.

  • S. Beal
  • Biology
    Drug metabolism reviews
  • 1984
An investigation of the effect of this linearization is reported; from the limited evidence in this investigation it appears that the linearization per se does not significantly adversely affect the estimates.

Exploring Posterior Distributions Using Markov Chains

Abstract : Several Markov chain-based methods are available for sampling from a posterior distribution. Two important examples are the Gibbs sampler and the Metropolis algorithm. In addition, several

Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images

  • S. Geman, D. Geman
  • Physics
    IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 1984
An analogy between images and statistical mechanics systems is made, and the analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, giving rise to a highly parallel "relaxation" algorithm for MAP estimation.
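
A minimal sketch of the stochastic-relaxation idea, assuming a binary (+1/-1) image, an Ising-type smoothing prior, and Gaussian observation noise: each sweep resamples every pixel from its full conditional given its neighbours and its noisy observation. Parameter values and names are illustrative, and the annealing schedule Geman and Geman use to drive the chain toward the MAP estimate is omitted.

```python
import numpy as np

rng = np.random.default_rng(5)

def gibbs_sweep(img, noisy, beta=1.5, noise_sd=0.8):
    """One stochastic-relaxation sweep: each +/-1 pixel is resampled from
    its full conditional given its four neighbours (Ising smoothing prior)
    and the corresponding noisy observation (Gaussian likelihood)."""
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            nb = sum(img[a, b]
                     for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                     if 0 <= a < H and 0 <= b < W)
            # log-odds of pixel = +1 versus -1 under prior times likelihood
            logit = 2 * beta * nb + 2 * noisy[i, j] / noise_sd ** 2
            p_plus = 1.0 / (1.0 + np.exp(-logit))
            img[i, j] = 1 if rng.uniform() < p_plus else -1
    return img

# Toy restoration: a half-black / half-white image observed in noise.
truth = np.ones((20, 20)); truth[:, :10] = -1
noisy = truth + rng.normal(0, 0.8, truth.shape)
img = np.where(noisy > 0, 1, -1)          # crude initial estimate
for _ in range(20):
    img = gibbs_sweep(img, noisy)
```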

Bayesian Inference for Generalized Linear and Proportional Hazards Models Via Gibbs Sampling

It is shown that Gibbs sampling, making systematic use of an adaptive rejection algorithm proposed by Gilks and Wild, provides a straightforward computational procedure for Bayesian inferences in a …

Explaining the Gibbs Sampler

A simple explanation of how and why the Gibbs sampler works is given, its properties are established analytically in a simple case, and insight is provided for more complicated cases.
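
In the same spirit, here is a minimal two-variable Gibbs sampler for a beta-binomial pair, where both full conditionals are standard distributions. The constants and names are illustrative and may differ from the worked example in that paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Two-variable Gibbs sampler for the joint distribution of (X, theta) with
# X | theta ~ Binomial(n, theta) and theta | X ~ Beta(X + a, n - X + b).
# Alternating between the two full conditionals yields draws whose
# X-marginal is the beta-binomial distribution.
n, a, b = 16, 2.0, 4.0
theta, xs = 0.5, []
for _ in range(20000):
    x = rng.binomial(n, theta)                 # sample X | theta
    theta = rng.beta(x + a, n - x + b)         # sample theta | X
    xs.append(x)

# The empirical mean of X should be close to n * a / (a + b) = 16 / 3.
print(np.mean(xs))
```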