A Gibbs Sampler for a Class of Random Convex Polytopes

@article{Jacob2021AGS,
  title={A Gibbs Sampler for a Class of Random Convex Polytopes},
  author={Pierre E. Jacob and Ruobin Gong and Paul Thatcher Edlefsen and Arthur P. Dempster},
  journal={Journal of the American Statistical Association},
  year={2021},
  volume={116},
  pages={1181--1192}
}
Abstract: We present a Gibbs sampler for the Dempster–Shafer (DS) approach to statistical inference for categorical distributions. The DS framework extends the Bayesian approach; in particular, it allows the use of partial prior information and yields three-valued uncertainty assessments representing probabilities "for," "against," and "don't know" about formal assertions of interest. The proposed algorithm targets the distribution of a class of random convex polytopes which encapsulate the DS…
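The paper's Gibbs sampler itself is not reproduced here. As a rough illustration of how the three-valued DS output ("for," "against," "don't know") can be read off Monte Carlo draws of random convex polytopes in the probability simplex, below is a minimal Python sketch. The function names `sample_polytope` and `assess` are hypothetical, and the Dirichlet-based placeholder stands in for the paper's actual polytope sampler, which is treated as a black box.

```python
# Minimal sketch (not the paper's implementation): estimating the DS
# three-valued assessment ("for", "against", "don't know") from Monte Carlo
# draws of random convex polytopes in the probability simplex.
import numpy as np

rng = np.random.default_rng(0)

def sample_polytope(num_vertices=4, dim=3):
    """Placeholder: return a random convex polytope as an array of vertices
    in the (dim-1)-simplex. In the paper, such draws would come from the
    proposed Gibbs sampler conditioned on the observed categorical counts;
    here they are simply independent Dirichlet points, for illustration."""
    return rng.dirichlet(np.ones(dim), size=num_vertices)

def assess(assertion, polytopes):
    """Estimate (p_for, p_against, p_dontknow) for an assertion about the
    category probabilities theta. A polytope counts 'for' if the assertion
    holds at every vertex, 'against' if it holds at no vertex, and
    'don't know' otherwise."""
    p_for = p_against = p_dontknow = 0.0
    for vertices in polytopes:
        holds = np.array([assertion(theta) for theta in vertices])
        if holds.all():
            p_for += 1.0
        elif not holds.any():
            p_against += 1.0
        else:
            p_dontknow += 1.0
    n = len(polytopes)
    return p_for / n, p_against / n, p_dontknow / n

if __name__ == "__main__":
    draws = [sample_polytope() for _ in range(10_000)]
    # Example assertion: "the first category probability exceeds 1/2".
    print(assess(lambda theta: theta[0] > 0.5, draws))
```

Checking the assertion only at the vertices is exact for half-space assertions such as the example above, since a convex polytope lies entirely in a half-space if and only if all of its vertices do; for more general assertions this is an illustrative simplification.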
Comment on “A Gibbs Sampler for a Class of Random Convex Polytopes”
Discussion of “A Gibbs Sampler for a Class of Random Convex Polytopes”
  • P. Diaconis, Guanyang Wang
  • Mathematics
  • Journal of the American Statistical Association
  • 2021
The paper offers answers to these questions by proposing a Gibbs sampler to perform statistical inference for categorical distributions using the Dempster–Shafer approach. To be precise, let x = (xi)…
Rejoinder: Let’s Be Imprecise in Order to Be Precise (About What We Don’t Know)
Preparing a rejoinder is a typically rewarding, sometimes depressing, and occasionally frustrating experience. The rewarding part is self-evident, and the depression sets in when a discussant has…
