Stochastic approximation cut algorithm for inference in modularized Bayesian models

@article{Liu2022StochasticAC,
  title={Stochastic approximation cut algorithm for inference in modularized Bayesian models},
  author={Yang Liu and Robert J. B. Goudie},
  journal={Stat. Comput.},
  year={2022},
  volume={32},
  pages={7}
}
  • Y. Liu, R. Goudie
  • Published 2 June 2020
  • Computer Science, Mathematics
  • Stat. Comput.
Bayesian modelling enables us to accommodate complex forms of data and make comprehensive inferences, but the effect of partial misspecification of the model is a concern. One approach in this setting is to modularize the model and prevent feedback from suspect modules, using a cut model. After observing data, this leads to the cut distribution, which normally does not have a closed form. Previous studies have proposed algorithms to sample from this distribution, but these algorithms have…
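
For context, the object being sampled has a standard two-module form in the cut-model literature (notation assumed here, not taken verbatim from the paper): module one has parameter φ with data Z, module two has parameter θ with data Y, and the cut distribution removes the feedback from Y into φ:

    % Full Bayes:  p(phi, theta | Y, Z) ∝ p(Z | phi) p(Y | theta, phi) p(phi) p(theta)
    % Cut model:   Y is not allowed to inform phi
    p_{\mathrm{cut}}(\varphi, \theta \mid Y, Z)
        = p(\varphi \mid Z)\, p(\theta \mid \varphi, Y),
    \qquad
    p(\varphi \mid Z) \propto p(Z \mid \varphi)\, p(\varphi).

Evaluating p(θ | φ, Y) requires a normalizing term that depends on φ, so the cut distribution is doubly intractable in general; several of the references below address exactly this difficulty.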

Citations

Asymptotics of cut distributions and robust modular inference using Posterior Bootstrap
Bayesian inference provides a framework to combine an arbitrary number of model components with shared parameters, allowing joint uncertainty estimation and the use of all available data sources.
Generalized Geographically Weighted Regression Model within a Modularized Bayesian Framework
Geographically weighted regression (GWR) models handle geographical dependence through a spatially varying coefficient model and have been widely used in applied science, but their Bayesian extension…
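
For reference, the spatially varying coefficient model underlying GWR (standard notation, assumed here rather than taken from this paper) is

    y_i = \beta_0(u_i, v_i) + \sum_{k=1}^{p} \beta_k(u_i, v_i)\, x_{ik} + \varepsilon_i,

where (u_i, v_i) are the coordinates of observation i, so each regression coefficient is a function of location rather than a global constant.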
Variational inference for cutting feedback in misspecified models
Bayesian analyses combine information represented by different terms in a joint Bayesian model. When one or more of the terms is misspecified, it can be helpful to restrict the use of information…

References

Showing 1-10 of 39 references
Robust Bayesian Inference via Coarsening
TLDR: This work introduces a novel approach to Bayesian inference that improves robustness to small departures from the model: rather than conditioning on the event that the observed data are generated by the model, one conditions on the event that the model generates data close to the observed data, in a distributional sense.
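
In symbols (a schematic of the coarsening idea, with notation assumed): the usual conditioning event X_{1:n} = x_{1:n} is relaxed to a neighbourhood,

    \pi(\theta \mid X_{1:n} = x_{1:n})
    \;\longrightarrow\;
    \pi\big(\theta \mid d(X_{1:n}, x_{1:n}) < \epsilon\big),

where d is a discrepancy between the empirical distributions of the observed and model-generated data and ε sets the tolerated departure.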
An Adaptive Exchange Algorithm for Sampling From Distributions With Intractable Normalizing Constants
Sampling from the posterior distribution for a model whose normalizing constant is intractable is a long-standing problem in statistical research. We propose a new algorithm, adaptive auxiliary…
MCMC for Doubly-intractable Distributions
TLDR: This paper provides a generalization of Møller et al. (2004) and a new MCMC algorithm, which obtains better acceptance probabilities for the same amount of exact sampling, and removes the need to estimate model parameters before sampling begins.
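
A minimal sketch of the basic single-auxiliary-variable exchange move for a doubly-intractable target p(θ | y) ∝ q(y | θ) π(θ) / Z(θ); the helper names (log_q, log_prior, simulate, propose) are placeholders, and this shows the standard building block rather than the generalization proposed in that paper:

    import numpy as np

    def exchange_step(theta, y, log_q, log_prior, simulate, propose, rng):
        # log_q(x, t):   log of the unnormalised likelihood q(x | t)
        # log_prior(t):  log prior density
        # simulate(t):   exact draw x ~ q(. | t) / Z(t)
        # propose(t):    draw from a symmetric proposal r(. | t)
        theta_new = propose(theta)
        x = simulate(theta_new)  # auxiliary data generated at the proposed value
        # The intractable constants Z(theta) and Z(theta_new) cancel in this ratio:
        log_alpha = (log_q(y, theta_new) + log_prior(theta_new) + log_q(x, theta)
                     - log_q(y, theta) - log_prior(theta) - log_q(x, theta_new))
        if np.log(rng.uniform()) < log_alpha:
            return theta_new
        return theta

The only requirement beyond a standard Metropolis step is exact simulation from the model at the proposed parameter, which is the requirement that adaptive variants such as the exchange algorithm above aim to relax.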
A Random-Discretization Based Monte Carlo Sampling Method and its Applications
Recently, several Monte Carlo methods, for example, Markov Chain Monte Carlo (MCMC), importance sampling and data-augmentation, have been developed for numerical sampling and integration in…
Inference from Iterative Simulation Using Multiple Sequences
The Gibbs sampler, the algorithm of Metropolis and similar iterative simulation methods are potentially very helpful for summarizing multivariate distributions. Used naively, however, iterative simulation can give misleading answers.
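
The diagnostic this paper introduces is the potential scale reduction factor; in its common form for m chains of length n, with within-chain variance W and between-chain variance B (stated from memory, notation assumed):

    \widehat{R} = \sqrt{ \widehat{\mathrm{var}}^{+} / W },
    \qquad
    \widehat{\mathrm{var}}^{+} = \frac{n-1}{n}\, W + \frac{1}{n}\, B,

with values close to 1 suggesting the chains have mixed into the same distribution.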
Semi-Modular Inference: enhanced learning in multi-modular models by tempering the influence of components
TLDR: A new family of Semi-Modular Inference (SMI) schemes is written down, indexed by an influence parameter, with Bayesian inference and cut models as special cases, along with a meta-learning criterion and estimation procedure for choosing the inference scheme.
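
Schematically (a simplified rendering of SMI; the full construction uses an auxiliary copy of θ, omitted here), the influence parameter η ∈ [0, 1] tempers the suspect module's likelihood:

    p_{\eta}(\varphi, \theta \mid Y, Z) \;\propto\;
    p(Z \mid \varphi)\, p(Y \mid \theta, \varphi)^{\eta}\, p(\varphi)\, p(\theta),

so η = 0 stops Y from informing φ, as in the cut model, while η = 1 recovers full Bayesian inference.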
Unbiased Markov chain Monte Carlo methods with couplings
Markov chain Monte Carlo (MCMC) methods provide consistent approximations of integrals as the number of iterations goes to infinity. MCMC estimators are generally biased after any fixed number of iterations…
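
The construction in that paper pairs two chains (X_t) and (Y_t) with the same marginal kernel, coupled so that X_t = Y_{t-1} for all t beyond a random meeting time τ. For a burn-in k, the estimator (notation assumed)

    H_k(X, Y) = h(X_k) + \sum_{t = k+1}^{\tau - 1} \big( h(X_t) - h(Y_{t-1}) \big)

is then unbiased for the posterior expectation of h, so averaging independent replicates gives parallelizable, bias-free estimates.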
Bayesian fractional posteriors
We consider the fractional posterior distribution that is obtained by updating a prior distribution via Bayes theorem with a fractional likelihood function, a usual likelihood function raised to a fractional power.
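
In symbols (standard notation, assumed here): for a fractional power α ∈ (0, 1),

    \pi_{\alpha}(\theta \mid x_{1:n}) \;\propto\;
    \Big( \prod_{i=1}^{n} p(x_i \mid \theta) \Big)^{\alpha} \pi(\theta),

which down-weights the likelihood relative to the prior and is a common device for robustness to misspecification.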
Bayesian Inference in the Presence of Intractable Normalizing Functions
  • Jaewoo Park, M. Haran
  • Mathematics, Computer Science
    Journal of the American Statistical Association
  • 2018
TLDR: This study compares and contrasts the computational and statistical efficiency of these algorithms, discusses their theoretical bases, and provides practical recommendations for practitioners along with directions for future research for Markov chain Monte Carlo methodologists.
An adaptive Metropolis algorithm
A proper choice of a proposal distribution for Markov chain Monte Carlo methods, for example for the Metropolis-Hastings algorithm, is well known to be a crucial factor for the convergence of the algorithm.
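
The proposal covariance in this algorithm is built from the chain's own history; with the commonly cited scaling (stated from memory),

    C_t = s_d \,\mathrm{cov}(X_0, \ldots, X_{t-1}) + s_d\, \epsilon\, I_d,
    \qquad s_d = (2.4)^2 / d,

where d is the dimension and ε > 0 is a small constant keeping C_t positive definite.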