Stochastic approximation cut algorithm for inference in modularized Bayesian models

@article{Liu2022StochasticAC,
  title={Stochastic approximation cut algorithm for inference in modularized Bayesian models},
  author={Yang Liu and Robert J. B. Goudie},
  journal={Statistics and Computing},
  year={2022},
  volume={32}
}
Bayesian modelling enables us to accommodate complex forms of data and make a comprehensive inference, but the effect of partial misspecification of the model is a concern. One approach in this setting is to modularize the model and prevent feedback from suspect modules, using a cut model. After observing data, this leads to the cut distribution which normally does not have a closed form. Previous studies have proposed algorithms to sample from this distribution, but these algorithms have… 
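For context, in the standard two-module setting studied in this line of work (one module linking a parameter \varphi to data Z, and a second, possibly misspecified, module linking (\varphi, \theta) to data Y; this notation is assumed here, not quoted from the truncated abstract), the cut distribution takes the form

p_{\mathrm{cut}}(\varphi, \theta \mid Z, Y) \;=\; p(\varphi \mid Z)\, p(\theta \mid \varphi, Y), \qquad p(\theta \mid \varphi, Y) \;=\; \frac{p(Y \mid \varphi, \theta)\, p(\theta \mid \varphi)}{\int p(Y \mid \varphi, \theta')\, p(\theta' \mid \varphi)\, \mathrm{d}\theta'},

so that feedback from Y into \varphi is blocked. The \varphi-dependent normalizing integral in the second factor is what typically rules out a closed form and complicates standard joint MCMC.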
Cutting feedback and modularized analyses in generalized Bayesian inference
TLDR
This work examines cutting feedback methods in the context of generalized posterior distributions, i.e., posteriors built from arbitrary loss functions, and provides novel results on their behaviour.
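For reference, the generalized posteriors referred to above are built from an arbitrary loss \ell and a learning rate \omega in place of a likelihood; a sketch of the standard form (background only, not quoted from the summary) is

\pi_{\omega}(\theta \mid x_{1:n}) \;\propto\; \pi(\theta)\, \exp\!\Big( -\omega \sum_{i=1}^{n} \ell(\theta, x_i) \Big),

which reduces to the usual posterior when \ell is the negative log-likelihood and \omega = 1.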
Efficient Bayesian estimation and use of cut posterior in semiparametric hidden Markov models
We consider the problem of estimation in Hidden Markov models with finite state space and nonparametric emission distributions. Efficient estimators for the transition matrix are exhibited, and a…
Interoperability of Statistical Models in Pandemic Preparedness: Principles and Reality
TLDR
"Interoperability" is presented as a guiding framework for statistical modelling to assist policy makers asking multiple questions using diverse datasets in the face of an evolving pandemic response through case studies for inferring spatial-temporal coronavirus disease 2019 prevalence and reproduction numbers in England.
Modularized Bayesian analyses and cutting feedback in likelihood-free inference
TLDR
A semi-modular approach to likelihood-free inference where feedback is partially cut based on Gaussian mixture approximations to the joint distribution of parameters and data summary statistics is developed.
Scalable Semi-Modular Inference with Variational Meta-Posteriors
TLDR
Variational methods for approximating the Cut and SMI posteriors which are adapted to the inferential goals of evidence combination are given and it is shown that analysis of models with multiple cuts is feasible using a new Variational Meta-Posterior.
Statistical Challenges in Tracking the Evolution of SARS-CoV-2
TLDR
The models and methods currently used to monitor the spread of SARS-CoV-2 are described, long-standing and new statistical challenges are discussed, and a method for tracking the rise of novel variants during the epidemic is proposed.
Valid belief updates for prequentially additive loss functions arising in Semi-Modular Inference
TLDR
It is shown that prequential additivity is sufficient to determine the optimal valid and order-coherent belief update and that this belief update coincides with the belief update in each of the SMI schemes.
Asymptotics of cut distributions and robust modular inference using Posterior Bootstrap
Bayesian inference provides a framework to combine an arbitrary number of model components with shared parameters, allowing joint uncertainty estimation and the use of all available data sources.
Generalized Geographically Weighted Regression Model within a Modularized Bayesian Framework
TLDR
A Bayesian GWR model is presented; its essence is dealing with partial misspecification of the model. The approach is justified via an information risk minimization approach, and the consistency of the proposed estimator is shown in terms of a geographically weighted KL divergence.
Variational inference for cutting feedback in misspecified models
TLDR
These methods are an order of magnitude faster than existing Markov chain Monte Carlo approaches for computing cut posterior distributions, and they allow for the evaluation of computationally intensive conflict checks that can be used to decide whether or not feedback should be cut.

References

Showing 1-10 of 39 references
Robust Bayesian Inference via Coarsening
TLDR
This work introduces a novel approach to Bayesian inference that improves robustness to small departures from the model: rather than conditioning on the event that the observed data are generated by the model, one conditions on the event that the model generates data close to the observed data, in a distributional sense.
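As a sketch of the coarsening idea (a standard approximation, stated here for orientation rather than quoted from the summary above): conditioning on the data-generating distribution lying within a random tolerance of the model, with an exponential prior on the tolerance and a relative-entropy discrepancy, leads approximately to a tempered posterior

\pi(\theta \mid x_{1:n}) \;\propto\; \pi(\theta) \prod_{i=1}^{n} p(x_i \mid \theta)^{\zeta}, \qquad \zeta = \frac{\alpha}{\alpha + n},

so small departures from the model are down-weighted rather than conditioned away.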
An Adaptive Exchange Algorithm for Sampling From Distributions With Intractable Normalizing Constants
TLDR
The numerical results indicate that the new algorithm, the adaptive auxiliary variable exchange algorithm, or, in short, the adaptive exchange (AEX) algorithm, is particularly useful for problems in which the underlying system is strongly dependent.
MCMC for Doubly-intractable Distributions
TLDR
This paper provides a generalization of Møller et al. (2004) and a new MCMC algorithm, which obtains better acceptance probabilities for the same amount of exact sampling, and removes the need to estimate model parameters before sampling begins.
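For orientation, the single-auxiliary-variable exchange move for a likelihood f(y \mid \theta) = q(y \mid \theta)/Z(\theta) with intractable Z(\theta) works as follows (a sketch of the standard move, not of this paper's specific generalization): propose \theta' \sim h(\cdot \mid \theta), simulate auxiliary data w \sim f(\cdot \mid \theta'), and accept with probability

\min\!\left\{ 1, \; \frac{\pi(\theta')\, q(y \mid \theta')\, q(w \mid \theta)\, h(\theta \mid \theta')}{\pi(\theta)\, q(y \mid \theta)\, q(w \mid \theta')\, h(\theta' \mid \theta)} \right\},

in which the intractable constants Z(\theta) and Z(\theta') cancel.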
A Random-Discretization Based Monte Carlo Sampling Method and its Applications
TLDR
A simple numerical sampling-based method is systematically developed, based on the concept of random discretization of the density function with respect to Lebesgue measure; the method is non-iterative, dimension-free, easy to implement and fast in computing time.
Inference from Iterative Simulation Using Multiple Sequences
TLDR
The focus is on applied inference for Bayesian posterior distributions in real problems, which often tend toward normality after transformations and marginalization, and the results are derived as normal-theory approximations to exact Bayesian inference, conditional on the observed simulations.
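The potential scale reduction diagnostic associated with this work has a standard form, sketched here for reference: with m chains of length n, within-chain variance W and between-chain variance B,

\hat{R} \;=\; \sqrt{\frac{\widehat{\mathrm{var}}^{+}}{W}}, \qquad \widehat{\mathrm{var}}^{+} \;=\; \frac{n-1}{n}\, W + \frac{1}{n}\, B,

with values near 1 indicating that the multiple sequences have mixed over the same distribution.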
Semi-Modular Inference: enhanced learning in multi-modular models by tempering the influence of components
TLDR
A new family of Semi-Modular Inference (SMI) schemes is written down, indexed by an influence parameter, with Bayesian inference and Cut-models as special cases, and a meta-learning criterion and estimation procedure to choose the inference scheme.
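As a sketch of how the influence parameter enters (the standard SMI construction, paraphrased for orientation rather than quoted): an auxiliary copy \tilde{\theta} of the suspect-module parameter is introduced, and for \eta \in [0, 1]

p_{\mathrm{smi},\eta}(\varphi, \theta, \tilde{\theta} \mid Z, Y) \;=\; p_{\mathrm{pow},\eta}(\varphi, \tilde{\theta} \mid Z, Y)\, p(\theta \mid \varphi, Y), \qquad p_{\mathrm{pow},\eta}(\varphi, \tilde{\theta} \mid Z, Y) \;\propto\; p(Z \mid \varphi)\, p(Y \mid \varphi, \tilde{\theta})^{\eta}\, p(\varphi, \tilde{\theta}),

so that \eta = 0 recovers the cut model and \eta = 1 recovers standard Bayesian inference once the auxiliary copy is marginalized out.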
Unbiased Markov chain Monte Carlo methods with couplings
TLDR
The theoretical validity of the estimators proposed and their efficiency relative to the underlying MCMC algorithms are established and the performance and limitations of the method are illustrated.
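A sketch of the estimator (the standard coupled-chains form, stated here only for orientation): two chains (X_t) and (Y_t) with the same target are run with X one step ahead and coupled so that they meet at a random time \tau and remain equal afterwards; for a burn-in k,

H_k(X, Y) \;=\; h(X_k) + \sum_{t=k+1}^{\tau-1} \big( h(X_t) - h(Y_{t-1}) \big)

is then an unbiased estimator of the stationary expectation of h, and independent replicates can be averaged in parallel.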
Bayesian fractional posteriors
We consider the fractional posterior distribution that is obtained by updating a prior distribution via Bayes theorem with a fractional likelihood function, a usual likelihood function raised to a…
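In symbols, the standard fractional posterior with exponent \alpha \in (0, 1) is

\pi_{\alpha}(\theta \mid x_{1:n}) \;\propto\; \pi(\theta) \prod_{i=1}^{n} p(x_i \mid \theta)^{\alpha},

i.e. the prior updated with the likelihood raised to a fractional power (sketched here to complete the truncated entry in spirit, not verbatim).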
Bayesian Inference in the Presence of Intractable Normalizing Functions
TLDR
This study compares and contrasts the computational and statistical efficiency of these algorithms and discusses their theoretical bases, and provides practical recommendations for practitioners along with directions for future research for Markov chain Monte Carlo methodologists.
An adaptive Metropolis algorithm
TLDR
An adaptive Metropolis (AM) algorithm is introduced, in which the Gaussian proposal distribution is updated along the process using the full information accumulated so far; it is established here that the algorithm has the correct ergodic properties.
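Since the proposal adaptation is the core idea here, a minimal numpy sketch may help; it assumes a generic log-density callable and the usual scaling s_d = 2.4^2/d with a small regularization eps, and recomputes the empirical covariance from the full history at each step for clarity rather than using the incremental update employed in practice.

import numpy as np

def adaptive_metropolis(log_density, x0, n_iter=10_000, t0=1_000, eps=1e-6, seed=0):
    # Minimal adaptive Metropolis sketch (in the spirit of Haario et al., 2001).
    # log_density: callable returning the log target density at a point.
    # x0: starting point (1-D array); t0: length of the non-adaptive initial period.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    s_d = 2.4 ** 2 / d                          # dimension-dependent scaling
    samples = np.empty((n_iter, d))
    lp = log_density(x)
    for t in range(n_iter):
        if t < t0:
            C = np.eye(d)                       # fixed proposal covariance early on
        else:
            # proposal covariance built from the full past history, regularized by eps * I
            C = s_d * np.cov(samples[:t].T) + s_d * eps * np.eye(d)
        prop = rng.multivariate_normal(x, C)
        lp_prop = log_density(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # symmetric proposal: plain Metropolis ratio
            x, lp = prop, lp_prop
        samples[t] = x
    return samples

# Example (hypothetical usage): sample a standard bivariate normal target.
draws = adaptive_metropolis(lambda z: -0.5 * z @ z, np.zeros(2), n_iter=5_000)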