Corpus ID: 219177033

Rate-optimal refinement strategies for local approximation MCMC

Authors: Andrew D. Davis, Youssef M. Marzouk, Aaron Smith, Natesh S. Pillai
Journal: arXiv: Computation
Many Bayesian inference problems involve target distributions whose density functions are computationally expensive to evaluate. Replacing the target density with a local approximation based on a small number of carefully chosen density evaluations can significantly reduce the computational expense of Markov chain Monte Carlo (MCMC) sampling. Moreover, continual refinement of the local approximation can guarantee asymptotically exact sampling. We devise a new strategy for balancing the decay…
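As a rough illustration of the setting (a hypothetical sketch, not the authors' algorithm), the following Python snippet runs Metropolis-Hastings against a nearest-neighbor surrogate of a stand-in expensive log-density, refining the surrogate with a fixed probability per step; the rate-optimal schedules studied in the paper concern how such a refinement probability should decay:

```python
import math
import random

# Hypothetical stand-in for an expensive posterior: in practice each call
# would run a costly forward model.
def expensive_log_density(x):
    return -0.5 * x * x

class LocalSurrogate:
    """Nearest-neighbor surrogate built from a growing set of exact evaluations."""
    def __init__(self):
        self.points = []  # (location, exact log-density) pairs

    def refine(self, x):
        # Pay for one exact evaluation and store it.
        val = expensive_log_density(x)
        self.points.append((x, val))
        return val

    def evaluate(self, x):
        # Local approximation: reuse the nearest stored exact evaluation.
        nearest = min(self.points, key=lambda p: abs(p[0] - x))
        return nearest[1]

def local_approx_mcmc(n_steps, refine_prob=0.1, seed=0):
    rng = random.Random(seed)
    surrogate = LocalSurrogate()
    x = 0.0
    surrogate.refine(x)  # seed the surrogate with one exact evaluation
    chain = []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, 1.0)
        # Occasionally refine near the proposal, so the surrogate improves
        # where the chain actually visits; the refinement schedule is the
        # quantity the paper above optimizes.
        if rng.random() < refine_prob:
            surrogate.refine(y)
        log_alpha = surrogate.evaluate(y) - surrogate.evaluate(x)
        if math.log(rng.random() + 1e-300) < log_alpha:
            x = y
        chain.append(x)
    return chain, surrogate

chain, surrogate = local_approx_mcmc(2000)
```

The point of the construction is that the number of exact density evaluations grows much more slowly than the chain length, while refinement keeps the surrogate honest where the chain spends its time.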
A survey of Monte Carlo methods for noisy and costly densities with application to reinforcement learning
This survey gives an overview of Monte Carlo methodologies that use surrogate models to handle densities which are intractable, costly, and/or noisy, and presents a modular scheme encompassing the considered methods.
Efficient Bayesian inference for large chaotic dynamical systems
A likelihood function suited to chaotic models is constructed by evaluating a distribution over distances between points in the phase space; this distribution defines a summary statistic that depends on the geometry of the attractor, rather than on pointwise matching of trajectories. An inexpensive surrogate for the log-likelihood is developed via local approximation Markov chain Monte Carlo, which in simulations reduces the time required for accurate inference by orders of magnitude.
Variational Inference with NoFAS: Normalizing Flow with Adaptive Surrogate for Computationally Expensive Models
  • Yu Wang, Fang Liu, D. Schiavazzi
  • Computer Science, Mathematics
  • ArXiv
  • 2021
Normalizing Flow with Adaptive Surrogate (NoFAS) is proposed: an optimization strategy that alternately updates the normalizing flow parameters and the weights of a neural network surrogate model, together with an efficient sample weighting scheme for surrogate training that ensures some global accuracy of the surrogate while capturing the likely regions of the parameters that yield the observed data.
Context-aware surrogate modeling for balancing approximation and sampling costs in multi-fidelity importance sampling and Bayesian inverse problems
Numerical examples demonstrate that optimal, context-aware surrogate models for multi-fidelity importance sampling have lower fidelity than what is typically set as the tolerance in traditional model reduction, leading to runtime speedups of up to one order of magnitude in the presented examples.


Accelerating Asymptotically Exact MCMC for Computationally Intensive Models via Local Approximations
We construct a new framework for accelerating Markov chain Monte Carlo in posterior sampling problems where standard methods are limited by the computational cost of the likelihood, or of…
Parallel Local Approximation MCMC for Expensive Models
It is proved that samplers running in parallel can collaboratively construct a shared posterior approximation while ensuring ergodicity of each associated chain, providing a novel opportunity for exploiting parallel computation in MCMC.
Local Derivative-Free Approximation of Computationally Expensive Posterior Densities
  • N. Bliznyuk, David Ruppert, C. Shoemaker
  • Mathematics, Medicine
  • Journal of Computational and Graphical Statistics
  • 2012
A derivative-free algorithm, GRIMA, is developed to accurately approximate π by interpolation over its high-probability density (HPD) region, which is initially unknown; the method is applicable to approximation of general unnormalized posterior densities.
Optimal approximating Markov chains for Bayesian inference
The Markov chain Monte Carlo method is the dominant paradigm for posterior computation in Bayesian analysis. It is common to control computation time by making approximations to the Markov transition…
Adaptive Construction of Surrogates for the Bayesian Solution of Inverse Problems
This work presents a new approach that uses stochastic optimization to construct polynomial approximations over a sequence of distributions adaptively determined from the data, eventually concentrating on the posterior distribution.
Ergodicity of Approximate MCMC Chains with Applications to Large Data Sets
In many modern applications, difficulty in evaluating the posterior density makes performing even a single MCMC step slow. This difficulty can be caused by intractable likelihood functions, but also…
Markov chain Monte Carlo Using an Approximation
This article presents a method for generating samples from an unnormalized posterior distribution f(·) using Markov chain Monte Carlo (MCMC) in which the evaluation of f(·) is very difficult or…
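The two-stage construction in this line of work is commonly called delayed acceptance: each proposal is screened with a cheap approximation, and the expensive density is evaluated only for proposals that survive the first stage. A minimal sketch, with a hypothetical cheap/exact pair standing in for a real model:

```python
import math
import random

def cheap_log_f(x):
    # Inexpensive approximation of the log target (hypothetical).
    return -0.5 * x * x

def exact_log_f(x):
    # Exact but expensive log target; differs slightly from the approximation.
    return -0.5 * x * x - 0.05 * x ** 4

def delayed_acceptance_chain(n_steps, seed=0):
    rng = random.Random(seed)
    x, chain, exact_evals = 0.0, [], 0
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, 1.0)
        # Stage 1: screen with the cheap approximation.
        log_a1 = cheap_log_f(y) - cheap_log_f(x)
        if math.log(rng.random() + 1e-300) >= log_a1:
            chain.append(x)  # early rejection: no exact evaluation spent
            continue
        # Stage 2: correct with the exact density so the chain still
        # targets the exact posterior.
        exact_evals += 2
        log_a2 = (exact_log_f(y) - exact_log_f(x)) - log_a1
        if math.log(rng.random() + 1e-300) < log_a2:
            x = y
        chain.append(x)
    return chain, exact_evals

chain, exact_evals = delayed_acceptance_chain(1000)
```

The second-stage ratio divides out the approximation, so the composite kernel leaves the exact target invariant while early rejections save expensive evaluations.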
A stochastic collocation approach to Bayesian inference in inverse problems
We present an efficient numerical strategy for the Bayesian solution of inverse problems. Stochastic collocation methods, based on generalized polynomial chaos (gPC), are used to construct a…
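The offline/online split behind such surrogate approaches can be sketched as follows; a Lagrange interpolant through a handful of nodes stands in for a gPC surrogate here, and the target is a hypothetical placeholder (real stochastic collocation would use quadrature nodes and an orthogonal polynomial basis):

```python
import math
import random

EXPENSIVE_CALLS = [0]

def expensive_log_post(x):
    # Placeholder for a log-posterior requiring a costly forward solve.
    EXPENSIVE_CALLS[0] += 1
    return -0.5 * x * x

# Offline stage: evaluate the expensive model at a small set of nodes.
nodes = [-3.0 + 0.5 * i for i in range(13)]
values = [expensive_log_post(x) for x in nodes]

def surrogate_log_post(x):
    # Lagrange interpolation through the stored (node, value) pairs.
    total = 0.0
    for i, xi in enumerate(nodes):
        li = 1.0
        for j, xj in enumerate(nodes):
            if i != j:
                li *= (x - xj) / (xi - xj)
        total += values[i] * li
    return total

def mcmc_on_surrogate(n_steps, seed=0):
    # Online stage: MCMC touches only the cheap surrogate.
    rng = random.Random(seed)
    x, chain = 0.0, []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, 1.0)
        if math.log(rng.random() + 1e-300) < surrogate_log_post(y) - surrogate_log_post(x):
            x = y
        chain.append(x)
    return chain

chain = mcmc_on_surrogate(5000)
# All 5000 MCMC steps reuse the 13 offline model evaluations.
```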
A Hierarchical Multilevel Markov Chain Monte Carlo Algorithm with Applications to Uncertainty Quantification in Subsurface Flow
An abstract, problem-dependent theorem is given on the cost of the new multilevel estimator, based on a set of simple, verifiable assumptions; for a typical model problem in subsurface flow it shows significant gains over the standard Metropolis-Hastings estimator.
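The multilevel idea itself, stripped of the MCMC machinery, is a telescoping sum: spend many cheap evaluations on a coarse level and only a few expensive ones on the correction. A toy sketch (plain Monte Carlo rather than MCMC, with hypothetical coarse/fine models):

```python
import random

rng = random.Random(0)

def coarse_q(x):
    # Cheap level-0 quantity of interest (slightly biased).
    return x * x

def fine_q(x):
    # Expensive level-1 quantity of interest (accurate).
    return x * x + 0.01 * x

# Level 0: many cheap samples estimate E[Q_0].
n0 = 10000
level0 = sum(coarse_q(rng.gauss(0.0, 1.0)) for _ in range(n0)) / n0

# Level 1: few expensive samples estimate the small correction E[Q_1 - Q_0].
n1 = 100
xs = [rng.gauss(0.0, 1.0) for _ in range(n1)]
correction = sum(fine_q(x) - coarse_q(x) for x in xs) / n1

# Telescoping estimate of E[Q_1] at a fraction of the all-fine cost.
estimate = level0 + correction
```

Because the level difference has small variance, few expensive samples suffice for the correction; this is the mechanism the multilevel cost theorem formalizes.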
Perturbation theory for Markov chains via Wasserstein distance
Perturbation theory for Markov chains addresses the question of how small differences in the transitions of Markov chains are reflected in differences between their distributions. We prove powerful and…