Dimension-independent likelihood-informed MCMC

@article{Cui2016DimensionindependentLM,
  title={Dimension-independent likelihood-informed MCMC},
  author={T. Cui and K. Law and Y. Marzouk},
  journal={J. Comput. Phys.},
  year={2016},
  volume={304},
  pages={109-137}
}
Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well…
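
As a rough illustration of the operator-weighted idea, the following sketch applies pCN-style updates with different step sizes inside and outside a likelihood-informed subspace. The whitened coordinates, the orthonormal basis `V`, and the two step sizes are assumptions made for the example; this is a minimal instance of the general proposal class, not the paper's exact construction.

```python
import numpy as np

def operator_weighted_step(u, neg_log_like, V, beta_lis=0.2, beta_cs=0.8,
                           rng=None):
    """One pCN-style step with different step sizes per subspace (sketch).

    Assumes whitened coordinates (prior N(0, I)) and an orthonormal basis
    V whose columns span the likelihood-informed subspace; beta_lis and
    beta_cs are the step sizes inside and outside that subspace.
    """
    rng = np.random.default_rng() if rng is None else rng
    xi = rng.standard_normal(u.shape)
    Pu, Pxi = V @ (V.T @ u), V @ (V.T @ xi)   # likelihood-informed parts

    def pcn(beta, x, z):                      # prior-preserving pCN map
        return np.sqrt(1.0 - beta**2) * x + beta * z

    v = pcn(beta_lis, Pu, Pxi) + pcn(beta_cs, u - Pu, xi - Pxi)
    # The proposal preserves the prior, so the Metropolis-Hastings ratio
    # reduces to the likelihood difference.
    log_alpha = neg_log_like(u) - neg_log_like(v)
    return v if np.log(rng.uniform()) < log_alpha else u
```

Taking a cautious step in the informed directions and a near-prior step in the complement is what keeps the acceptance rate stable as the discretization, and hence the dimension, is refined.
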
Localization for MCMC: sampling high-dimensional posterior distributions with local structure
This work investigates how ideas from covariance localization in numerical weather prediction can be used in Markov chain Monte Carlo (MCMC) sampling of high-dimensional posterior distributions arising in Bayesian inverse problems, and discusses the notion of high dimensionality in local problems, which is different from the usual notion of high dimensionality in function-space MCMC.
Localization for MCMC: sampling high-dimensional posterior distributions with banded structure
We investigate how ideas from covariance localization in numerical weather prediction can be used to construct effective Markov chain Monte Carlo (MCMC) methods for sampling high-dimensional…
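
For intuition about the localization ingredient these two papers import, here is a minimal sketch of covariance localization itself: an empirical covariance is tapered by an elementwise (Schur) product with a compactly supported correlation function, zeroing spurious long-range correlations. The 1-D grid, the Gaspari-Cohn taper, and the radius are illustrative assumptions; the papers build MCMC proposals on top of this idea rather than this exact code.

```python
import numpy as np

def gaspari_cohn(r):
    """Gaspari-Cohn 5th-order taper; r = distance / localization radius."""
    r = np.abs(np.asarray(r, dtype=float))
    out = np.zeros_like(r)
    near, far = r <= 1.0, (r > 1.0) & (r <= 2.0)
    x = r[near]
    out[near] = -0.25*x**5 + 0.5*x**4 + 0.625*x**3 - (5/3)*x**2 + 1.0
    x = r[far]
    out[far] = (x**5)/12 - 0.5*x**4 + 0.625*x**3 + (5/3)*x**2 - 5*x \
               + 4 - (2/3)/x
    return out

# Taper an empirical covariance on a 1-D grid (illustrative).
rng = np.random.default_rng(0)
n, radius = 50, 5.0
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
C_emp = np.cov(rng.standard_normal((200, n)), rowvar=False)
C_loc = C_emp * gaspari_cohn(dist / radius)   # Schur (elementwise) product
```
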
MALA-within-Gibbs Samplers for High-Dimensional Distributions with Sparse Conditional Structure
It is shown that the acceptance ratio and step size of this MCMC sampler are independent of the overall problem dimension when (i) the target distribution has sparse conditional structure, and (ii) this structure is reflected in the partial updating strategy of MALA-within-Gibbs.
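
A minimal sketch of the partial-updating strategy, assuming generic `log_post` and `grad_log_post` callables and a user-chosen partition of the coordinates into blocks. For simplicity the sketch evaluates the target globally; under sparse conditional structure, each block's gradient and acceptance ratio would involve only nearby coordinates, which is where the dimension independence comes from.

```python
import numpy as np

def mala_within_gibbs(x, log_post, grad_log_post, blocks, h=0.1, rng=None):
    """One sweep of MALA-within-Gibbs over a partition of coordinates."""
    rng = np.random.default_rng() if rng is None else rng
    for idx in blocks:
        g = grad_log_post(x)[idx]
        prop = x.copy()
        prop[idx] = x[idx] + 0.5*h*g + np.sqrt(h)*rng.standard_normal(len(idx))
        gp = grad_log_post(prop)[idx]
        # Log densities of the block-local Langevin proposal, both ways.
        fwd = -np.sum((prop[idx] - x[idx] - 0.5*h*g)**2) / (2.0*h)
        bwd = -np.sum((x[idx] - prop[idx] - 0.5*h*gp)**2) / (2.0*h)
        if np.log(rng.uniform()) < log_post(prop) - log_post(x) + bwd - fwd:
            x = prop
    return x
```
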
Iterative Construction of Gaussian Process Surrogate Models for Bayesian Inference
The algorithm aims to mitigate some of the hurdles faced by traditional Markov chain Monte Carlo samplers by constructing proposal probability densities that are both easy to sample from and a better approximation to the target density than a simple Gaussian proposal distribution would be.
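
The accept/reject mechanics of such a sampler reduce to a generic independence Metropolis-Hastings step; in the sketch below, `sample_q` and `log_q` stand in for a surrogate-based proposal (e.g. one built from a Gaussian process approximation of the target), and their construction, the paper's actual contribution, is left abstract.

```python
import numpy as np

def independence_mh_step(x, log_target, sample_q, log_q, rng=None):
    """Independence MH with an arbitrary proposal density q (sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    v = sample_q(rng)
    # Importance-weight ratio: target over proposal, proposed vs. current.
    log_alpha = (log_target(v) - log_q(v)) - (log_target(x) - log_q(x))
    return v if np.log(rng.uniform()) < log_alpha else x
```

The closer q tracks the target, the closer the acceptance probability is to one, which is the sense in which a good surrogate beats a fixed Gaussian proposal.
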
Robust MCMC Sampling with Non-Gaussian and Hierarchical Priors in High Dimensions
A key problem in inference for high dimensional unknowns is the design of sampling algorithms whose performance scales favourably with the dimension of the unknown. A typical setting in which these…
Fast Gibbs sampling for high-dimensional Bayesian inversion
Solving ill-posed inverse problems by Bayesian inference has recently attracted considerable attention. Compared to deterministic approaches, the probabilistic representation of the solution by the…
Dimension-Robust MCMC in Bayesian Inverse Problems
This article introduces a framework for efficient MCMC sampling in Bayesian inverse problems that capitalizes upon two fundamental ideas in MCMC: non-centred parameterisations of hierarchical models and dimension-robust samplers for latent Gaussian processes.
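
A toy sketch of the non-centred idea under the simplest possible hierarchy, x = sigma * u with u ~ N(0, I) a priori: because the latent field keeps a fixed N(0, I) prior for every hyperparameter value, a dimension-robust pCN update of u can be alternated with a scalar update of log sigma. The one-parameter hierarchy and all names are illustrative assumptions, not the paper's full framework.

```python
import numpy as np

def ncp_step(u, log_sigma, Phi, log_hyperprior, beta=0.3, step=0.2, rng=None):
    """One non-centred Gibbs sweep: pCN on u, random walk on log sigma.

    Phi is the negative log-likelihood of x = exp(log_sigma) * u, and
    log_hyperprior is the log density of log_sigma (assumed interfaces).
    """
    rng = np.random.default_rng() if rng is None else rng
    s = np.exp(log_sigma)
    # (i) pCN on the whitened latent field; preserves its N(0, I) prior.
    v = np.sqrt(1.0 - beta**2) * u + beta * rng.standard_normal(u.shape)
    if np.log(rng.uniform()) < Phi(s * u) - Phi(s * v):
        u = v
    # (ii) symmetric random-walk Metropolis on the hyperparameter.
    ls_new = log_sigma + step * rng.standard_normal()
    log_alpha = (Phi(s * u) - Phi(np.exp(ls_new) * u)
                 + log_hyperprior(ls_new) - log_hyperprior(log_sigma))
    if np.log(rng.uniform()) < log_alpha:
        log_sigma = ls_new
    return u, log_sigma
```
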
An adaptive independence sampler MCMC algorithm for infinite dimensional Bayesian inferences
Many scientific and engineering problems require performing Bayesian inference in function spaces, in which the unknowns are of infinite dimension. In such problems, many standard Markov chain Monte…
A TV-Gaussian prior for infinite-dimensional Bayesian inverse problems and its numerical implementations
Many scientific and engineering problems require performing Bayesian inference in function spaces, in which the unknowns are of infinite dimension. In such problems, choosing an appropriate prior…
Geometric MCMC for infinite-dimensional inverse problems
This work combines geometric methods on a finite-dimensional subspace with mesh-independent infinite-dimensional approaches to speed up MCMC mixing, while retaining robust mixing times as the dimension grows by using pCN-like methods in the complementary subspace.

References

Showing 1-10 of 57 references
A Stochastic Newton MCMC Method for Large-Scale Statistical Inverse Problems with Application to Seismic Inversion
This work addresses the solution of large-scale statistical inverse problems in the framework of Bayesian inference with a so-called stochastic Newton MCMC method.
Complexity analysis of accelerated MCMC methods for Bayesian inversion
This work studies Bayesian inversion for a model elliptic PDE with an unknown diffusion coefficient and bounds the computational complexity of 'plain' MCMC, based on combining MCMC sampling with linear-complexity multi-level solvers for elliptic PDEs.
Proposals which speed up function-space MCMC
  • K. Law
  • J. Comput. Appl. Math.
  • 2014
Two new basic methods of increasing complexity are introduced, involving characteristic-function truncation of high frequencies and Hessian information to interpolate between low and high frequencies, for function-space MCMC.
Monte Carlo Sampling Methods Using Markov Chains and Their Applications
A generalization of the sampling method introduced by Metropolis et al. (1953) is presented along with an exposition of the relevant theory, techniques of application and methods and…
MCMC Methods for Functions: Modifying Old Algorithms to Make Them Faster
Many problems arising in applications result in the need to probe a probability distribution for functions. Examples include Bayesian nonparametric statistics and conditioned diffusion processes.
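
The flagship example from this line of work is the preconditioned Crank-Nicolson (pCN) proposal. A minimal version in whitened coordinates, where the Gaussian prior is N(0, I) and `Phi` denotes the negative log-likelihood, is sketched below; because the proposal preserves the prior, the acceptance probability involves only `Phi`, which is what makes the method robust under mesh refinement.

```python
import numpy as np

def pcn_step(u, Phi, beta=0.3, rng=None):
    """One preconditioned Crank-Nicolson step (whitened coordinates)."""
    rng = np.random.default_rng() if rng is None else rng
    # Prior-preserving autoregressive proposal; beta in (0, 1].
    v = np.sqrt(1.0 - beta**2) * u + beta * rng.standard_normal(u.shape)
    # MH ratio reduces to the likelihood difference.
    return v if np.log(rng.uniform()) < Phi(u) - Phi(v) else u
```
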
A Computational Framework for Infinite-Dimensional Bayesian Inverse Problems, Part II: Stochastic Newton MCMC with Application to Ice Sheet Flow Inverse Problems
To address the challenges of sampling high-dimensional pdfs arising from Bayesian inverse problems governed by PDEs, an approximation of the stochastic Newton MCMC method is introduced in which the low-rank-based Hessian is computed at just the MAP point and then reused at each MCMC step.
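
One way to sketch the compute-once-and-reuse idea: in whitened coordinates, approximate the data-misfit Hessian at the MAP point by its leading eigenpairs (lam, V) and use the Gaussian N(u_map, H^{-1}) with H = I + V diag(lam) V^T as a fixed proposal. The whitening, the names, and the low-rank interface are assumptions for the example, not the paper's implementation.

```python
import numpy as np

def make_map_gaussian_proposal(u_map, V, lam):
    """Fixed Gaussian proposal from a low-rank Hessian at the MAP (sketch).

    V has orthonormal columns (leading eigenvectors of the misfit Hessian
    in whitened coordinates) and lam the matching eigenvalues, so that
    H = I + V diag(lam) V^T and H^{-1/2} = I - V diag(d) V^T below.
    """
    d = 1.0 - 1.0 / np.sqrt(1.0 + lam)
    half_logdet = 0.5 * np.log1p(lam).sum()       # log det H^{1/2}

    def sample(rng):
        xi = rng.standard_normal(u_map.shape)
        return u_map + xi - V @ (d * (V.T @ xi))  # u_map + H^{-1/2} xi

    def logpdf(u):                                # up to an additive constant
        w = u - u_map
        Vw = V.T @ w
        return half_logdet - 0.5 * (w @ w + lam @ Vw**2)

    return sample, logpdf
```

The returned pair plugs directly into a standard independence Metropolis-Hastings accept/reject step.
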
Optimal Low-rank Approximations of Bayesian Linear Inverse Problems
Two fast approximations of the posterior mean are proposed and proved optimal with respect to a weighted Bayes risk under squared-error loss, with optimality characterized in terms of the Hessian of the negative log-likelihood and the prior precision.
Riemann manifold Langevin and Hamiltonian Monte Carlo methods
The paper proposes Metropolis adjusted Langevin and Hamiltonian Monte Carlo sampling methods defined on the Riemann manifold to resolve the shortcomings of existing Monte Carlo algorithms when…
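
For a flavor of the position-dependent preconditioning involved, here is a simplified manifold MALA sketch; it drops the metric-derivative terms of the full Riemann-manifold algorithm and relies on the Metropolis-Hastings correction for exactness. The `metric(x)` interface, returning a symmetric positive-definite matrix G(x) (e.g. expected Fisher information plus prior precision), is an assumption of the sketch.

```python
import numpy as np

def smmala_step(x, log_post, grad_log_post, metric, h=0.1, rng=None):
    """Simplified manifold MALA: Langevin proposal preconditioned by G(x)."""
    rng = np.random.default_rng() if rng is None else rng

    def prep(y):
        G = metric(y)                     # SPD metric at y
        L = np.linalg.cholesky(G)         # G = L L^T
        mu = y + 0.5 * h * np.linalg.solve(G, grad_log_post(y))
        return mu, L

    def log_q(to, mu, L):                 # log N(to; mu, h G^{-1}) + const
        r = L.T @ (to - mu)
        return np.log(np.diag(L)).sum() - (r @ r) / (2.0 * h)

    mu_x, L_x = prep(x)
    # Draw v ~ N(mu_x, h G(x)^{-1}) via v = mu_x + sqrt(h) L^{-T} xi.
    v = mu_x + np.sqrt(h) * np.linalg.solve(L_x.T, rng.standard_normal(x.shape))
    mu_v, L_v = prep(v)
    log_alpha = (log_post(v) - log_post(x)
                 + log_q(x, mu_v, L_v) - log_q(v, mu_x, L_x))
    return v if np.log(rng.uniform()) < log_alpha else x
```
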
Optimal Scaling and Diffusion Limits for the Langevin Algorithm in High Dimensions
The Metropolis-adjusted Langevin (MALA) algorithm is a sampling algorithm which makes local moves by incorporating information about the gradient of the logarithm of the target density. In this paper…
Signal processing problems on function space: Bayesian formulation, stochastic PDEs and effective MCMC methods
In this chapter we overview a Bayesian approach to a wide range of signal processing problems in which the goal is to find the signal, which is a solution of an ordinary or stochastic differential…