Accelerating MCMC with active subspaces

@inproceedings{Constantine2015AcceleratingMW,
  title={Accelerating MCMC with active subspaces},
  author={Paul G. Constantine and Carson Kent and Tan Bui-Thanh},
  year={2015}
}
The Markov chain Monte Carlo (MCMC) method is the computational workhorse for Bayesian inverse problems. However, MCMC struggles in high-dimensional parameter spaces, since its iterates must sequentially explore the high-dimensional space. This struggle is compounded in physical applications when the nonlinear forward model is computationally expensive. One approach to accelerate MCMC is to reduce the dimension of the state space. Active subspaces are part of an emerging set of tools for… 
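The active-subspace idea sketched in the abstract — find the few directions along which the log-likelihood varies most, then run MCMC only on those — can be illustrated with a small NumPy sketch. Everything below (the quadratic log-likelihood, the standard normal prior, the sample counts) is an illustrative assumption, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 10

# Toy log-likelihood f(x) = -0.5 x^T A x whose A has two dominant
# eigenvalues, so f varies mostly along a 2-dimensional subspace.
Q = np.linalg.qr(rng.normal(size=(dim, dim)))[0]
eigs = np.array([100.0, 50.0] + [0.01] * (dim - 2))
A = Q @ np.diag(eigs) @ Q.T

# Monte Carlo estimate of C = E[grad f grad f^T] over a standard normal prior.
samples = rng.normal(size=(1000, dim))
grads = -samples @ A                 # grad f(x) = -A x (A is symmetric)
C = grads.T @ grads / len(samples)

# Eigendecomposition of C: a large gap after the k-th eigenvalue signals a
# k-dimensional active subspace spanned by the leading eigenvectors W1.
w, V = np.linalg.eigh(C)
w, V = w[::-1], V[:, ::-1]           # sort descending
k = 2
W1 = V[:, :k]                        # MCMC can then run on y = W1^T x
```

With this construction, a chain exploring the 2-dimensional coordinates `y = W1.T @ x` captures nearly all of the likelihood's variation, which is the source of the acceleration the abstract describes.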

Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov Chain Monte Carlo

  • Shiwei Lan
  • Computer Science, Mathematics
    J. Comput. Phys.
  • 2019

Randomized Maximum Likelihood via High-Dimensional Bayesian Optimization

This work proposes a new methodology for tackling the RML optimization problem based on the high-dimensional Bayesian optimization literature, and demonstrates the benefits of the methodology in comparison with the solutions obtained by alternative optimization methods on a variety of synthetic and real-world problems, including medical and fluid dynamics applications.

Applications of Bayesian computational statistics and modeling to large-scale geoscientific problems

The particular geoscientific problems considered are finding the spatio-temporal distribution of atmospheric carbon dioxide based on sparse remote sensing data, quantifying uncertainties in modeling methane emissions from boreal wetlands, analyzing and quantifying the effect of climate change on growing season in the boreal region, and using statistical methods to calibrate a terrestrial ecosystem model.

Bayesian model calibration on active subspaces

A Delayed Rejection Adaptive Metropolis algorithm is employed to infer parameter distributions on the active subspace and then map these distributions back to the full space.

Conditioning by Projection for the Sampling from Prior Gaussian Distributions

A Bayesian statistical framework with a preconditioned Markov Chain Monte Carlo (MCMC) algorithm is developed for the solution of the inverse problem of absolute permeability characterization, and a new method is presented to condition Gaussian fields to available sparse measurements.

References

Showing 1–10 of 31 references.

Likelihood-informed dimension reduction for nonlinear inverse problems

The intrinsic dimensionality of an inverse problem is affected by prior information, the accuracy and number of observations, and the smoothing properties of the forward operator. From a Bayesian…

Accelerating Markov Chain Monte Carlo Simulation by Differential Evolution with Self-Adaptive Randomized Subspace Sampling

The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high-dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches.
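The differential-evolution idea behind DREAM — propose jumps along the difference of two other chains' states — can be sketched in the spirit of ter Braak's DE-MC, a precursor of DREAM. The target density, chain count, and tuning values below are illustrative assumptions, not the DREAM algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(x):
    return -0.5 * np.sum(x**2)  # standard normal target (assumed for demo)

n_chains, dim, n_iter = 10, 2, 3000
gamma = 2.38 / np.sqrt(2 * dim)        # standard DE-MC jump scale
X = rng.normal(size=(n_chains, dim))   # current states of all chains
logp = np.array([log_post(x) for x in X])
draws = []

for _ in range(n_iter):
    for i in range(n_chains):
        # pick two distinct other chains r1 != r2, both != i
        r1, r2 = rng.choice([j for j in range(n_chains) if j != i], 2, replace=False)
        prop = X[i] + gamma * (X[r1] - X[r2]) + 1e-4 * rng.normal(size=dim)
        lp = log_post(prop)
        if np.log(rng.uniform()) < lp - logp[i]:   # Metropolis accept/reject
            X[i], logp[i] = prop, lp
    draws.append(X.copy())

draws = np.concatenate(draws[n_iter // 2:])  # discard burn-in halves
```

Because the jump direction is learned from the population of chains, the proposal automatically adapts to the scale and orientation of the target, which is the self-adaptive subspace sampling the summary refers to.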

Riemann manifold Langevin and Hamiltonian Monte Carlo methods

  • M. Girolami, B. Calderhead
  • Computer Science
    Journal of the Royal Statistical Society: Series B (Statistical Methodology)
  • 2011
The methodology proposed automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density, and substantial improvements in the time‐normalized effective sample size are reported when compared with alternative sampling approaches.

Solving large-scale PDE-constrained Bayesian inverse problems with Riemann manifold Hamiltonian Monte Carlo

This work forms the Gauss–Newton Hessian at the maximum a posteriori point and uses it as a fixed constant metric tensor throughout RMHMC simulation, which eliminates the need for the computationally costly differential geometric Christoffel symbols, and greatly reduces computational effort at a corresponding loss of sampling efficiency.

Optimal Low-rank Approximations of Bayesian Linear Inverse Problems

Two fast approximations of the posterior mean are proposed and proved optimal with respect to a weighted Bayes risk under squared-error loss; the approximations are defined in terms of the Hessian of the negative log-likelihood and the prior precision.

Extreme-scale UQ for Bayesian inverse problems governed by PDEs

This work addresses uncertainty quantification for large-scale inverse problems in a Bayesian inference framework by exploiting the fact that the data are typically informative about low-dimensional manifolds of parameter space to construct low rank approximations of the covariance matrix of the posterior pdf via a matrix-free randomized method.

Active Subspaces - Emerging Ideas for Dimension Reduction in Parameter Studies

Scientists and engineers use computer simulations to study relationships between a model's input parameters and its outputs. However, thorough parameter studies are challenging, if not impossible, …

Erratum: Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces

This work presents a method to first detect the directions of the strongest variability using evaluations of the gradient and subsequently exploit these directions to construct a response surface on a low-dimensional subspace---i.e., the active subspace ---of the inputs.
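The response-surface step this summary describes — exploit the active directions to fit a surrogate on a low-dimensional subspace of the inputs — can be sketched as follows. The model `g`, the single active direction `W1`, and the quadratic surrogate are toy assumptions for illustration, not the paper's kriging construction.

```python
import numpy as np

rng = np.random.default_rng(2)
dim = 8

# Assume one active direction W1 has already been found; build a toy
# model g that genuinely varies only along that direction.
W1 = np.linalg.qr(rng.normal(size=(dim, 1)))[0]

def g(x):
    y = x @ W1                      # reduced coordinate y = W1^T x
    return (y**2 + y).ravel()

# Fit a cheap surrogate in the 1-D reduced coordinate instead of in
# the full 8-D input space (a quadratic stands in for kriging here).
X_train = rng.normal(size=(200, dim))
coeffs = np.polyfit((X_train @ W1).ravel(), g(X_train), deg=2)

# Evaluate the surrogate on held-out points.
X_test = rng.normal(size=(50, dim))
pred = np.polyval(coeffs, (X_test @ W1).ravel())
err = np.max(np.abs(pred - g(X_test)))
```

Because the toy model depends on the inputs only through `y`, the 1-D fit reproduces it essentially exactly; for a real simulation the same projection trades a small approximation error for a response surface that needs far fewer model evaluations.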