Accelerating MCMC with active subspaces

@inproceedings{Constantine2015AcceleratingMW,
  title={Accelerating MCMC with active subspaces},
  author={Paul G. Constantine and Carson Kent and Tan Bui-Thanh},
  year={2015}
}
The Markov chain Monte Carlo (MCMC) method is the computational workhorse for Bayesian inverse problems. However, MCMC struggles in high-dimensional parameter spaces, since its iterates must sequentially explore the high-dimensional space. This struggle is compounded in physical applications when the nonlinear forward model is computationally expensive. One approach to accelerate MCMC is to reduce the dimension of the state space. Active subspaces are part of an emerging set of tools for…
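A minimal sketch of the active-subspace construction the abstract refers to, in Python/NumPy and under illustrative assumptions (the toy quadratic misfit, the standard-normal sampling density, and all names below are not from the paper): the matrix C = E[∇f ∇fᵀ] is estimated by Monte Carlo, and its leading eigenvectors span the low-dimensional subspace on which MCMC would then operate.

```python
import numpy as np

def estimate_active_subspace(grad_f, sample_x, n_samples=1000, k=2, rng=None):
    """Monte Carlo estimate of C = E[grad f grad f^T] and its leading eigenvectors.

    grad_f   : callable x -> gradient of the scalar misfit at x
    sample_x : callable rng -> a draw of the full-dimensional parameter
    k        : number of active directions to keep
    """
    rng = np.random.default_rng(rng)
    grads = np.array([grad_f(sample_x(rng)) for _ in range(n_samples)])
    C = grads.T @ grads / n_samples            # m x m gradient outer-product matrix
    eigval, eigvec = np.linalg.eigh(C)         # eigh returns ascending eigenvalues
    order = np.argsort(eigval)[::-1]           # sort descending
    eigval, eigvec = eigval[order], eigvec[:, order]
    W1, W2 = eigvec[:, :k], eigvec[:, k:]      # active / inactive directions
    return eigval, W1, W2

# Toy example: f(x) = 0.5 * (a^T x)^2 varies only along a, so k = 1 recovers it.
m = 10
a = np.linspace(1.0, 2.0, m)
grad_f = lambda x: (a @ x) * a
sample_x = lambda rng: rng.standard_normal(m)
eigval, W1, W2 = estimate_active_subspace(grad_f, sample_x, k=1)
```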
Adaptive dimension reduction to accelerate infinite-dimensional geometric Markov Chain Monte Carlo
  • Shiwei Lan
  • Mathematics, Computer Science
  • J. Comput. Phys.
  • 2019
This work takes advantage of dimension reduction techniques to accelerate the original ∞-GMC algorithms and uses a partial spectral decomposition of the Gaussian-approximate posterior covariance operator to identify a certain number of principal eigen-directions as a basis for the intrinsic subspace.
Geometric MCMC for infinite-dimensional inverse problems
This work combines geometric methods on a finite-dimensional subspace with mesh-independent infinite-dimensional approaches to speed up MCMC mixing, while retaining robust mixing times as the dimension grows by using pCN-like methods in the complementary subspace.
Data-free likelihood-informed dimension reduction of Bayesian inverse problems
A gradient-based dimension reduction method in which the informed subspace does not depend on the data, permitting an online–offline computational strategy in which the low-dimensional structure of the problem is detected in an expensive offline phase, before the data are observed.
Bayesian model calibration on active subspaces
A Delayed Rejection Adaptive Metropolis algorithm is employed to infer parameter distributions on the active subspace and then map these distributions back to the full space.
Conditioning by Projection for the Sampling from Prior Gaussian Distributions
A Bayesian statistical framework with a preconditioned Markov chain Monte Carlo (MCMC) algorithm for the inverse problem of absolute permeability characterization, together with a new method to condition Gaussian fields to available sparse measurements.
Simulator-free solution of high-dimensional stochastic elliptic partial differential equations using deep neural networks
This work proposes a novel methodology for high-dimensional uncertainty propagation of elliptic SPDEs which lifts the requirement for a deterministic forward solver and introduces a physics-informed loss function derived from variational principles.
Forward and backward uncertainty quantification with active subspaces: Application to hypersonic flows around a cylinder
A Bayesian calibration of the freestream velocity and density is performed from measurements of the pressure and heat flux at the stagnation point of a hypersonic high-enthalpy flow around a cylinder, exploring the possibility of using stagnation heat-flux measurements, together with pressure measurements, to rebuild freestream conditions.
Efficient parameter estimation for a methane hydrate model with active subspaces
Bayesian inference of the parameters of a state-of-the-art mathematical model for methane hydrates, based on experimental data from a triaxial compression test with gas-hydrate-bearing sand, is performed efficiently by exploiting active subspaces.
Bayesian inference of random fields represented with the Karhunen–Loève expansion
The integration of data into engineering models involving uncertain and spatially varying parameters is oftentimes key to obtaining accurate predictions. Bayesian inference is effective in…
Applications of Bayesian computational statistics and modeling to large-scale geoscientific problems
Climate change is one of the most important, pressing, and farthest-reaching global challenges that humanity faces in the 21st century. Already affecting the daily lives of many directly and everyone…

References

Showing 1–10 of 32 references
Computing active subspaces with Monte Carlo
Active subspaces can effectively reduce the dimension of high-dimensional parameter studies, enabling otherwise infeasible experiments with expensive simulations. The key components of active subspace…
Dimension-independent likelihood-informed MCMC
This work introduces a family of Markov chain Monte Carlo samplers that can adapt to the particular structure of a posterior distribution over functions, and that may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure.
Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
This work considers a Bayesian approach to nonlinear inverse problems in which the unknown quantity is a spatial or temporal field, endowed with a hierarchical Gaussian process prior, and introduces truncated Karhunen–Loève expansions, based on the prior distribution, to efficiently parameterize the unknown field.
Accelerating Markov Chain Monte Carlo Simulation by Differential Evolution with Self-Adaptive Randomized Subspace Sampling
Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework.
A Stochastic Newton MCMC Method for Large-Scale Statistical Inverse Problems with Application to Seismic Inversion
This work addresses the solution of large-scale statistical inverse problems in the framework of Bayesian inference with a so-called Stochastic Newton MCMC method.
Riemann manifold Langevin and Hamiltonian Monte Carlo methods
The paper proposes Metropolis-adjusted Langevin and Hamiltonian Monte Carlo sampling methods defined on the Riemann manifold to resolve the shortcomings of existing Monte Carlo algorithms when…
Sampling the posterior: An approach to non-Gaussian data assimilation
The viewpoint taken in this paper is that data assimilation is fundamentally a statistical problem and that this problem should be cast in a Bayesian framework. In the absence of model error, the…
Optimal Low-rank Approximations of Bayesian Linear Inverse Problems
Two fast approximations of the posterior mean are proposed and proved optimal with respect to a weighted Bayes risk under squared-error loss, based on the generalized eigendecomposition of the Hessian of the negative log-likelihood and the prior precision.
Extreme-scale UQ for Bayesian inverse problems governed by PDEs
This work addresses uncertainty quantification for large-scale inverse problems in a Bayesian inference framework by exploiting the fact that the data are typically informative about low-dimensional manifolds of parameter space to construct low-rank approximations of the covariance matrix of the posterior pdf via a matrix-free randomized method.
Active Subspaces - Emerging Ideas for Dimension Reduction in Parameter Studies
Scientists and engineers use computer simulations to study relationships between a model's input parameters and its outputs. However, thorough parameter studies are challenging, if not impossible,…