Hessian-based adaptive sparse quadrature for infinite-dimensional Bayesian inverse problems

@article{Chen2017HessianbasedAS,
  title={Hessian-based adaptive sparse quadrature for infinite-dimensional Bayesian inverse problems},
  author={P. Chen and U. Villa and O. Ghattas},
  journal={Computer Methods in Applied Mechanics and Engineering},
  year={2017},
  volume={327},
  pages={147--172}
}
In this work we propose and analyze a Hessian-based adaptive sparse quadrature to compute infinite-dimensional integrals with respect to the posterior distribution in the context of Bayesian inverse problems with Gaussian prior. Due to the concentration of the posterior distribution in the domain of the prior distribution, a prior-based parametrization and sparse quadrature may fail to capture the posterior distribution and lead to erroneous evaluation results. By using a parametrization based…
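The abstract's central point, that a prior-based parametrization can miss a concentrated posterior while a Hessian-based one captures it, can be sketched in one dimension. The setup below (linear forward map, noise level, quadrature order) is a hypothetical toy model for illustration, not the paper's PDE setting or its adaptive algorithm:

```python
import numpy as np

# Toy 1-D Bayesian inverse problem: Gaussian prior N(0,1), linear forward
# map G(x) = x, data y observed with small noise sigma, so the posterior
# concentrates far from the prior's bulk (the failure mode discussed above).
y, sigma = 3.0, 0.05

def neg_log_post(x):
    """Negative log of the unnormalized posterior density."""
    return 0.5 * ((y - x) / sigma) ** 2 + 0.5 * x ** 2

# Laplace approximation via Newton's method with finite differences:
# MAP point x_map and Hessian of neg_log_post at x_map.
x_map, h = 0.0, 1e-5
for _ in range(20):
    grad = (neg_log_post(x_map + h) - neg_log_post(x_map - h)) / (2 * h)
    hess = (neg_log_post(x_map + h) - 2 * neg_log_post(x_map)
            + neg_log_post(x_map - h)) / h ** 2
    x_map -= grad / hess

nodes, weights = np.polynomial.hermite.hermgauss(20)  # weight exp(-t^2)

def posterior_mean(center, scale):
    """Gauss-Hermite estimate of E[x] under x = center + scale*sqrt(2)*t."""
    x = center + scale * np.sqrt(2.0) * nodes
    # Unnormalized posterior, stabilized by its value at the MAP; the
    # exp(t^2) factor cancels the Gaussian weight built into the rule.
    g = weights * np.exp(nodes ** 2 - (neg_log_post(x) - neg_log_post(x_map)))
    return np.sum(g * x) / np.sum(g)

mean_prior = posterior_mean(0.0, 1.0)            # prior-based parametrization
mean_hess = posterior_mean(x_map, hess ** -0.5)  # Hessian-based parametrization
true_mean = y / (1.0 + sigma ** 2)               # exact for this linear-Gaussian case
print(abs(mean_prior - true_mean), abs(mean_hess - true_mean))
```

With the noise level chosen here the posterior has standard deviation near 0.05 while the prior-based rule spaces its nodes roughly 0.7 apart, so the prior-based estimate essentially snaps to the nearest quadrature node; recentering at the MAP and rescaling by the inverse square root of the Hessian resolves the posterior and recovers the mean.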


On the convergence of the Laplace approximation and noise-level-robustness of Laplace-based Monte Carlo methods for Bayesian inverse problems
The Bayesian approach to inverse problems provides a rigorous framework for the incorporation and quantification of uncertainties in measurements, parameters and models, and this work shows that Laplace-based importance sampling and Laplace-based quasi-Monte Carlo methods are robust w.r.t. the concentration of the posterior for large classes of posterior distributions and integrands.
Stein variational reduced basis Bayesian inversion
A Stein variational reduced basis method (SVRB) to solve large-scale PDE-constrained Bayesian inverse problems, which develops an adaptive and goal-oriented model reduction technique based on reduced basis approximations for the evaluation of the potential and its gradient.
Multilevel adaptive sparse Leja approximations for Bayesian inverse problems
The proposed multilevel adaptive sparse Leja algorithm is applied in numerical experiments involving elliptic inverse problems in 2D and 3D space, in which it is compared with Markov chain Monte Carlo sampling and a standard multilevel approximation.
Projected Stein Variational Newton: A Fast and Scalable Bayesian Inference Method in High Dimensions
A fast and scalable variational method for Bayesian inference in high-dimensional parameter spaces, called the projected Stein variational Newton (pSVN) method; the work demonstrates fast convergence of the proposed method and its scalability with respect to the number of parameters, samples, and processor cores.
Multilevel Adaptive Sparse Leja Approximations
Deterministic interpolation and quadrature methods are often unsuitable to address Bayesian inverse problems depending on computationally expensive forward mathematical models. While interpolation…
Non-asymptotic error estimates for the Laplace approximation in Bayesian inverse problems
This work proves novel error estimates for a given noise level that also quantify the effect due to the nonlinearity of the forward mapping and the dimension of the problem, and provides insight into Bayesian inference in nonlinear inverse problems, where linearization of the forward mapping has suitable approximation properties.
Projected Wasserstein gradient descent for high-dimensional Bayesian inference
This work formulates a projected Wasserstein gradient flow and analyzes its convergence property under mild assumptions, and illustrates the accuracy, convergence, and complexity scalability of pWGD with respect to parameter dimension, sample size, and processor cores.
Sparse Quadrature for High-Dimensional Integration with Gaussian Measure
In this work we analyze the dimension-independent convergence property of an abstract sparse quadrature scheme for numerical integration of functions of high-dimensional parameters with Gaussian…
Hessian-based sampling for high-dimensional model reduction
This work develops a Hessian-based sampling method for the construction of goal-oriented reduced order models with high-dimensional parameter inputs, which leads to much smaller errors of the reduced basis approximation for the QoI compared to random sampling, for a diffusion equation with random input obeying either uniform or Gaussian distributions.
Taylor approximation and variance reduction for PDE-constrained optimal control under uncertainty
A scalable computational framework for the solution of PDE-constrained optimal control problems under high-dimensional uncertainty, which uses the Lagrangian formalism to derive expressions for the gradient with respect to the control and applies a gradient-based optimization method to solve the problem.

References

Showing 1–10 of 33 references
Optimal Low-rank Approximations of Bayesian Linear Inverse Problems
Two fast approximations of the posterior mean are proposed and proved optimal with respect to a weighted Bayes risk under squared-error loss, based on the Hessian of the negative log-likelihood and the prior precision.
Sparse deterministic approximation of Bayesian inverse problems
We present a parametric deterministic formulation of Bayesian inverse problems with an input parameter from infinite-dimensional, separable Banach spaces. In this formulation, the forward problems…
Sparse-grid, reduced-basis Bayesian inversion: Nonaffine-parametric nonlinear equations
  • Peng Chen, C. Schwab
  • Mathematics, Computer Science
  • J. Comput. Phys.
  • 2016
The present work generalizes [49] to account for the impact of the PG discretization in the forward maps on the convergence rates of the Quantities of Interest (QoI), and proposes to accelerate Bayesian estimation by first constructing, offline, reduced basis surrogates of the Bayesian posterior density.
Sparse-grid, reduced-basis Bayesian inversion
We analyze reduced basis (RB) acceleration of recently proposed sparse Bayesian inversion algorithms for partial differential equations with uncertain distributed parameter…
Sparse, adaptive Smolyak quadratures for Bayesian inverse problems
Based on the parametric deterministic formulation of Bayesian inverse problems with unknown input parameter from infinite-dimensional, separable Banach spaces proposed in Schwab and Stuart (2012)…
Complexity analysis of accelerated MCMC methods for Bayesian inversion
This work studies Bayesian inversion for a model elliptic PDE with an unknown diffusion coefficient and bounds the computational complexity of 'plain' MCMC, based on combining MCMC sampling with linear-complexity multilevel solvers for elliptic PDEs.
An Adaptive Sparse Grid Algorithm for Elliptic PDEs with Lognormal Diffusion Coefficient
In this work we build on the classical adaptive sparse grid algorithm (T. Gerstner and M. Griebel, Dimension-adaptive tensor-product quadrature), obtaining an enhanced version capable of using…
Sparse Quadrature for High-Dimensional Integration with Gaussian Measure
In this work we analyze the dimension-independent convergence property of an abstract sparse quadrature scheme for numerical integration of functions of high-dimensional parameters with Gaussian…
Dimension-independent likelihood-informed MCMC
This work introduces a family of Markov chain Monte Carlo samplers that can adapt to the particular structure of a posterior distribution over functions, which may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure.
A Computational Framework for Infinite-Dimensional Bayesian Inverse Problems, Part II: Stochastic Newton MCMC with Application to Ice Sheet Flow Inverse Problems
To address the challenges of sampling high-dimensional pdfs arising from Bayesian inverse problems governed by PDEs, an approximation of the stochastic Newton MCMC method is introduced in which the low-rank-based Hessian is computed at just the MAP point, and then reused at each MCMC step.