Corpus ID: 229153979

Non-asymptotic error estimates for the Laplace approximation in Bayesian inverse problems

Tapio Helin and Remo Kretschmann
In this paper we study properties of the Laplace approximation of the posterior distribution arising in nonlinear Bayesian inverse problems. Our work is motivated by Schillings et al. (2020), where it is shown that in such a setting the Laplace approximation error in Hellinger distance converges to zero in the order of the noise level. Here, we prove novel error estimates for a given noise level that also quantify the effect due to the nonlinearity of the forward mapping and the dimension of…
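For readers unfamiliar with the construction, the Laplace approximation replaces the posterior with a Gaussian centred at the maximum a posteriori (MAP) estimate, with covariance given by the inverse Hessian of the negative log-posterior at that point. A minimal one-dimensional sketch follows; the forward map G(u) = u³, the data value, and the noise and prior scales are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative setup (not from the paper): nonlinear forward map,
# Gaussian observation noise, zero-mean Gaussian prior.
def G(u):
    return u ** 3

y_obs, sigma, tau = 2.0, 0.1, 1.0   # data, noise std, prior std

def neg_log_post(u):
    # Negative log-posterior up to an additive constant:
    # data misfit + Gaussian prior term.
    return 0.5 * ((G(u) - y_obs) / sigma) ** 2 + 0.5 * (u / tau) ** 2

# MAP estimate: minimiser of the negative log-posterior.
u_map = minimize(lambda v: neg_log_post(v[0]), x0=[1.0]).x[0]

# Hessian H of the negative log-posterior at the MAP, by central
# differences; the Laplace approximation is then N(u_map, H^{-1}).
h = 1e-4
H = (neg_log_post(u_map + h) - 2 * neg_log_post(u_map)
     + neg_log_post(u_map - h)) / h ** 2
laplace_var = 1.0 / H
```

Because the noise level sigma is small relative to the prior scale, the misfit dominates and u_map sits close to the noise-free solution 2^(1/3); the Laplace variance shrinks accordingly, which is the concentration effect the paper quantifies.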
1 Citation


On log-concave approximations of high-dimensional posterior measures and stability properties in non-linear inverse problems
The problem of efficiently generating random samples from high-dimensional and non-log-concave posterior measures arising from nonlinear regression problems is considered. Extending investigations…


On the convergence of the Laplace approximation and noise-level-robustness of Laplace-based Monte Carlo methods for Bayesian inverse problems
The Bayesian approach to inverse problems provides a rigorous framework for the incorporation and quantification of uncertainties in measurements, parameters and models. It is shown that Laplace-based importance sampling and Laplace-based quasi-Monte Carlo methods are robust w.r.t. the concentration of the posterior for large classes of posterior distributions and integrands.
Multivariate Laplace's approximation with estimated error and application to limit theorems
An approximation for the multivariate Laplace integral with a large parameter is obtained, and the error term is estimated in two cases: when the maximum of the exponent lies in the interior of the domain, and when it lies on the boundary.
Hessian-based adaptive sparse quadrature for infinite-dimensional Bayesian inverse problems☆
In this work we propose and analyze a Hessian-based adaptive sparse quadrature to compute infinite-dimensional integrals with respect to the posterior distribution in the context of Bayesian inverse…
On the Bernstein-Von Mises Theorem for High Dimensional Nonlinear Bayesian Inverse Problems
We prove a Bernstein-von Mises theorem for a general class of high dimensional nonlinear Bayesian inverse problems in the vanishing noise limit. We propose a sufficient condition on the growth rate…
Fast estimation of expected information gains for Bayesian experimental designs based on Laplace approximations
Shannon-type expected information gain can be used to evaluate the relevance of a proposed experiment subjected to uncertainty. The estimation of such gain, however, relies on a double-loop…
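The double-loop structure referred to here nests an inner Monte Carlo estimate of the evidence p(y) inside an outer average over simulated experiments. A sketch for a toy linear-Gaussian model (the model, sample sizes, and noise level are illustrative assumptions; for this model the exact gain is ½ log(1 + τ²/σ²)):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, tau = 0.5, 1.0      # noise std, prior std (illustrative)
N, M = 2000, 2000          # outer / inner sample sizes

def likelihood(y, theta):
    # Gaussian likelihood p(y | theta) for the toy model y = theta + noise.
    return np.exp(-0.5 * ((y - theta) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Outer loop: draw (theta_n, y_n) from the prior and the likelihood.
theta_out = tau * rng.standard_normal(N)
y = theta_out + sigma * rng.standard_normal(N)

# Inner loop: estimate the evidence p(y_n) with fresh prior samples.
theta_in = tau * rng.standard_normal(M)
evidence = likelihood(y[:, None], theta_in[None, :]).mean(axis=1)

# Expected information gain: E[log p(y | theta) - log p(y)].
eig_est = np.mean(np.log(likelihood(y, theta_out)) - np.log(evidence))
```

With these parameters the exact value is ½ log 5 ≈ 0.80; the nested estimator carries an O(1/M) bias and O(1/√N) noise, which is why the inner loop "requires a large number of" samples and why Laplace-based accelerations are attractive.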
Bernstein-von Mises Theorems and Uncertainty Quantification for Linear Inverse Problems
It is proved that semiparametric posterior estimation and uncertainty quantification are valid and optimal from a frequentist point of view, and frequentist guarantees for certain credible balls centred at $\bar{f}$ are derived.
Error bounds for multidimensional Laplace approximation
A numerical estimate is obtained for the error associated with the Laplace approximation of the double integral $I(\lambda) = \iint_D g(x, y)\, e^{-\lambda f(x, y)}\, \mathrm{d}x\, \mathrm{d}y$, where $D$ is a domain in $\mathbb{R}^2$…
A Fast and Scalable Method for A-Optimal Design of Experiments for Infinite-dimensional Bayesian Nonlinear Inverse Problems
This work constructs a Gaussian approximation to the posterior at the maximum a posteriori probability (MAP) point, and uses the resulting covariance operator to define the OED objective function, which is derived by generalizing the classical A-optimal experimental design criterion.
Error bounds for the Laplace approximation for definite integrals
Explicit error bounds are obtained for the well-known asymptotic expansion of integrals of the form $\int_a^b e^{-\lambda p(x)}\, q(x)\, \mathrm{d}x$, in which $\lambda$ is a large positive parameter and $p(x)$ and $q(x)$ are real…
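For an interior minimum $x_0$ of $p$, the leading-order Laplace approximation of such an integral is $q(x_0)\, e^{-\lambda p(x_0)} \sqrt{2\pi / (\lambda p''(x_0))}$. A quick numerical check of this formula, with illustrative choices $p(x) = x^2$ and $q(x) = \cos x$ on $[-1, 1]$:

```python
import numpy as np

lam = 50.0                      # large parameter lambda (illustrative)
p = lambda x: x ** 2            # exponent, minimised at x0 = 0, p''(x0) = 2
q = lambda x: np.cos(x)         # amplitude, q(x0) = 1

# Leading-order Laplace approximation of the integral of e^{-lam p(x)} q(x).
x0 = 0.0
approx = q(x0) * np.exp(-lam * p(x0)) * np.sqrt(2 * np.pi / (lam * 2.0))

# Reference value by a dense Riemann sum on [-1, 1]; the tails of the
# integrand are negligible (e^{-50} at the endpoints).
x = np.linspace(-1.0, 1.0, 200001)
reference = np.sum(np.exp(-lam * p(x)) * q(x)) * (x[1] - x[0])
rel_error = abs(approx - reference) / reference
```

The relative error is of order $1/\lambda$ (here about half a percent), consistent with the asymptotic expansions whose explicit error bounds these papers establish.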
Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain
In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of…