Corpus ID: 252211737

Adaptive inference over Besov spaces in the white noise model using $p$-exponential priors

Sergios Agapiou and Aimilia Savva
In many scientific applications the aim is to infer a function which is smooth in some areas, but rough or even discontinuous in other areas of its domain. Such spatially inhomogeneous functions can be modelled in Besov spaces with suitable integrability parameters. In this work we study adaptive Bayesian inference over Besov spaces in the white noise model, from the point of view of rates of contraction, using $p$-exponential priors, which range between Laplace and Gaussian and possess… 
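For concreteness, the coordinate distributions underlying such priors can be sketched as follows; this is the standard form for $p$-exponential measures, with the normalisation constant omitted (an assumption, not stated in the abstract):

$$ f_p(x) \propto \exp\!\left(-\frac{|x|^p}{p}\right), \qquad p \in [1, 2], $$

so that $p=1$ recovers the Laplace distribution and $p=2$ the Gaussian.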



Laplace priors and spatial inhomogeneity in Bayesian inverse problems

Spatially inhomogeneous functions, which may be smooth in some regions and rough in other regions, are modelled naturally in a Bayesian manner using so-called Besov priors, which are given by random
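The random wavelet-series construction behind such priors can be sketched in a few lines. This is a minimal illustration assuming a 1-D Haar basis, level-wise decay $2^{-l(s+1/2-1/p)}$, and only the Laplace ($p=1$) and Gaussian ($p=2$) cases; the function name and toy parameters are assumptions, not the papers' setup.

```python
import numpy as np

def besov_draw(s=1.0, p=1.0, levels=8, grid=256, seed=0):
    """One sample path of a 1-D Besov-type prior: a Haar wavelet
    series with i.i.d. p-exponential coefficients, decayed by
    2^{-l(s + 1/2 - 1/p)} at level l (sketch; only p = 1, 2 handled)."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, grid, endpoint=False)
    u = np.zeros(grid)
    for l in range(levels):
        decay = 2.0 ** (-l * (s + 0.5 - 1.0 / p))
        for k in range(2 ** l):
            t = 2.0 ** l * x - k  # rescaled argument of the mother wavelet
            # Haar wavelet psi_{l,k}: +1 on the left half, -1 on the right half
            psi = 2.0 ** (l / 2) * (np.where((0 <= t) & (t < 0.5), 1.0, 0.0)
                                    - np.where((0.5 <= t) & (t < 1.0), 1.0, 0.0))
            xi = rng.laplace() if p == 1 else rng.normal()  # Laplace (p=1) or Gaussian (p=2)
            u += decay * xi * psi
    return x, u

x, u = besov_draw()  # a rough, spatially inhomogeneous Laplace-prior draw
```

Lowering $p$ toward 1 makes large coefficients more likely, which is the mechanism behind the sparsity-promoting behaviour of these priors.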

Besov priors in density estimation: optimal posterior contraction rates and adaptation

Besov priors are nonparametric priors that model spatially inhomogeneous functions. They are routinely used in inverse problems and imaging, where they exhibit attractive sparsity-promoting and

Besov priors for Bayesian inverse problems

We consider the inverse problem of estimating a function $u$ from noisy, possibly nonlinear, observations. We adopt a Bayesian approach to the problem. This approach has a long history for inversion,

Consistency of Bayesian inference with Gaussian process priors in an elliptic inverse problem

It is shown that, as the number $N$ of measurements increases, the resulting posterior distributions concentrate around the true parameter generating the data, and a convergence rate $N^{-\lambda}$, $\lambda > 0$, is derived for the reconstruction error of the associated posterior means, in $L^2(\mathcal{O})$-distance.

Robust MCMC Sampling with Non-Gaussian and Hierarchical Priors in High Dimensions

This article studies the robustness of an MCMC algorithm for posterior inference, where robustness refers to convergence rates that do not deteriorate as the discretisation becomes finer; the algorithm borrows strength from the well-developed non-centred methodology for Bayesian hierarchical models.
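The non-centred reparametrisation referred to here can be sketched in a couple of lines; the variable names and toy dimensions below are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 100

# Centred parametrisation: u | tau ~ N(0, tau^2 I), so u and tau are
# strongly coupled a priori, which degrades MCMC mixing as dim grows.
# Non-centred parametrisation: write u = tau * xi with xi ~ N(0, I)
# a priori independent of tau, and run the sampler on (xi, tau).
tau = 2.0                   # scale hyperparameter (toy value)
xi = rng.normal(size=dim)   # white-noise coordinates, independent of tau
u = tau * xi                # recover the function-space parameter
```

Because $\xi$ and $\tau$ are a priori independent, updates of the hyperparameter no longer have to move the whole function, which is the source of the discretisation-robust mixing.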

Minimax estimation via wavelet shrinkage

A nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coefficients is developed, and variants of this method based on simple threshold nonlinear estimators are shown to be nearly minimax.
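A minimal version of such a shrinkage rule can be sketched as follows, using soft thresholding at the universal threshold $\sigma\sqrt{2\log n}$ on the empirical wavelet coefficients; the function name and toy data are illustrative.

```python
import numpy as np

def soft_threshold(coeffs, sigma, n):
    """Soft-threshold empirical wavelet coefficients at the universal
    threshold sigma * sqrt(2 log n): shrink toward zero by t, and set
    coefficients smaller than t in magnitude exactly to zero."""
    t = sigma * np.sqrt(2.0 * np.log(n))
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

# Toy usage: a sparse signal observed with unit-variance noise.
rng = np.random.default_rng(0)
theta = np.array([5.0, 0.0, 0.0, -4.0])
y = theta + rng.normal(size=4)
theta_hat = soft_threshold(y, sigma=1.0, n=4)
```

The kill-small/shrink-large behaviour is what makes the estimator adapt to spatially inhomogeneous signals: noise-level coefficients vanish while large coefficients survive.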

Wavelet thresholding via a Bayesian approach

We discuss a Bayesian formalism which gives rise to a type of wavelet threshold estimation in nonparametric regression. A prior distribution is imposed on the wavelet coefficients of the unknown

Rates of contraction of posterior distributions based on p-exponential priors

We consider a family of infinite dimensional product measures with tails between Gaussian and exponential, which we call $p$-exponential measures. We study their measure-theoretic properties and in

Analysis of the Gibbs Sampler for Hierarchical Inverse Problems

The intuition developed about the behavior of the prior hyperparameter, with and without reparametrization, is sufficiently general to cover a broad class of nonlinear inverse problems, as well as other families of hyperpriors.

Empirical Bayes scaling of Gaussian priors in the white noise model

The behavior of the random hyperparameter is characterized, and it is shown that a nonparametric Bayes method using it gives optimal recovery over a scale of regularity classes.
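A toy version of such empirical Bayes scaling can be sketched as follows: in a Gaussian sequence model where, marginally, $y_i \sim N(0, \tau^2\lambda_i + \sigma^2)$, the scale $\tau$ is chosen to maximise the marginal likelihood. The grid search and all names below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def eb_tau(y, lam, sigma2, taus):
    """Empirical Bayes prior scale: maximise the Gaussian marginal
    log-likelihood of y_i ~ N(0, tau^2 * lam_i + sigma2) over a grid."""
    ll = [-0.5 * np.sum(np.log(t ** 2 * lam + sigma2)
                        + y ** 2 / (t ** 2 * lam + sigma2))
          for t in taus]
    return taus[int(np.argmax(ll))]

# Toy check: data generated with tau = 2 should give an estimate near 2.
rng = np.random.default_rng(0)
lam = np.ones(5000)
y = 2.0 * rng.normal(size=5000)
tau_hat = eb_tau(y, lam, sigma2=1e-8, taus=np.linspace(0.5, 4.0, 71))
```

Plugging the selected $\hat\tau$ back into the posterior yields the data-driven scaling whose recovery properties the paper analyses.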