Reproducing kernel Hilbert spaces of Gaussian priors

@article{Vaart2008ReproducingKH,
  title={Reproducing kernel Hilbert spaces of Gaussian priors},
  author={Aad W. van der Vaart and J. H. van Zanten},
  journal={arXiv: Functional Analysis},
  year={2008},
  volume={3},
  pages={200-222}
}
We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described through a concentration function that is expressed in the reproducing kernel Hilbert space. Absolute continuity of Gaussian measures and concentration inequalities play an important role in understanding…
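
For orientation, the concentration function referred to here is usually defined as follows (a sketch; normalization conventions vary slightly across papers). For a Gaussian random element $W$ in a Banach space with norm $\|\cdot\|$ and RKHS $(\mathbb{H}, \|\cdot\|_{\mathbb{H}})$, and a point $w_0$ in the closure of $\mathbb{H}$,

$$\varphi_{w_0}(\varepsilon) \;=\; \inf_{h \in \mathbb{H}\colon \|h - w_0\| < \varepsilon} \tfrac12 \|h\|_{\mathbb{H}}^2 \;-\; \log \Pr\bigl(\|W\| < \varepsilon\bigr),$$

combining an approximation (decentering) term and a centered small ball term.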

Consistency of Gaussian Process Regression in Metric Spaces

TLDR
This paper provides an important step towards the theoretical legitimization of GP regression on manifolds and other non-Euclidean metric spaces.

Rates of contraction of posterior distributions based on Gaussian process priors

TLDR
The rate of contraction of the posterior distribution based on sampling from a smooth density model when the prior models the log density as a (fractionally integrated) Brownian motion is shown to depend on the position of the true parameter relative to the reproducing kernel Hilbert space of the Gaussian process.
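
Schematically, the rate calculus attached to this concentration function reads (a sketch; model-dependent constants and regularity conditions are suppressed): any sequence $\varepsilon_n \to 0$ with $n\varepsilon_n^2 \to \infty$ satisfying

$$\varphi_{w_0}(\varepsilon_n) \;\le\; n \varepsilon_n^2$$

yields posterior contraction around the true parameter $w_0$ at a rate that is a multiple of $\varepsilon_n$ in the relevant statistical metric.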

Lower bounds for posterior rates with Gaussian process priors

Upper bounds for rates of convergence of posterior distributions associated to Gaussian process priors are obtained by van der Vaart and van Zanten in [14] and expressed in terms of a concentration function.

Bayesian linear inverse problems in regularity scales

We obtain rates of contraction of posterior distributions in inverse problems defined by scales of smoothness classes. We derive abstract results for general priors, with contraction rates determined

Convergence of latent mixing measures in nonparametric and mixture models

TLDR
The relationships between Wasserstein distances of mixing distributions and f -divergence functionals such as Hellinger and Kullback-Leibler distances on the space of mixture distributions are clarified using various identifiability conditions.

Adaptive inference over Besov spaces in the white noise model using $p$-exponential priors

In many scientific applications the aim is to infer a function which is smooth in some areas, but rough or even discontinuous in other areas of its domain. Such spatially inhomogeneous functions can

Adaptive estimation of multivariate functions using conditionally Gaussian tensor-product spline priors

We investigate posterior contraction rates for priors on multivariate functions that are constructed using tensor-product B-spline expansions. We prove that using a hierarchical prior with an

Finite Element Representations of Gaussian Processes: Balancing Numerical and Statistical Accuracy

TLDR
The theory implies that, under certain smoothness assumptions, one can reduce the computation and memory cost without hindering the estimation accuracy by setting n ≪ N in the large N asymptotics.

Bayesian inference with rescaled Gaussian process priors

TLDR
This work exhibits rescaled Gaussian process priors yielding posteriors that contract around the true parameter at optimal convergence rates and establishes bounds on small deviation probabilities for smooth stationary Gaussian processes.
...

References

SHOWING 1-10 OF 17 REFERENCES

Rates of contraction of posterior distributions based on Gaussian process priors

TLDR
The rate of contraction of the posterior distribution based on sampling from a smooth density model when the prior models the log density as a (fractionally integrated) Brownian motion is shown to depend on the position of the true parameter relative to the reproducing kernel Hilbert space of the Gaussian process.

Approximation, metric entropy and small ball estimates for Gaussian measures

TLDR
This work relates the small ball behavior of a Gaussian measure μ on a Banach space E with the metric entropy behavior of K_μ, the unit ball of the reproducing kernel Hilbert space of μ in E, to enable the application of tools and results from functional analysis to small ball problems.
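
Loosely stated, the kind of equivalence obtained in this line of work takes the following polynomial form (a sketch, ignoring constants and slowly varying factors): with $K_\mu$ the unit ball of the RKHS of $\mu$ and $N(\varepsilon, K_\mu, \|\cdot\|)$ its $\varepsilon$-covering number in $E$,

$$-\log \mu\bigl(x\colon \|x\| \le \varepsilon\bigr) \;\asymp\; \varepsilon^{-\alpha} \quad\Longleftrightarrow\quad \log N\bigl(\varepsilon, K_\mu, \|\cdot\|\bigr) \;\asymp\; \varepsilon^{-2\alpha/(2+\alpha)}.$$

For Brownian motion in the supremum norm, $\alpha = 2$ recovers the familiar entropy order $\varepsilon^{-1}$ for its RKHS unit ball.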

Posterior Consistency in Nonparametric Regression Problems under Gaussian Process Priors

TLDR
An extension of the theorem of Schwartz (1965) for nonidentically distributed observations is used, verifying its conditions when using Gaussian process priors for the regression function with normal or double exponential (Laplace) error distributions.
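
For context, the i.i.d. version of Schwartz's argument (stated loosely here; the paper works with its extension to independent, non-identically distributed observations) requires the prior $\Pi$ to charge Kullback–Leibler neighborhoods of the true density $p_0$,

$$\Pi\bigl(p \colon \mathrm{KL}(p_0, p) < \varepsilon\bigr) > 0 \quad\text{for every } \varepsilon > 0,$$

together with exponentially consistent tests of $p_0$ against the complement of each neighborhood of $p_0$; these two ingredients give posterior consistency.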

Metric entropy and the small ball problem for Gaussian measures

Abstract: We establish a precise link between the small ball problem for a Gaussian measure μ on a separable Banach space and the metric entropy of the unit ball of the Hilbert space H_μ generating μ.

Convergence rates of posterior distributions

We consider the asymptotic behavior of posterior distributions and Bayes estimators for infinite-dimensional statistical models. We give general results on the rate of convergence of the posterior
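
The flavor of the general result (a sketch; sieve constructions and constants are omitted): a sequence $\varepsilon_n \to 0$ with $n\varepsilon_n^2 \to \infty$ is a posterior contraction rate relative to a metric $d$ if, for suitable sets $\mathcal{P}_n$ capturing most of the prior mass,

$$\log N(\varepsilon_n, \mathcal{P}_n, d) \;\lesssim\; n\varepsilon_n^2 \qquad\text{and}\qquad \Pi\bigl(B_{\mathrm{KL}}(p_0, \varepsilon_n)\bigr) \;\gtrsim\; e^{-c\, n\varepsilon_n^2},$$

where $B_{\mathrm{KL}}(p_0, \varepsilon)$ denotes a Kullback–Leibler-type neighborhood of the true density.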

The Gaussian measure of shifted balls

Summary: Let $\mu$ be a centered Gaussian measure on a Hilbert space $H$ and let $B_R \subseteq H$ be the centered ball of radius $R > 0$. For $a \in H$ and the limit as $t \to$ …
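
A representative inequality in this circle of ideas, used repeatedly in the Gaussian prior literature (a sketch of the standard decentering bound, not necessarily the exact statement of this paper): for a centered Gaussian measure $\mu$ with Cameron–Martin space $H_\mu$, a symmetric Borel set $B$, and a shift $a \in H_\mu$,

$$\mu(B + a) \;\ge\; e^{-\frac12 \|a\|_{H_\mu}^2}\, \mu(B).$$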

Probability on Banach spaces

Topology and Normed Spaces

Functional Analysis. McGraw-Hill Book Co., New York, 1973