Rates of contraction of posterior distributions based on Gaussian process priors

@article{Vaart2008RatesOC,
  title={Rates of contraction of posterior distributions based on Gaussian process priors},
  author={Aad W. van der Vaart and J. H. van Zanten},
  journal={Annals of Statistics},
  year={2008},
  volume={36},
  pages={1435-1463}
}
We derive rates of contraction of posterior distributions on nonparametric or semiparametric models based on Gaussian processes. The rate of contraction is shown to depend on the position of the true parameter relative to the reproducing kernel Hilbert space of the Gaussian process and the small ball probabilities of the Gaussian process. We determine these quantities for a range of examples of Gaussian priors and in several statistical settings. For instance, we consider the rate of… 
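
For reference, the key quantity in this analysis is the concentration function of the Gaussian prior; a minimal sketch of its standard definition (notation assumed here for illustration: $W$ the Gaussian process prior, $\mathbb{H}$ its RKHS with norm $\|\cdot\|_{\mathbb{H}}$, $w_0$ the true parameter, $\|\cdot\|$ the relevant sample-space norm):

\[
\varphi_{w_0}(\varepsilon) \;=\; \inf_{h \in \mathbb{H}:\, \|h - w_0\| \le \varepsilon} \tfrac{1}{2}\|h\|_{\mathbb{H}}^2 \;-\; \log \Pr\bigl(\|W\| < \varepsilon\bigr).
\]

The first term measures how well $w_0$ can be approximated from the RKHS; the second is the small ball exponent. The posterior then contracts at any rate $\varepsilon_n$ satisfying $\varphi_{w_0}(\varepsilon_n) \le n\varepsilon_n^2$, up to constants and setting-specific modifications.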

Posterior contraction rates for constrained deep Gaussian processes in density estimation and classification

TLDR
This work provides posterior contraction rates for constrained deep Gaussian processes in nonparametric density estimation and classification, with applications to integrated Brownian motions, Riemann–Liouville processes, and Matérn processes, and to standard smoothness classes of functions.

Lower bounds for posterior rates with Gaussian process priors

Upper bounds for rates of convergence of posterior distributions associated with Gaussian process priors are obtained by van der Vaart and van Zanten in [14] and expressed in terms of a concentration function.

Posterior consistency via precision operators for Bayesian nonparametric drift estimation in SDEs

TLDR
A Bayesian approach to nonparametric estimation of the periodic drift function of a one-dimensional diffusion from continuous-time data is studied, and the rate at which the posterior contracts around the true drift function is bounded.

Information Rates of Nonparametric Gaussian Process Methods

TLDR
The results show that, for good performance, the regularity of the GP prior should match the regularity of the unknown response function, and that the attainable rate is expressible in terms of a certain concentration function.
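
A hedged illustration of this regularity matching (smoothness notation assumed here, not taken from the snippet above): for a Matérn prior of regularity $\alpha$ and a response function of smoothness $\beta$ on a $d$-dimensional domain, the contraction rate is of the order

\[
\varepsilon_n \asymp n^{-\min(\alpha,\beta)/(2\alpha + d)},
\]

which coincides with the minimax rate $n^{-\beta/(2\beta + d)}$ exactly when $\alpha = \beta$, and is strictly slower when the prior is either too rough or too smooth.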

Non parametric Bayesian drift estimation for one-dimensional diffusion processes

We consider diffusions on the circle and establish a Bayesian estimator for the drift function based on observing the local time and using Gaussian priors. Given a standard Girsanov likelihood, we …

Bayesian inference with rescaled Gaussian process priors

TLDR
This work exhibits rescaled Gaussian process priors yielding posteriors that contract around the true parameter at optimal convergence rates and establishes bounds on small deviation probabilities for smooth stationary Gaussian processes.
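
As a rough sketch of the rescaling idea (the parameters below are illustrative assumptions, not taken from the snippet): starting from a smooth stationary process $G$, one uses the time-rescaled prior $W_t = G_{t/c_n}$ with a bandwidth $c_n \to 0$; a calibration of the form $c_n \asymp n^{-1/(2\beta+1)}$ for a $\beta$-smooth truth is the kind of choice that yields contraction at the near-minimax rate $n^{-\beta/(2\beta+1)}$, up to logarithmic factors.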

A Theoretical Framework for Bayesian Nonparametric Regression: Orthonormal Random Series and Rates of Contraction

We develop a unifying framework for Bayesian nonparametric regression to study the rates of contraction with respect to the integrated $L_2$-distance without assuming the regression function space to …
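
The random series device referred to here can be sketched as follows (notation assumed for illustration): the regression function is modeled as $f(x) = \sum_{j=1}^{\infty} \beta_j \psi_j(x)$ for an orthonormal basis $(\psi_j)$ of $L_2$, with a prior placed on the coefficients $(\beta_j)$ (possibly after truncation at a deterministic or random level), and contraction is then measured in the integrated $L_2$-distance $\|f - f_0\|_2$.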

Bayesian linear inverse problems in regularity scales

We obtain rates of contraction of posterior distributions in inverse problems defined by scales of smoothness classes. We derive abstract results for general priors, with contraction rates determined …

On the Convergence of Bayesian Regression Models

We consider heteroscedastic nonparametric regression models, when both the mean function and variance function are unknown and to be estimated with nonparametric approaches. We derive convergence …
...

References

Showing 1–10 of 45 references

Convergence rates of posterior distributions for non-i.i.d. observations

We consider the asymptotic behavior of posterior distributions and Bayes estimators based on observations which are required to be neither independent nor identically distributed. We give general …

Convergence rates of posterior distributions for Brownian semimartingale models

We consider the asymptotic behaviour of posterior distributions based on continuous observations from a Brownian semimartingale model. We present a general result that bounds the posterior rate of …

Posterior convergence rates of Dirichlet mixtures at smooth densities

We study the rates of convergence of the posterior distribution for Bayesian density estimation with Dirichlet mixtures of normal distributions as the prior. The true density is assumed to be twice …

Posterior consistency of Gaussian process prior for nonparametric binary regression

TLDR
If the covariance kernel has derivatives up to a desired order and the bandwidth parameter of the kernel is allowed to take arbitrarily small values, it is shown that the posterior distribution is consistent in the $L_1$-distance.

The Logistic Normal Distribution for Bayesian, Nonparametric, Predictive Densities

This article models the common density of an exchangeable sequence of observations by a generalization of the process derived from a logistic transform of a Gaussian process. The support of …

Posterior Consistency in Nonparametric Regression Problems under Gaussian Process Priors

TLDR
An extension of the theorem of Schwartz (1965) for nonidentically distributed observations is used, verifying its conditions when using Gaussian process priors for the regression function with normal or double exponential (Laplace) error distributions.

Towards a practicable Bayesian nonparametric density estimator

Nonparametric density estimators smooth the empirical distribution function and are sensitive to the choice of smoothing parameters. This paper develops a hierarchical Bayes formulation for …

Entropies and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities

We study the rates of convergence of the maximum likelihood estimator (MLE) and posterior distribution in density estimation problems, where the densities are location or location-scale mixtures of normal densities.

Misspecification in infinite-dimensional Bayesian statistics

We consider the asymptotic behavior of posterior distributions if the model is misspecified. Given a prior distribution and a random sample from a distribution $P_0$, which may not be in the support …