Adaptive Bayesian estimation using a Gaussian random field with inverse Gamma bandwidth

@article{Vaart2009AdaptiveBE,
  title={Adaptive Bayesian estimation using a Gaussian random field with inverse Gamma bandwidth},
  author={Aad W. van der Vaart and J. H. van Zanten},
  journal={Annals of Statistics},
  year={2009},
  volume={37},
  pages={2655--2675}
}
We consider nonparametric Bayesian inference using a rescaled smooth Gaussian field as a prior for a multidimensional function. The rescaling is achieved using a Gamma variable, and the procedure can be viewed as choosing an inverse Gamma bandwidth. The procedure is studied from a frequentist perspective in three statistical settings involving replicated observations (density estimation, regression and classification). We prove that the resulting posterior distribution shrinks to the… 
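As a rough illustration of the construction described in the abstract, the following minimal sketch (our own toy choices: a squared-exponential field, unit Gamma hyperparameters, a one-dimensional grid — none of these specifics come from the paper) draws one sample path of a smooth Gaussian field evaluated at rescaled inputs $At$, where $A$ is Gamma distributed, so that the bandwidth $1/A$ is inverse Gamma.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_prior_path(x, gamma_shape=1.0, gamma_rate=1.0):
    """Draw one sample path t -> W(A*t): a squared-exponential Gaussian
    field evaluated at inputs rescaled by a Gamma variable A (so the
    bandwidth 1/A is inverse Gamma). Hyperparameters are illustrative."""
    a = rng.gamma(gamma_shape, 1.0 / gamma_rate)   # rescaling variable A
    xs = a * x                                     # rescaled inputs A*t
    # squared-exponential covariance of the underlying smooth field W
    cov = np.exp(-0.5 * (xs[:, None] - xs[None, :]) ** 2)
    cov += 1e-9 * np.eye(len(x))                   # jitter for stability
    return rng.multivariate_normal(np.zeros(len(x)), cov)

grid = np.linspace(0.0, 1.0, 50)
path = sample_prior_path(grid)
```

Small values of $A$ stretch the inputs little and give very smooth paths; large values compress the correlation length and give wigglier paths, which is the adaptation mechanism the paper analyzes.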

Adaptive Bayesian density estimation using Pitman-Yor or normalized inverse-Gaussian process kernel mixtures

We consider Bayesian nonparametric density estimation using a Pitman-Yor or a normalized inverse-Gaussian process kernel mixture as the prior distribution for a density. The procedure is studied from

Optimal Bayesian estimation in random covariate design with a rescaled Gaussian process prior

TLDR
It is shown that an appropriate rescaling of the above Gaussian process leads to a rate-optimal posterior distribution even when the covariates are independently realized from a known density on a compact set.

MCMC-free adaptive Bayesian procedures using random series prior

TLDR
The random series prior can be viewed as an alternative to the commonly used Gaussian process prior, but it can be analyzed by relatively simpler techniques and in many cases allows a simpler approach to computation that avoids Markov chain Monte Carlo methods.

Adaptive Bayesian procedures using random series prior

TLDR
A prior for nonparametric Bayesian estimation is proposed which uses finite random series with a random number of terms. A general result on adaptive posterior convergence rates is derived for all smoothness levels of the function in the true model, by constructing an appropriate “sieve” and applying the general theory of posterior convergence rates.
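A minimal sketch of the random-series construction (the cosine basis, Poisson number of terms, and Gaussian coefficients are our illustrative choices, not necessarily those of the papers above): draw a random truncation level, then a random coefficient vector, and return the resulting finite series.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_series_prior(x, mean_terms=10.0, coef_scale=1.0):
    """Draw f = sum_{j<N} c_j * phi_j from a finite random series prior:
    N is random (here 1 + Poisson), c_j are i.i.d. Gaussian, and
    phi_j(x) = cos(pi*j*x) is an illustrative cosine basis."""
    n_terms = 1 + rng.poisson(mean_terms)          # random number of terms N
    coefs = rng.normal(0.0, coef_scale, size=n_terms)
    j = np.arange(n_terms)
    basis = np.cos(np.pi * np.outer(x, j))         # shape (len(x), N)
    return basis @ coefs                           # f evaluated on the grid

grid = np.linspace(0.0, 1.0, 100)
f = sample_series_prior(grid)
```

The random truncation level plays the role of a smoothing parameter: the posterior can concentrate on fewer or more terms depending on the smoothness of the truth, which is what drives the adaptation result.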

Adaptive Bayesian Density Estimation in $L^{p}$-metrics with Pitman-Yor or Normalized Inverse-Gaussian Process Kernel Mixtures

We consider Bayesian nonparametric density estimation using a Pitman-Yor or a normalized inverse-Gaussian process convolution kernel mixture as the prior distribution for a density. The procedure is

Bayesian Optimal Adaptive Estimation Using a Sieve Prior

TLDR
It is shown that the adaptive Bayesian approach is strongly suboptimal for the $l_2$ loss, and a lower bound on the rate is provided.

Adaptive Bayesian Procedures Using Random Series Priors

We consider a general class of prior distributions for nonparametric Bayesian estimation which uses finite random series with a random number of terms. A prior is constructed through distributions on

Adaptive Bayesian multivariate density estimation with Dirichlet mixtures (by Weining Shen)

We show that rate-adaptive multivariate density estimation can be performed using Bayesian methods based on Dirichlet mixtures of normal kernels with a prior distribution on the kernel’s covariance

Bayesian Regression with Nonparametric

This paper presents a large sample justification for a semiparametric Bayesian approach to inference in a linear regression model. The approach is to model the distribution of the error term by a

Anisotropic function estimation using multi-bandwidth Gaussian processes

TLDR
This work defines a Bayesian procedure that attains the minimax-optimal rate of posterior contraction, adapting to the unknown dimension and anisotropic smoothness of the true surface. The proposed approach is based on a Gaussian process prior with dimension-specific scalings, which are assigned carefully chosen hyperpriors.
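The multi-bandwidth idea can be sketched as follows (a toy construction of our own: a squared-exponential field on $[0,1]^2$ with one Gamma-distributed scaling per coordinate; the actual hyperpriors in the paper are chosen more carefully).

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_anisotropic_path(points, gamma_shape=1.0, gamma_rate=1.0):
    """Draw one sample of a squared-exponential Gaussian field with an
    independent Gamma-distributed rescaling per input dimension, so each
    coordinate gets its own effective bandwidth."""
    d = points.shape[1]
    a = rng.gamma(gamma_shape, 1.0 / gamma_rate, size=d)  # per-dimension scalings
    scaled = points * a                                   # coordinate-wise rescaling
    diff = scaled[:, None, :] - scaled[None, :, :]
    cov = np.exp(-0.5 * np.sum(diff ** 2, axis=-1))
    cov += 1e-9 * np.eye(len(points))                     # jitter for stability
    return rng.multivariate_normal(np.zeros(len(points)), cov)

pts = rng.uniform(size=(40, 2))
surface = sample_anisotropic_path(pts)
```

Giving each coordinate its own scaling lets the posterior smooth more in directions where the true surface is smoother, which is what makes anisotropic adaptation possible.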
...

References

Showing 1-10 of 58 references

On universal Bayesian adaptation

We consider estimating a probability density p based on a random sample from this density by a Bayesian approach. The prior is constructed in two steps, by first constructing priors on a collection

Towards a practicable Bayesian nonparametric density estimator

Nonparametric density estimators smooth the empirical distribution function and are sensitive to the choice of smoothing parameters. This paper develops a hierarchical Bayes formulation for

Nonparametric Bayesian model selection and averaging

TLDR
A general theorem is presented on the rate of contraction of the resulting posterior distribution as the sample size n grows. It gives conditions under which the rates depend in a complicated way on the priors, but also shows that the rate is fairly robust to the specification of the prior weights.

Posterior consistency of Gaussian process prior for nonparametric binary regression

TLDR
If the covariance kernel has derivatives up to a desired order and the bandwidth parameter of the kernel is allowed to take arbitrarily small values, it is shown that the posterior distribution is consistent in the $L_1$-distance.

Bayesian inference with rescaled Gaussian process priors

TLDR
This work exhibits rescaled Gaussian process priors yielding posteriors that contract around the true parameter at optimal convergence rates and establishes bounds on small deviation probabilities for smooth stationary Gaussian processes.

Rates of contraction of posterior distributions based on Gaussian process priors

TLDR
The rate of contraction of the posterior distribution based on sampling from a smooth density model when the prior models the log density as a (fractionally integrated) Brownian motion is shown to depend on the position of the true parameter relative to the reproducing kernel Hilbert space of the Gaussian process.

Posterior Consistency in Nonparametric Regression Problems under Gaussian Process Priors

TLDR
An extension of the theorem of Schwartz (1965) for nonidentically distributed observations is used, verifying its conditions when using Gaussian process priors for the regression function with normal or double exponential (Laplace) error distributions.

Convergence rates of posterior distributions for non-i.i.d. observations

We consider the asymptotic behavior of posterior distributions and Bayes estimators based on observations which are required to be neither independent nor identically distributed. We give general
...