Adaptive nonparametric Bayesian inference using location-scale mixture priors

@article{Jonge2010AdaptiveNB,
  title={Adaptive nonparametric Bayesian inference using location-scale mixture priors},
  author={Robert de Jonge and J. H. van Zanten},
  journal={Annals of Statistics},
  year={2010},
  volume={38},
  pages={3300--3320}
}
We study location-scale mixture priors for nonparametric statistical problems, including multivariate regression, density estimation and classification. We show that a rate-adaptive procedure can be obtained if the prior is properly constructed. In particular, we show that adaptation is achieved if a kernel mixture prior on a regression function is constructed using a Gaussian kernel, an inverse gamma bandwidth, and Gaussian mixing weights. 
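
The prior construction described above (Gaussian kernel, inverse-gamma bandwidth, Gaussian mixing weights) can be sketched as a single draw from such a prior. This is an illustrative sketch, not the paper's exact specification: the number of kernels, the grid of locations, and the inverse-gamma hyperparameters `a` and `b` are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_from_mixture_prior(grid, n_terms=25, a=1.0, b=1.0):
    """One random function from a kernel mixture prior with a Gaussian
    kernel, an inverse-gamma bandwidth and Gaussian mixing weights
    (illustrative sketch only)."""
    # inverse-gamma(a, b) bandwidth: reciprocal of a Gamma(a, rate=b) draw
    bandwidth = 1.0 / rng.gamma(shape=a, scale=1.0 / b)
    # kernel locations on an equispaced grid in [0, 1]
    locations = np.linspace(0.0, 1.0, n_terms)
    # i.i.d. standard Gaussian mixing weights
    weights = rng.standard_normal(n_terms)
    # f(x) = sum_j w_j * exp(-(x - mu_j)^2 / (2 h^2))
    diffs = grid[:, None] - locations[None, :]
    return (weights * np.exp(-diffs**2 / (2.0 * bandwidth**2))).sum(axis=1)

x = np.linspace(0.0, 1.0, 200)
f = draw_from_mixture_prior(x)  # one prior draw of the regression function
```

Repeated draws with fresh bandwidths show how the inverse-gamma hyperprior puts mass on both very smooth and very wiggly functions, which is the mechanism behind rate adaptation.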

Adaptive Bayesian Estimation via Block Prior

TLDR
A novel block prior is proposed for adaptive Bayesian estimation that puts sufficient prior mass near the true signal and automatically concentrates on its effective dimension.

Rate exact Bayesian adaptation with modified block priors

Adaptive Bayesian multivariate density estimation with Dirichlet mixtures (by Weining Shen)

We show that rate-adaptive multivariate density estimation can be performed using Bayesian methods based on Dirichlet mixtures of normal kernels with a prior distribution on the kernel's covariance

Adaptive Bayesian density estimation using Pitman-Yor or normalized inverse-Gaussian process kernel mixtures

We consider Bayesian nonparametric density estimation using a Pitman-Yor or a normalized inverse-Gaussian process kernel mixture as the prior distribution for a density. The procedure is studied from

Adaptive estimation of multivariate functions using conditionally Gaussian tensor-product spline priors

We investigate posterior contraction rates for priors on multivariate functions that are constructed using tensor-product B-spline expansions. We prove that using a hierarchical prior with an

Anisotropic function estimation using multi-bandwidth Gaussian processes

TLDR
This work defines a Bayesian procedure that attains the minimax-optimal rate of posterior contraction, adapting to the unknown dimension and anisotropic smoothness of the true surface. The proposed approach is based on a Gaussian process prior with dimension-specific scalings, which are assigned carefully chosen hyperpriors.
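
A multi-bandwidth squared-exponential kernel with one scaling per input dimension can be sketched as follows; the fixed `scales` vector stands in for draws from the carefully chosen hyperpriors, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(2)

def anisotropic_sq_exp_kernel(X, Y, scales):
    """Squared-exponential kernel with a separate bandwidth per input
    dimension (sketch; the hyperprior on `scales` is omitted)."""
    # per-dimension scaled squared distances, summed over dimensions
    d2 = ((X[:, None, :] - Y[None, :, :]) / scales) ** 2
    return np.exp(-0.5 * d2.sum(axis=-1))

# two input dimensions with very different length-scales
X = rng.uniform(size=(5, 2))
K = anisotropic_sq_exp_kernel(X, X, scales=np.array([0.1, 2.0]))
```

A Gaussian process with this covariance varies quickly in the first coordinate and slowly in the second, which is how dimension-specific scalings encode anisotropic smoothness.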

Adaptive Bayesian Density Estimation in $L^{p}$-metrics with Pitman-Yor or Normalized Inverse-Gaussian Process Kernel Mixtures

We consider Bayesian nonparametric density estimation using a Pitman-Yor or a normalized inverse-Gaussian process convolution kernel mixture as the prior distribution for a density. The procedure is

Wasserstein convergence in Bayesian deconvolution models

We study the well-known deconvolution problem of recovering a distribution function from independent replicates (signal) additively contaminated with random errors (noise), whose distribution is known.

A Bayesian nonparametric approach to the sparse regression problem based on mixtures emerged from the works of Abramovich

TLDR
A new Gibbs sampler for simulating the posterior is proposed and adaptive posterior rates of convergence related to the Gaussian mean regression problem are established.

References

Showing 1-10 of 33 references

Adaptive Bayesian estimation using a Gaussian random field with inverse Gamma bandwidth

TLDR
It is proved that the resulting posterior distribution shrinks to the distribution that generates the data at a speed which is minimax-optimal up to a logarithmic factor, whatever the regularity level of the data-generating distribution.

Convergence rates for posterior distributions and adaptive estimation

The goal of this paper is to provide theorems on convergence rates of posterior distributions that can be applied to obtain good convergence rates in the context of density estimation as well as

Posterior consistency of Dirichlet mixtures in density estimation

A Dirichlet mixture of normal densities is a useful choice for a prior distribution on densities in the problem of Bayesian density estimation. In recent years, efficient Markov chain Monte Carlo
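
The prior in question, a Dirichlet mixture of normal densities, can be sketched by drawing one random density from a truncated stick-breaking representation; the truncation level, standard-normal base measure and fixed component scale are simplifying assumptions, not the construction analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def draw_dp_mixture_density(grid, alpha=1.0, n_atoms=50, sigma=0.5):
    """One random density from a (truncated) Dirichlet process mixture
    of normals via stick-breaking -- an illustrative sketch."""
    # stick-breaking weights: v_k ~ Beta(1, alpha)
    v = rng.beta(1.0, alpha, size=n_atoms)
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    w /= w.sum()  # renormalise after truncation
    # atoms: component means drawn from a N(0, 1) base measure
    mu = rng.standard_normal(n_atoms)
    # normal component densities evaluated on the grid
    comps = np.exp(-((grid[:, None] - mu[None, :]) ** 2) / (2 * sigma**2))
    comps /= np.sqrt(2 * np.pi) * sigma
    return comps @ w  # mixture density

x = np.linspace(-6.0, 6.0, 600)
p = draw_dp_mixture_density(x)
```

Each draw is itself a density (nonnegative, integrating to one up to truncation error), which is what makes this a prior on densities rather than on generic functions.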

Nonparametric Bayesian model selection and averaging

TLDR
A general theorem is presented on the rate of contraction of the resulting posterior distribution as n → ∞, which gives conditions under which the rates depend in a complicated way on the priors, but also shows that the rate is fairly robust to the specification of the prior weights.

Convergence rates of posterior distributions for non-i.i.d. observations

We consider the asymptotic behavior of posterior distributions and Bayes estimators based on observations which are required to be neither independent nor identically distributed. We give general

Rates of contraction of posterior distributions based on Gaussian process priors

TLDR
The rate of contraction of the posterior distribution based on sampling from a smooth density model when the prior models the log density as a (fractionally integrated) Brownian motion is shown to depend on the position of the true parameter relative to the reproducing kernel Hilbert space of the Gaussian process.

On universal Bayesian adaptation

We consider estimating a probability density p based on a random sample from this density by a Bayesian approach. The prior is constructed in two steps, by first constructing priors on a collection
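
The two-step construction, a prior on a collection of models followed by a prior within the chosen model, might be sketched as follows; the geometric model-index prior and the cosine basis are purely illustrative choices, not the ones used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def draw_two_step_prior(grid, max_terms=20):
    """Sketch of a two-step prior: first draw a model index (number of
    basis terms), then draw coefficients within that model."""
    # step 1: prior on the model index (geometric, truncated at max_terms)
    k = min(int(rng.geometric(p=0.3)), max_terms)
    # step 2: within model k, i.i.d. Gaussian coefficients on a cosine basis
    coef = rng.standard_normal(k)
    basis = np.cos(np.pi * np.outer(grid, np.arange(k)))
    return basis @ coef

x = np.linspace(0.0, 1.0, 100)
f = draw_two_step_prior(x)  # one draw; the model index varies between draws
```

Because the model index is random, the resulting prior spreads mass across models of different complexity, which is what allows the posterior to adapt.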

Rates of convergence for the posterior distributions of mixtures of betas and adaptive nonparametric estimation of the density

In this work we investigate the asymptotic properties of nonparametric Bayesian mixtures of Betas for estimating a smooth density on [0,1]. We consider a parameterisation of Beta distributions in

Convergence rates of posterior distributions

We consider the asymptotic behavior of posterior distributions and Bayes estimators for infinite-dimensional statistical models. We give general results on the rate of convergence of the posterior