We consider the asymptotic behaviour of posterior distributions and Bayes estimators for infinite-dimensional statistical models. We give general results on the rate of convergence of the posterior measure. These are applied to several examples, including priors on finite sieves, log-spline models, Dirichlet processes and interval censoring.
We consider the problem of testing monotonicity of the regression function in a standard nonparametric regression model. We introduce test statistics that are functionals of a certain natural U-process. We study the limiting distribution of these test statistics through strong approximation methods and the extreme value theory for Gaussian processes. …
We study the rates of convergence of the maximum likelihood estimator (MLE) and posterior distribution in density estimation problems, where the densities are location or location-scale mixtures of normal distributions with the scale parameter lying between two positive numbers. The true density is also assumed to lie in this class with the true mixing …
We study consistency and asymptotic normality of posterior distributions of the natural parameter for an exponential family when the dimension of the parameter grows with the sample size. Under certain growth restrictions on the dimension, we show that the posterior distributions concentrate in neighbourhoods of the true parameter and can be approximated by …
Consider binary observations whose response probability is an unknown smooth function of a set of covariates. Suppose that a prior on the response probability function is induced by a Gaussian process mapped to the unit interval through a link function. In this paper we study consistency of the resulting posterior distribution. If the covariance kernel has …
We consider nonparametric Bayesian estimation of a probability density p based on a random sample of size n from this density using a hierarchical prior. The prior consists, for instance, of prior weights on the regularity of the unknown density combined with priors that are appropriate given that the density has this regularity. More generally, the …
We study the rates of convergence of the posterior distribution for Bayesian density estimation with Dirichlet mixtures of normal distributions as the prior. The true density is assumed to be twice continuously differentiable. The bandwidth is given a sequence of priors which is obtained by scaling a single prior by an appropriate order. In order to handle …
We consider the asymptotic behavior of posterior distributions and Bayes estimators based on observations which are required to be neither independent nor identically distributed. We give general results on the rate of convergence of the posterior measure relative to distances derived from a testing criterion. We then specialize our results to independent, …
Positivity of the prior probability of a Kullback-Leibler neighborhood around the true density, commonly known as the Kullback-Leibler property, plays a fundamental role in posterior consistency. A popular prior for Bayesian estimation is given by a Dirichlet mixture, where the kernels are chosen depending on the sample space and the class of densities to be …
We consider the problem of estimating a sparse precision matrix of a multivariate Gaussian distribution, including the case where the dimension p exceeds the sample size n. Gaussian graphical models provide an important tool in describing conditional independence through presence or absence of the edges in the underlying graph. A popular non-Bayesian method …