Bayesian fractional posteriors
Anirban Bhattacharya, Debdeep Pati, and Yun Yang. The Annals of Statistics.
We consider the fractional posterior distribution obtained by updating a prior distribution via Bayes' theorem with a fractional likelihood function, that is, the usual likelihood function raised to a fractional power. First, we analyze the contraction properties of the fractional posterior in a general misspecified framework. Our contraction results require only a prior mass condition on a certain Kullback-Leibler (KL) neighborhood of the true parameter (or the KL-divergence minimizer in the…
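As a toy illustration of the construction described above (a hypothetical Beta-Bernoulli example, not taken from the paper), the fractional power alpha simply discounts the observed counts in a conjugate update:

```python
# Minimal sketch of a fractional (tempered) posterior, assuming a
# Beta(a, b) prior on a Bernoulli success probability. Raising the
# likelihood to a power alpha in (0, 1] preserves conjugacy: the
# update scales the observed counts by alpha.

def fractional_posterior(a, b, successes, failures, alpha=1.0):
    """Return the Beta parameters of the alpha-fractional posterior."""
    return a + alpha * successes, b + alpha * failures

# Ordinary posterior (alpha = 1) vs. a tempered one (alpha = 0.5):
full = fractional_posterior(1, 1, 14, 6, alpha=1.0)   # Beta(15, 7)
half = fractional_posterior(1, 1, 14, 6, alpha=0.5)   # Beta(8, 4)
```

With alpha < 1 the data are effectively down-weighted, which is the mechanism behind the robustness and contraction results surveyed in these papers.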
Bayesian model selection consistency and oracle inequality with intractable marginal likelihood
In this article, we investigate large-sample properties of model selection procedures in a general Bayesian framework when a closed-form expression of the marginal likelihood function is not available…
MMD-Bayes: Robust Bayesian Estimation via Maximum Mean Discrepancy
A pseudo-likelihood based on the Maximum Mean Discrepancy (MMD), defined via an embedding of probability distributions into a reproducing kernel Hilbert space, is built, and the resulting MMD-Bayes posterior is shown to be consistent and robust to model misspecification.
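For reference, the squared MMD underlying this pseudo-likelihood can be estimated directly from two samples. A minimal sketch with a Gaussian kernel (the bandwidth choice and function names are illustrative, not from the paper):

```python
import numpy as np

def mmd2(x, y, bandwidth=1.0):
    """Biased empirical estimate of MMD^2 with a Gaussian (RBF) kernel.

    x, y: 1-D arrays of samples from the two distributions.
    """
    def gram(a, b):
        # Pairwise squared distances via broadcasting, then RBF kernel.
        d2 = (a[:, None] - b[None, :]) ** 2
        return np.exp(-d2 / (2.0 * bandwidth ** 2))

    return gram(x, x).mean() + gram(y, y).mean() - 2.0 * gram(x, y).mean()
```

The MMD-Bayes pseudo-posterior then reweights the prior by an exponential of the (negative, scaled) squared MMD between model samples and data, in place of the usual likelihood.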
Statistical Inference in Mean-Field Variational Bayes
We conduct a non-asymptotic analysis of mean-field variational inference for approximating posterior distributions in complex Bayesian models that may involve latent variables. We show that the…
Equivalence of Convergence Rates of Posterior Distributions and Bayes Estimators for Functions and Nonparametric Functionals
We study the posterior contraction rates of a Bayesian method with Gaussian process priors in nonparametric regression and its plug-in property for differential operators. For a general class of…
Probably approximate Bayesian computation: nonasymptotic convergence of ABC under misspecification
This paper develops theoretical bounds on the distance between the statistics used in ABC, shows that some versions of ABC are inherently robust to misspecification, and proposes a sequential Monte Carlo method to sample from the pseudo-posterior, improving upon state-of-the-art samplers.
Optimal Bayesian estimation of Gaussian mixtures with growing number of components
We study posterior concentration properties of Bayesian procedures for estimating finite Gaussian mixtures in which the number of components is unknown and allowed to grow with the sample size. Under…
A comparison of learning rate selection methods in generalized Bayesian inference
Generalized Bayes posterior distributions are formed by putting a fractional power on the likelihood before combining it with the prior via Bayes' formula. This fractional power, which is often viewed…
PAC-Bayes Bounds on Variational Tempered Posteriors for Markov Models
This paper presents a Probably Approximately Correct (PAC)-Bayesian analysis of variational Bayes (VB) approximations to tempered Bayesian posterior distributions, bounding the model risk of the VB approximation in terms of the mixing and ergodic properties of the Markov model.
On the Robustness to Misspecification of $\alpha$-Posteriors and Their Variational Approximations
A Bernstein-von Mises theorem is derived, showing convergence in total variation distance of α-posteriors and their variational approximations to limiting Gaussian distributions, which is used to evaluate the Kullback-Leibler divergence between true and reported posteriors.
Posterior contraction in sparse generalized linear models
We study posterior contraction rates in sparse high-dimensional generalized linear models using priors incorporating sparsity. A mixture of a point mass at zero and a continuous distribution is used…


We consider a nonparametric Bayesian model for conditional densities. The model is a finite mixture of normal distributions with covariate-dependent multinomial logit mixing probabilities. A prior…
Bayesian asymptotics with misspecified models
In this paper, we study the asymptotic properties of a sequence of posterior distributions based on an independent and identically distributed sample when the Bayesian model is misspecified. We…
On Posterior Concentration in Misspecified Models
We investigate the asymptotic behavior of Bayesian posterior distributions under independent and identically distributed ($i.i.d.$) misspecified models. More specifically, we study the concentration…
A general framework for updating belief distributions
It is argued that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case.
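A minimal sketch of that loss-based ("general Bayes") update on a discrete parameter grid, with posterior weights proportional to prior × exp(−η · loss). The squared-error loss, the learning rate eta, and all names here are illustrative assumptions; taking the loss to be the negative log-likelihood with η = 1 recovers the standard Bayesian update:

```python
import numpy as np

def gibbs_posterior(grid, prior, data, eta=1.0):
    """Loss-based belief update over a grid of candidate parameter values.

    grid:  candidate parameter values (1-D array)
    prior: prior weights on the grid (1-D array, same length)
    data:  observed sample (1-D array)
    eta:   learning rate scaling the loss
    """
    # Cumulative squared-error loss of each candidate (illustrative choice).
    loss = np.array([np.sum((data - theta) ** 2) for theta in grid])
    # Exponentiate the negative scaled loss; subtract the min for stability.
    w = prior * np.exp(-eta * (loss - loss.min()))
    return w / w.sum()
```

Candidates with small loss on the observed data receive most of the posterior mass, without any likelihood ever being specified.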
Robust Bayesian Inference via Coarsening
This work introduces a novel approach to Bayesian inference that improves robustness to small departures from the model: rather than conditioning on the event that the observed data are generated by the model, one conditions on the event that the model generates data close to the observed data, in a distributional sense.
Bayesian nonparametric multivariate convex regression
In many applications, such as economics, operations research and reinforcement learning, one often needs to estimate a multivariate regression function f subject to a convexity constraint. For…
Posterior convergence rates of Dirichlet mixtures at smooth densities
We study the rates of convergence of the posterior distribution for Bayesian density estimation with Dirichlet mixtures of normal distributions as the prior. The true density is assumed to be twice…
Optimal Bayesian posterior concentration rates with empirical priors
In high-dimensional Bayesian applications, choosing a prior such that the corresponding posterior distribution has optimal asymptotic concentration properties can be restrictive in the sense that the…
Misspecification in infinite-dimensional Bayesian statistics
We consider the asymptotic behavior of posterior distributions if the model is misspecified. Given a prior distribution and a random sample from a distribution P_0, which may not be in the support…
A PAC analysis of a Bayesian estimator
The paper uses these techniques to give the first PAC-style analysis of a Bayesian-inspired estimator of generalisation, the size of a ball that can be placed in the consistent region of parameter space; the resulting bounds are independent of the complexity of the function class, though they depend linearly on the dimensionality of the parameter space.