A general framework for updating belief distributions

@article{Bissiri2013AGF,
  title={A general framework for updating belief distributions},
  author={Pier Giovanni Bissiri and Chris C. Holmes and Stephen G. Walker},
  journal={Journal of the Royal Statistical Society. Series B, Statistical Methodology},
  year={2016},
  volume={78},
  pages={1103--1130}
}
We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data‐generating mechanism. For instance, when the object of interest is low dimensional, such… 
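The update at the heart of the framework replaces the log-likelihood with a scaled negative loss: the posterior belief is proportional to prior(theta) * exp(-w * sum_i loss(theta, x_i)), with Bayes' rule recovered when the loss is the negative log-likelihood and w = 1. A minimal numerical sketch of this update on a parameter grid, assuming illustrative Gaussian data, a squared-error loss, and a learning rate w (none of which are the paper's example):

import numpy as np

# General Bayesian update on a grid:
#   posterior(theta) ∝ prior(theta) * exp(-w * sum_i loss(theta, x_i))
# With loss = negative log-likelihood and w = 1 this is standard Bayes.

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=1.0, size=50)        # illustrative data

theta = np.linspace(-3.0, 5.0, 1001)               # parameter grid
dtheta = theta[1] - theta[0]
log_prior = -0.5 * theta**2 / 10.0                 # N(0, 10) prior, unnormalized

w = 1.0                                            # learning rate (loss scale)
loss = ((x[:, None] - theta[None, :]) ** 2).sum(axis=0)   # total squared-error loss

log_post = log_prior - w * loss
post = np.exp(log_post - log_post.max())
post /= post.sum() * dtheta                        # normalize on the grid

print("general-Bayes posterior mean:", (theta * post).sum() * dtheta)

The learning rate w controls how strongly the data move the prior; how to choose it is itself a recurring question in this literature.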

Bayesian inference using loss functions

A Bayesian non-parametric approach is developed that generalizes the Bayesian bootstrap and specifies a Dirichlet process model for the distribution of the observables; the resulting non-standard Bayesian updating procedures are shown to yield valid posterior distributions, in the sense of consistency and asymptotic normality, under model misspecification.
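One concrete way to operationalize this kind of non-parametric update is a weighted-loss bootstrap: draw Dirichlet weights over the observations and report the minimizer of the weighted loss as a posterior draw. A minimal sketch, assuming a squared-error loss so that the minimizer is a closed-form weighted mean (the data and loss are illustrative, not the paper's construction):

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=1.0, size=100)       # illustrative data

# Each draw: Dirichlet weights over the observations, then the minimizer of
# the weighted loss. For squared-error loss that is the weighted mean.
draws = np.array([
    np.dot(rng.dirichlet(np.ones(len(x))), x)
    for _ in range(2000)
])
print("posterior-style mean and sd:", draws.mean(), draws.std())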

Generalised Bayes Updates with $f$-divergences through Probabilistic Classifiers

This work considers the behavior of generalized belief updates for various specific choices under the $f$-divergence family and shows that for specific divergence functions such an approach can even improve on methods evaluating the correct model likelihood function analytically.
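The device behind such classifier-based updates is density-ratio estimation: a probabilistic classifier trained to separate observed from model-simulated data yields log-odds that estimate the log density ratio, from which an f-divergence can be approximated by Monte Carlo. A hedged sketch for the KL case using scikit-learn (the two Gaussian samples stand in for real and simulated data and are illustrative assumptions):

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
x_obs = rng.normal(0.5, 1.0, size=(500, 1))        # "observed" data
x_sim = rng.normal(0.0, 1.0, size=(500, 1))        # data simulated from a candidate model

# With balanced classes, the classifier's log-odds estimate log(p_obs / p_sim).
X = np.vstack([x_obs, x_sim])
y = np.concatenate([np.ones(len(x_obs)), np.zeros(len(x_sim))])
clf = LogisticRegression().fit(X, y)

log_ratio = clf.decision_function(x_obs)           # log-odds at the observed points
print("estimated KL(p_obs || p_sim):", log_ratio.mean())   # exact value here is 0.125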

Bayesian Model Calibration for Extrapolative Prediction via Gibbs Posteriors

Gibbs posteriors are introduced as an alternative Bayesian method for model calibration, updating the prior with a loss function that connects the data to the parameter; a bootstrap implementation for approximating coverage rates is also presented.

Robust Bayesian Inference via Coarsening

This work introduces a novel approach to Bayesian inference that improves robustness to small departures from the model: rather than conditioning on the event that the observed data are generated by the model, one conditions on the event that the model generates data close to the observed data, in a distributional sense.
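Operationally, conditioning on the model generating data close to the observed data can be sketched as rejection sampling with a distance between data sets, much like approximate Bayesian computation (the Gaussian model, the crude mean-based distance, and the tolerance below are illustrative assumptions; the paper itself focuses on relative-entropy neighborhoods):

import numpy as np

rng = np.random.default_rng(3)
x_obs = rng.normal(1.0, 1.0, size=200)             # observed data

accepted = []
for _ in range(20000):
    theta = rng.normal(0.0, 3.0)                   # draw from the prior
    x_rep = rng.normal(theta, 1.0, size=len(x_obs))
    # Keep theta only if the model's data are close to the observed data.
    if abs(x_rep.mean() - x_obs.mean()) < 0.1:
        accepted.append(theta)

print("accepted draws:", len(accepted), " posterior mean:", np.mean(accepted))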

Adaptive particle-based approximations of the Gibbs posterior for inverse problems

A sequential Monte Carlo approach is developed to approximate the Gibbs posterior using particles, together with a recently developed local reduced basis method for building an efficient surrogate loss function, which is used in the Gibbs update formula in place of the true loss.
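A minimal particle sketch of the tempering idea: push a particle population through targets proportional to prior(theta) * exp(-beta * loss(theta)) as beta runs from 0 to 1, alternating reweighting, resampling, and a Metropolis move (the squared-error loss, fixed schedule, and random-walk kernel are illustrative assumptions, standing in for the paper's surrogate loss and adaptive scheme):

import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(1.0, 1.0, size=50)                  # illustrative data

def loss(theta):
    return ((x[:, None] - theta[None, :]) ** 2).sum(axis=0)

def log_prior(theta):
    return -0.5 * theta**2 / 9.0                   # N(0, 3) prior, unnormalized

n = 2000
particles = rng.normal(0.0, 3.0, size=n)           # start from the prior
betas = np.linspace(0.0, 1.0, 21)                  # fixed tempering schedule

for b_prev, b_next in zip(betas[:-1], betas[1:]):
    # Reweight by the incremental tempered Gibbs factor.
    log_w = -(b_next - b_prev) * loss(particles)
    wts = np.exp(log_w - log_w.max())
    wts /= wts.sum()
    particles = particles[rng.choice(n, size=n, p=wts)]   # multinomial resampling
    # One random-walk Metropolis move per particle, invariant for the current target.
    prop = particles + rng.normal(0.0, 0.3, size=n)
    log_acc = (log_prior(prop) - b_next * loss(prop)) \
            - (log_prior(particles) - b_next * loss(particles))
    accept = np.log(rng.uniform(size=n)) < log_acc
    particles = np.where(accept, prop, particles)

print("Gibbs posterior mean estimate:", particles.mean())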

Bayesian fractional posteriors

We consider the fractional posterior distribution that is obtained by updating a prior distribution via Bayes theorem with a fractional likelihood function, a usual likelihood function raised to a fractional power…
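In the same grid notation as the first sketch above, a fractional posterior simply tempers the likelihood by a power alpha in (0, 1); smaller alpha discounts the likelihood and buys robustness at the cost of efficiency (the Gaussian model and the value of alpha are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(1.0, 1.0, size=50)

theta = np.linspace(-3.0, 5.0, 1001)
dtheta = theta[1] - theta[0]
log_prior = -0.5 * theta**2 / 10.0                 # N(0, 10) prior

alpha = 0.5                                        # fractional power on the likelihood
log_lik = -0.5 * ((x[:, None] - theta[None, :]) ** 2).sum(axis=0)

log_post = log_prior + alpha * log_lik             # ∝ prior * likelihood**alpha
post = np.exp(log_post - log_post.max())
post /= post.sum() * dtheta
print("fractional posterior mean:", (theta * post).sum() * dtheta)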

Scaling the Gibbs posterior credible regions

In the important quantile regression problem, it is shown numerically that the Gibbs posterior credible intervals, with scale selected by the GPS algorithm, are exactly calibrated and are more efficient than those obtained via other Bayesian-like methods.

A Robust Bayesian Exponentially Tilted Empirical Likelihood Method

A new Bayesian approach is proposed for analysing moment condition models in situations where the data may be contaminated by outliers; the proposed methodology is found to produce reliable posterior inference for the fundamental relationships embedded in the majority of the data, even when outliers are present.

Bayes Posterior Convergence for Loss Functions via Almost Additive Thermodynamic Formalism

In the case of direct observation and almost-additive loss functions, exponential convergence of the posterior measures to a limit measure is proved.

From robust tests to Bayes-like posterior distributions

In the Bayes paradigm and for a given loss function, we propose the construction of a new type of posterior distributions for estimating the law of an n-sample. The loss functions we have in mind are…
...

References

Showing 1–10 of 79 references

On Bayesian learning via loss functions

Bayesian asymptotics with misspecified models

In this paper, we study the asymptotic properties of a sequence of posterior distributions based on an independent and identically distributed sample and when the Bayesian model is misspecified. We…

Bayesian sandwich posteriors for pseudo-true parameters

A PAC analysis of a Bayesian estimator

The paper uses these techniques to give the first PAC-style analysis of a Bayesian-inspired estimator of generalisation, the size of a ball that can be placed in the consistent region of parameter space; the resulting bounds are independent of the complexity of the function class, though they depend linearly on the dimensionality of the parameter space.

The Selection of Prior Distributions by Formal Rules

Subjectivism has become the dominant philosophical foundation for Bayesian inference. Yet in practice, most Bayesian analyses are performed with so-called “noninformative” priors, that is,…

Interpreting statistical evidence by using imperfect models: robust adjusted likelihood functions

It is shown that in some important examples the Bayes posterior probability distribution based on the adjusted likelihood is robust, remaining correct asymptotically even when the model for the observable random variable does not include the true distribution.

Asymptotic behavior of Bayes estimates under possibly incorrect models

We prove that the posterior distribution in a possibly incorrect parametric model a.s. concentrates in a strong sense on the set of pseudo-true parameters determined by the true distribution. As a…

On Bayesian Learning from Bernoulli Observations

Power prior distributions for regression models

We propose a general class of prior distributions for arbitrary regression models. We discuss parametric and semiparametric models. The prior specification for the regression coefficients focuses on…

Gibbs posterior for variable selection in high-dimensional classification and data mining

A completely new direction will be considered here to study BVS with a Gibbs posterior originating in statistical mechanics, and a convenient Markov chain Monte Carlo algorithm is developed to implement BVS with the Gibbs posterior.
...