Focused Bayesian prediction

@article{LoaizaMaya2019FocusedBP,
  title={Focused Bayesian prediction},
  author={Rub\'en Loaiza-Maya and Gael M. Martin and David T. Frazier},
  journal={Journal of Applied Econometrics},
  year={2019}
}
We propose a new method for conducting Bayesian prediction that delivers accurate predictions without correctly specifying the unknown true data generating process. A prior is defined over a class of plausible predictive models. After observing data, we update the prior to a posterior over these models, via a criterion that captures a user-specified measure of predictive accuracy. Under regularity, this update yields posterior concentration onto the element of the predictive class that… 
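The update described in the abstract reweights a prior over candidate predictive models by a sample criterion built from a scoring rule. Below is a minimal sketch of that idea on a toy AR(1) predictive class, assuming a grid approximation, a unit learning rate, and the log score as the accuracy measure; the setup and names are illustrative, not the authors' implementation.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Data from a process the predictive class need not match (Student-t noise).
T = 500
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + rng.standard_t(5)

# Predictive class: one-step-ahead Gaussian predictives from an AR(1) with
# persistence rho and unit scale (kept fixed purely to keep the sketch short).
rho_grid = np.linspace(-0.95, 0.95, 191)

def avg_log_score(rho):
    # Average log score of the one-step-ahead predictives for y[1:].
    mu = rho * y[:-1]
    return norm.logpdf(y[1:], loc=mu, scale=1.0).mean()

# Criterion-based update: a uniform prior over the grid is reweighted by
# exp{n * average score}; as T grows, mass concentrates on the rho whose
# predictives score best under the chosen rule.
n = T - 1
log_post = np.array([n * avg_log_score(r) for r in rho_grid])
post = np.exp(log_post - log_post.max())   # stabilise before normalising
post /= post.sum()

print("posterior mode of rho:", rho_grid[post.argmax()])
print("posterior mean of rho:", float(post @ rho_grid))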

Generalized Bayesian Likelihood-Free Inference Using Scoring Rules Estimators

A framework for Bayesian likelihood-free inference based on generalized Bayesian inference with scoring rules (SRs) is proposed, and finite-sample posterior consistency and outlier robustness of the resulting posterior are proved for the kernel and energy scores.
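As background for this entry, scoring rules such as the energy score can be estimated directly from forecast draws, which is what makes them usable when the likelihood is intractable. The sketch below gives a Monte Carlo energy-score estimate, assuming NumPy and sample-based forecasts; the function and variable names are illustrative, not taken from the cited paper.

import numpy as np

def energy_score(forecast_draws, y_obs):
    # Monte Carlo estimate of ES(P, y) = E||X - y|| - 0.5 E||X - X'||,
    # with X, X' ~ P approximated by the rows of forecast_draws.
    x = np.atleast_2d(np.asarray(forecast_draws, dtype=float))  # (m, d) draws
    y = np.atleast_1d(np.asarray(y_obs, dtype=float))           # (d,) outcome
    term1 = np.linalg.norm(x - y, axis=1).mean()
    pairwise = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return term1 - 0.5 * pairwise.mean()

# Example: draws from a (possibly misspecified) predictive vs. a realisation.
rng = np.random.default_rng(1)
draws = rng.normal(0.0, 1.0, size=(1000, 2))
print(energy_score(draws, np.array([0.3, -0.1])))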

Optimal probabilistic forecasts for risk management

This paper explores the implications of producing forecast distributions that are optimized according to scoring rules that are relevant to financial risk management. We assess the predictive…

Bayesian Forecasting in the 21st Century: A Modern Review

The Bayesian statistical paradigm provides a principled and coherent approach to probabilistic forecasting. Uncertainty about all unknowns that characterize any forecasting problem – model, …

Loss-Based Variational Bayes Prediction

A new approach to Bayesian prediction is proposed that caters for models with a large number of parameters and is robust to model misspecification. Applications to both simulated and empirical data, using high-dimensional Bayesian neural network and autoregressive mixture models, demonstrate that the approach provides more accurate results than various alternatives, including misspecified likelihood-based prediction.

Computing Bayes: Bayesian Computation from 1763 to the 21st Century

This paper takes the reader on a chronological tour of Bayesian computation over the past two and a half centuries, placing all computational problems in a common framework and describing all computational methods with a common notation.

Reliable Bayesian Inference in Misspecified Models

This work provides a general solution to a fundamental open problem in Bayesian inference, namely the poor uncertainty quantification, from a frequentist standpoint, of Bayesian methods in misspecified models, by replacing the usual posterior with an intuitive approximate posterior.

Approximating Bayes in the 21st Century

The aim is to help new researchers in particular -- and more generally those interested in adopting a Bayesian approach to empirical work -- distinguish between different approximate techniques; understand the sense in which they are approximate; appreciate when and why particular methods are useful; and see the ways in which they can be combined.

References

SHOWING 1-10 OF 78 REFERENCES

A general framework for updating belief distributions

It is argued that a valid update of a prior belief distribution to a posterior can be made for parameters that are connected to observations through a loss function rather than the traditional likelihood function; the standard likelihood-based update is recovered as a special case.
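For reference, the loss-based update described in this entry is usually written (in notation assumed here, not copied from the paper) as

\[
  \pi_w(\theta \mid x_{1:n}) \;\propto\; \exp\!\Big\{-w \sum_{i=1}^{n} \ell(\theta, x_i)\Big\}\, \pi(\theta),
\]

with prior \(\pi(\theta)\), loss \(\ell\), and learning rate \(w > 0\); taking \(\ell(\theta, x) = -\log p(x \mid \theta)\) and \(w = 1\) recovers the usual likelihood-based posterior, the special case noted above.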

An MCMC Approach to Classical Estimation

Strictly Proper Scoring Rules, Prediction, and Estimation

The theory of proper scoring rules on general probability spaces is reviewed and developed, and the intuitively appealing interval score is proposed as a utility function in interval estimation that addresses width as well as coverage.
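For concreteness, the interval score mentioned here, for a central \((1-\alpha)\) prediction interval \([l, u]\) and realization \(x\), takes the standard form

\[
  S_{\alpha}^{\mathrm{int}}(l, u; x) \;=\; (u - l) \;+\; \frac{2}{\alpha}(l - x)\,\mathbf{1}\{x < l\} \;+\; \frac{2}{\alpha}(x - u)\,\mathbf{1}\{x > u\},
\]

so narrow intervals are rewarded and observations falling outside the interval are penalized in proportion to how far they miss.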

Gibbs posterior for variable selection in high-dimensional classification and data mining

A new direction is considered for studying Bayesian variable selection (BVS) with a Gibbs posterior originating in statistical mechanics, and a convenient Markov chain Monte Carlo algorithm is developed to implement BVS with the Gibbs posterior.

General Bayesian updating and the loss-likelihood bootstrap

In this paper, we revisit the weighted likelihood bootstrap and show that it is well-motivated for Bayesian inference under misspecified models. We extend the underlying idea to a wider family of…

Merging of Opinions with Increasing Information

Bayesian Nonparametric Calibration and Combination of Predictive Distributions

We introduce a Bayesian approach to predictive density calibration and combination that accounts for parameter uncertainty and model set incompleteness through the use of random calibration…

Approximate Bayesian forecasting

...