Machine Learning Econometrics: Bayesian Algorithms and Methods

Dimitris Korobilis and Davide Pettenuzzo. "Machine Learning Econometrics: Bayesian Algorithms and Methods." Machine Learning eJournal.
Bayesian inference in economics is primarily perceived as a methodology for cases where the data are short, that is, not informative enough to yield reliable econometric estimates of the quantities of interest. In such cases, prior beliefs, such as the experience of the decision-maker or results from economic theory, can be explicitly incorporated into the econometric estimation problem to enhance the desired solution. In contrast, in fields such as computing science and… 

Approximate Bayesian forecasting

Variational Inference: A Review for Statisticians

Variational inference (VI), a method from machine learning that approximates probability densities through optimization, is reviewed and a variant that uses stochastic optimization to scale up to massive data is derived.
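To illustrate the optimization view of VI, the sketch below fits a Gaussian variational family to the posterior of a normal mean by stochastic gradient ascent on the ELBO, using the reparameterization trick and minibatches of data; the conjugate model, learning rate, and batch sizes are illustrative choices, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y_i ~ N(mu_true, 1), prior mu ~ N(0, 1).
n, mu_true = 200, 2.0
y = rng.normal(mu_true, 1.0, size=n)

# Exact conjugate posterior for reference: N(sum(y)/(n+1), 1/(n+1)).
post_mean = y.sum() / (n + 1)
post_sd = (n + 1) ** -0.5

# Variational family q(mu) = N(m, s^2); maximize the ELBO by stochastic
# gradient ascent with the reparameterization trick (mu = m + s*eps) and
# minibatches of observations, as in stochastic VI.
m, log_s = 0.0, 0.0
lr, batch, n_mc = 0.005, 32, 64
trace_m, trace_s = [], []
for t in range(4000):
    s = np.exp(log_s)
    eps = rng.standard_normal(n_mc)
    mu = m + s * eps                      # n_mc draws from q
    yb = rng.choice(y, size=batch)        # minibatch of observations
    # d/dmu log p(y, mu), with the likelihood term rescaled by n/batch
    dlogp = (n / batch) * (yb.sum() - batch * mu) - mu
    g_m = dlogp.mean()
    g_log_s = (dlogp * eps * s).mean() + 1.0   # +1 from q's entropy term
    m += lr * g_m
    log_s += lr * g_log_s
    if t >= 3000:                         # average late iterates to reduce noise
        trace_m.append(m)
        trace_s.append(np.exp(log_s))

m_hat, s_hat = np.mean(trace_m), np.mean(trace_s)
print(m_hat, post_mean)   # variational mean vs. exact posterior mean
print(s_hat, post_sd)     # variational sd vs. exact posterior sd
```

Because the Gaussian family contains the exact posterior here, the optimizer should recover both moments; in non-conjugate models the same recipe yields the best Gaussian approximation instead.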

High-Dimensional Macroeconomic Forecasting Using Message Passing Algorithms

A generalized approximate message passing algorithm is derived that has low algorithmic complexity, is trivially parallelizable, and can be used to estimate time-varying parameter regressions with an arbitrarily large number of exogenous predictors.

Bayesian Inference in Econometric Models Using Monte Carlo Integration

Methods for the systematic application of Monte Carlo integration with importance sampling to Bayesian inference are developed, and conditions under which the numerical approximation converges almost surely are established.
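The core mechanics can be sketched in a few lines: draw from a tractable importance density, weight by the ratio of the posterior kernel to the proposal density, and self-normalize. The conjugate normal model and Student-t proposal below are illustrative choices, not Geweke's applications.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Toy model with a closed-form posterior to check against:
# y_i ~ N(mu, 1), prior mu ~ N(0, 1)  =>  posterior N(sum(y)/(n+1), 1/(n+1)).
n = 20
y = rng.normal(1.0, 1.0, size=n)
post_mean = y.sum() / (n + 1)

def log_post_kernel(mu):
    """Log prior + log likelihood, up to an additive constant."""
    return -0.5 * mu**2 - 0.5 * ((y[:, None] - mu[None, :]) ** 2).sum(axis=0)

def t_logpdf(x, df, loc, scale):
    """Log density of a shifted/scaled Student-t (stdlib math + numpy only)."""
    z = (x - loc) / scale
    return (math.lgamma((df + 1) / 2) - math.lgamma(df / 2)
            - 0.5 * math.log(df * math.pi) - math.log(scale)
            - (df + 1) / 2 * np.log1p(z**2 / df))

# Importance density: heavy-tailed t centred at the MLE, wider than the posterior
S, df, loc, scale = 100_000, 5, y.mean(), 3.0 / math.sqrt(n)
mu = rng.standard_t(df, size=S) * scale + loc

log_w = log_post_kernel(mu) - t_logpdf(mu, df, loc, scale)
w = np.exp(log_w - log_w.max())
w /= w.sum()                         # self-normalized importance weights
is_mean = (w * mu).sum()             # posterior mean by importance sampling
print(is_mean, post_mean)
```

Subtracting the maximum log weight before exponentiating keeps the computation numerically stable; a heavier-tailed, wider proposal than the posterior keeps the weights well behaved.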

Inference from Iterative Simulation Using Multiple Sequences

The focus is on applied inference for Bayesian posterior distributions in real problems, which often tend toward normality after transformations and marginalization, and the results are derived as normal-theory approximations to exact Bayesian inference, conditional on the observed simulations.
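The multiple-sequence diagnostic at the heart of this approach, the potential scale reduction factor (R-hat), is easy to sketch; the function below implements the basic Gelman-Rubin statistic, and the two synthetic chain sets are illustrative.

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor R-hat from m parallel chains.

    chains: array of shape (m, n) -- m chains, each of length n.
    """
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)          # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()    # within-chain variance
    var_plus = (n - 1) / n * W + B / n       # pooled variance estimate
    return np.sqrt(var_plus / W)

rng = np.random.default_rng(2)
# Four well-mixed chains targeting the same N(0, 1): R-hat close to 1.
mixed = rng.standard_normal((4, 1000))
# Four chains stuck around different locations: R-hat far above 1.
stuck = rng.standard_normal((4, 1000)) + np.array([0.0, 3.0, 6.0, 9.0])[:, None]
print(gelman_rubin(mixed))  # near 1
print(gelman_rubin(stuck))  # much larger than 1
```

Values near 1 indicate that between-chain and within-chain variability agree, i.e. the chains appear to have mixed over the same distribution.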

Big Learning with Bayesian Methods

A survey of recent advances in big learning with Bayesian methods, termed Big Bayesian Learning, is presented, covering nonparametric Bayesian methods for adaptively inferring model complexity, regularized Bayesian inference for improving flexibility via posterior regularization, and scalable algorithms and systems based on stochastic subsampling and distributed computing for large-scale applications.

On Markov chain Monte Carlo methods for tall data

An original subsampling-based approach is proposed which samples from a distribution provably close to the posterior distribution of interest, yet can require fewer than O(n) likelihood evaluations per iteration for certain statistical models in favourable scenarios.
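The basic object such methods build on is an unbiased subsampled estimate of the full-data log-likelihood. The toy sketch below (a normal location model, with sizes chosen for illustration) shows the O(batch) estimator; note that naively plugging a noisy estimate into a Metropolis-Hastings ratio biases the chain, which is precisely the problem the paper's guarantees address.

```python
import numpy as np

rng = np.random.default_rng(3)

# "Tall" data: many observations from a simple normal location model.
n = 100_000
y = rng.normal(0.5, 1.0, size=n)

def loglik_full(mu):
    """Exact log-likelihood (up to a constant): costs O(n) evaluations."""
    return (-0.5 * (y - mu) ** 2).sum()

def loglik_subsampled(mu, batch=1_000):
    """Unbiased estimate touching only `batch` of the n likelihood terms."""
    yb = rng.choice(y, size=batch)          # uniform subsample with replacement
    return (n / batch) * (-0.5 * (yb - mu) ** 2).sum()

full = loglik_full(0.5)
ests = np.array([loglik_subsampled(0.5) for _ in range(200)])
print(full, ests.mean())   # unbiased on average, at 1% of the per-call cost
```

Each subsampled call evaluates 1,000 rather than 100,000 likelihood terms; the challenge the paper tackles is keeping the resulting Markov chain provably close to the true posterior despite this noise.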

Adaptive Hierarchical Priors for High-Dimensional Vector Autoregressions

A simulation-free estimation algorithm for vector autoregressions (VARs) is developed that allows fast approximate calculation of marginal parameter posterior distributions, can be used for structural analysis, and successfully replicates important features of news-driven business cycles predicted by a large-scale theoretical model.

EMVS: The EM Approach to Bayesian Variable Selection

EMVS is proposed, a deterministic alternative to stochastic search based on an EM algorithm which exploits a conjugate mixture prior formulation to quickly find posterior modes in high-dimensional linear regression contexts.
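A stripped-down version of the EM idea can be sketched as follows, with the error variance fixed at 1 and the mixture weight theta held fixed (the paper treats these more generally); the spike/slab variances, the near-orthogonal design, and the data are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)

# Sparse linear regression: only the first 3 of 20 coefficients are nonzero.
n, p = 200, 20
Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
X = Q * np.sqrt(n)                       # orthogonal columns for a clean demo
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + rng.standard_normal(n)

def norm_logpdf(b, var):
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * b**2 / var

# Spike N(0, v0) / slab N(0, v1) mixture prior on each coefficient,
# with P(slab) = theta; sigma^2 = 1 and theta fixed for brevity.
v0, v1, theta = 0.1, 100.0, 0.1
beta = np.zeros(p)
for _ in range(50):
    # E-step: conditional inclusion probabilities given the current beta
    log_slab = norm_logpdf(beta, v1) + np.log(theta)
    log_spike = norm_logpdf(beta, v0) + np.log(1 - theta)
    incl = 1.0 / (1.0 + np.exp(log_spike - log_slab))
    # M-step: ridge-type solve with coefficient-specific shrinkage
    D = np.diag(incl / v1 + (1.0 - incl) / v0)
    beta = np.linalg.solve(X.T @ X + D, X.T @ y)

selected = np.flatnonzero(incl > 0.5)
print(selected)   # indices of coefficients assigned to the slab
```

Each iteration is a deterministic closed-form update, which is what makes the approach fast relative to stochastic-search alternatives; here it converges to a posterior mode that keeps exactly the three true signals in the slab.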

Patterns of Scalable Bayesian Inference

This paper seeks to identify unifying principles, patterns, and intuitions for scaling Bayesian inference by reviewing existing work on utilizing modern computing resources with both MCMC and variational approximation techniques.