Gibbs posterior convergence and the thermodynamic formalism

  Kevin McGoff, Sayan Mukherjee, and Andrew B. Nobel. The Annals of Applied Probability.
In this paper we consider a Bayesian framework for making inferences about dynamical systems from ergodic observations. The proposed Bayesian procedure is based on the Gibbs posterior, a decision theoretic generalization of standard Bayesian inference. We place a prior over a model class consisting of a parametrized family of Gibbs measures on a mixing shift of finite type. This model class generalizes (hidden) Markov chain models by allowing for long-range dependencies, including Markov chains…

A Large Deviation Approach to Posterior Consistency in Dynamical Systems

It is shown that the generalized posterior distribution concentrates asymptotically on those parameters that minimize the sum of the expected loss and a divergence term, thereby proving posterior consistency.

Bayes Posterior Convergence for Loss Functions via Almost Additive Thermodynamic Formalism

In the case of direct observation and almost-additive loss functions, exponential convergence of the a posteriori measures to a limit measure is proved.

62C12 Empirical decision procedures; empirical Bayes procedures
62E10 Characterization and structure theory of statistical distributions

Dynamical hypothesis tests and decision theory for Gibbs distributions

We consider the problem of testing between two Gibbs probabilities μ₀ and μ₁ defined for a dynamical system (Ω, T). Due to the fact that in general full orbits are not observable or computable, one…

Optimal Transport for Stationary Markov Chains via Policy Iteration

It is proved that solutions of the optimal transition coupling problem, in which the optimal transport problem is constrained to the set of stationary Markovian couplings satisfying a certain transition matrix condition, can be obtained via the policy iteration algorithm.
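The transition coupling problem above constrains a classical optimal transport problem. As a minimal sketch of that classical building block (not the paper's constrained policy-iteration algorithm), the following solves a single-step optimal transport problem between two made-up marginals by linear programming, with the illustrative cost |i − j|:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative marginals and cost; the paper's transition coupling problem
# additionally restricts the coupling to be stationary and Markovian,
# which this unconstrained sketch does not impose.
mu = np.array([0.5, 0.3, 0.2])
nu = np.array([0.2, 0.5, 0.3])
cost = np.abs(np.subtract.outer(np.arange(3), np.arange(3))).astype(float)

# Decision variables: coupling entries pi[i, j], flattened row-major.
A_eq = []
for i in range(3):                      # row sums must equal mu
    row = np.zeros(9); row[3 * i:3 * i + 3] = 1.0; A_eq.append(row)
for j in range(3):                      # column sums must equal nu
    col = np.zeros(9); col[j::3] = 1.0; A_eq.append(col)
b_eq = np.concatenate([mu, nu])

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
coupling = res.x.reshape(3, 3)          # an optimal transport plan
```

For the cost |i − j| on a one-dimensional state space, the optimal value equals the L1 distance between the cumulative distribution functions of the two marginals.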

Decision Theory and Large Deviations for Dynamical Hypotheses Tests: Neyman-Pearson, Min-Max and Bayesian Tests

We analyze hypothesis tests using classical results on large deviations to compare two models, each described by a different Hölder Gibbs probability measure. One main difference from the classical…

Estimation of Stationary Optimal Transport Plans

We study optimal transport for stationary stochastic processes taking values in finite spaces. In order to reflect the stationarity of the underlying processes, we restrict attention to stationary…

Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors

Bayesian analysis of data from the general linear mixed model is challenging because any nontrivial prior leads to an intractable posterior density. However, if a conditionally conjugate prior…

Adaptive particle-based approximations of the Gibbs posterior for inverse problems

A sequential Monte Carlo approach approximates the Gibbs posterior using particles, together with a recently developed local reduced basis method that builds an efficient surrogate loss function used in the Gibbs update formula in place of the true loss.
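A bare-bones version of the particle idea can be sketched as follows. This is an illustrative tempered sequential Monte Carlo scheme, not the paper's method: the surrogate/reduced-basis component is omitted, and the Gaussian data, squared-error loss, tempering schedule, and jitter scale are all assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(1.5, 1.0, size=40)

def loss(theta):
    # Per-particle cumulative squared-error loss, standing in for an
    # expensive loss that a surrogate would approximate in practice.
    return ((data[:, None] - theta[None, :]) ** 2).sum(axis=0)

# Particles drawn from a diffuse prior.
n = 2000
particles = rng.normal(0.0, 5.0, size=n)
weights = np.full(n, 1.0 / n)

# Temper the Gibbs factor exp(-omega * loss) along a schedule of omega
# increments, resampling whenever the effective sample size degenerates.
for omega_step in np.diff(np.linspace(0.0, 1.0, 11)):
    logw = np.log(weights) - omega_step * loss(particles)
    logw -= logw.max()                     # stabilize the exponentials
    weights = np.exp(logw)
    weights /= weights.sum()
    ess = 1.0 / (weights ** 2).sum()
    if ess < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles = particles[idx] + rng.normal(0, 0.05, size=n)  # jitter move
        weights = np.full(n, 1.0 / n)

posterior_mean = (weights * particles).sum()
```

With the loss above, the final particle cloud should sit near the empirical loss minimizer, i.e. the sample mean of the data.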

Posterior consistency for partially observed Markov models

Block Gibbs Sampling for Bayesian Random Effects Models With Improper Priors: Convergence and Regeneration

Bayesian versions of the classical one-way random effects model are widely used to analyze data. If the standard diffuse prior is adopted, there is a simple block Gibbs sampler that can be employed…
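To illustrate what a two-block Gibbs sampler under a standard diffuse prior looks like, here is a minimal sketch for a simpler model than the paper's one-way random effects setup: i.i.d. normal data with unknown mean and variance under the diffuse prior p(μ, σ²) ∝ 1/σ², for which the posterior is proper when n ≥ 2. The data and chain length are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(3.0, 2.0, size=100)
n, ybar = len(y), y.mean()

mu, sigma2 = 0.0, 1.0
draws = []
for t in range(5000):
    # Block 1: mu | sigma^2, y  ~  N(ybar, sigma^2 / n)
    mu = rng.normal(ybar, np.sqrt(sigma2 / n))
    # Block 2: sigma^2 | mu, y  ~  Inv-Gamma(n/2, sum((y - mu)^2) / 2),
    # sampled as (rate) / Gamma(shape) with unit scale.
    sigma2 = ((y - mu) ** 2).sum() / 2.0 / rng.gamma(n / 2.0)
    if t >= 1000:                      # discard burn-in
        draws.append((mu, sigma2))

draws = np.array(draws)
mu_hat, sigma2_hat = draws.mean(axis=0)
```

Because each full conditional is sampled exactly, the chain mixes quickly here; the posterior means recover the sample mean and something close to the sample variance.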

Convergence properties of Gibbs samplers for Bayesian probit regression with proper priors

The Bayesian probit regression model (Albert and Chib (1993)) is popular and widely used for binary regression. While the improper flat prior for the regression coefficients is an appropriate choice…

Dynamics of Bayesian Updating with Dependent Data and Misspecified Models

This work establishes sufficient conditions for posterior convergence when all hypotheses are wrong and the data have complex dependencies, derives a kind of large deviations principle for the posterior measure (extending in some cases to rates of convergence), and discusses the advantages of predicting with a combination of models known to be wrong.

Robust Bayesian Inference via Coarsening

This work introduces a novel approach to Bayesian inference that improves robustness to small departures from the model: rather than conditioning on the event that the observed data are generated by the model, one conditions on the event that the model generates data close to the observed data, in a distributional sense.
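Under a relative-entropy notion of "close", this coarsened posterior is approximately a power posterior in which the likelihood is raised to ζ = α/(α + n). The sketch below compares the standard and power posteriors on a parameter grid for a normal location model with mildly contaminated data; the grid, data, and the value α = 20 are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
# Data mildly contaminated relative to the assumed N(theta, 1) model.
data = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])
n = len(data)

theta = np.linspace(-2.0, 4.0, 601)
loglik = norm.logpdf(data[:, None], loc=theta[None, :]).sum(axis=0)

def grid_posterior(zeta):
    # Power posterior: likelihood raised to zeta, flat prior on the grid.
    lp = zeta * loglik
    lp -= lp.max()
    w = np.exp(lp)
    return w / w.sum()

standard = grid_posterior(1.0)
# Coarsened posterior approximated by a power posterior with
# zeta = alpha / (alpha + n); alpha tunes the allowed model mismatch.
alpha = 20.0
coarsened = grid_posterior(alpha / (alpha + n))

mean_std = (standard * theta).sum()
mean_coarse = (coarsened * theta).sum()
sd_std = np.sqrt((standard * theta ** 2).sum() - mean_std ** 2)
sd_coarse = np.sqrt((coarsened * theta ** 2).sum() - mean_coarse ** 2)
```

For a pure location model the power only inflates posterior spread, so the visible effect of coarsening here is honestly wider credible intervals rather than a shifted estimate.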

A general framework for updating belief distributions

It is argued that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, with the latter recovered as a special case.
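The loss-based update just described takes the Gibbs-posterior form π_n(dθ) ∝ exp(−ω ℓ_n(θ)) π(dθ), where ℓ_n is the cumulative loss and ω a learning rate. A minimal numerical sketch on a discretized parameter space, assuming a squared-error loss, ω = 1, and made-up Gaussian data:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=50)

# Discretized parameter space with a flat prior over it.
theta = np.linspace(-5.0, 5.0, 201)
prior = np.full_like(theta, 1.0 / len(theta))

# Cumulative loss l_n(theta) at each grid point (squared error here;
# with negative log-likelihood loss the usual posterior is recovered).
loss = ((data[:, None] - theta[None, :]) ** 2).sum(axis=0)

# Gibbs posterior: prior reweighted by exp(-omega * loss), normalized.
omega = 1.0
log_post = np.log(prior) - omega * loss
log_post -= log_post.max()            # stabilize before exponentiating
posterior = np.exp(log_post)
posterior /= posterior.sum()

# Mass concentrates near the empirical loss minimizer (the sample mean).
estimate = theta[np.argmax(posterior)]
```

Taking the loss to be the negative log-likelihood with ω = 1 reproduces standard Bayesian updating, which is the special case mentioned in the abstract.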

Exact sampling for intractable probability distributions via a Bernoulli factory

This work provides a significant reduction to the number of input draws necessary for the Bernoulli factory, which enables exact sampling via a rejection sampling approach.
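A Bernoulli factory turns flips of a coin with unknown bias p into flips of a coin with bias f(p), using no other access to p. The sketch below shows two textbook factories (f(p) = p² and von Neumann's f(p) = 1/2), not the paper's construction, purely to fix the idea; the bias p = 0.3 is an illustrative assumption hidden from the factories.

```python
import random

random.seed(3)
p = 0.3

def coin():
    # The only access to the unknown bias p.
    return random.random() < p

def squared_coin():
    # Factory for f(p) = p^2: heads iff two independent p-flips are heads.
    return coin() and coin()

def fair_coin():
    # von Neumann's factory for f(p) = 1/2: flip in pairs until the two
    # flips differ, then report the first flip of that pair.
    while True:
        a, b = coin(), coin()
        if a != b:
            return a

n = 100_000
sq = sum(squared_coin() for _ in range(n)) / n
fair = sum(fair_coin() for _ in range(n)) / n
```

The number of input flips each output flip consumes is the quantity the paper's construction reduces, which is what makes the rejection-sampling approach practical.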

Variational Bayes and a problem of reliable communication: II. Infinite systems

We consider a family of estimation problems not admitting conventional analysis because of singularity and measurability issues. We define posterior distributions for the family by a variational…