Bayesian analysis of mixture autoregressive models covering the complete parameter space

  Davide Ravagli and Georgi N. Boshnakov · Computational Statistics
Mixture autoregressive (MAR) models provide a flexible way to model time series with predictive distributions which depend on the recent history of the process and are able to accommodate asymmetry and multimodality. Bayesian inference for such models offers the additional advantage of incorporating the uncertainty in the estimated models into the predictions. We introduce a new way of sampling from the posterior distribution of the parameters of MAR models which allows for covering the… 
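The mechanics described above — a predictive distribution that is a mixture of Gaussian components, each with its own autoregression on the recent history — can be illustrated with a minimal simulation sketch. All parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

# Minimal sketch: simulate a two-component Gaussian mixture autoregressive
# process with one lag per component (a MAR(2; 1, 1) in common notation).
# All numeric values are illustrative assumptions, not from the paper.

rng = np.random.default_rng(0)

weights = np.array([0.6, 0.4])      # mixing probabilities pi_k
intercepts = np.array([0.0, 1.0])   # component intercepts
ar_coefs = np.array([0.5, -0.4])    # component AR(1) coefficients
sigmas = np.array([1.0, 2.0])       # component noise scales

n = 500
y = np.zeros(n)
for t in range(1, n):
    k = rng.choice(2, p=weights)            # draw the active component
    mean = intercepts[k] + ar_coefs[k] * y[t - 1]
    y[t] = mean + sigmas[k] * rng.normal()  # Gaussian innovation
```

Because each one-step predictive distribution is itself a two-component Gaussian mixture, it can be asymmetric or multimodal, which is the flexibility the abstract refers to.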
Autoregressive density modeling with the Gaussian process mixture transition distribution
The proposed model provides a simple, yet flexible framework that preserves useful and distinguishing characteristics of the MTD model class, and demonstrates approximation of lag-dependent transition densities and model selection with two simulated and two real time series.
A signed power transformation with application to white noise testing
We show that signed power transforms of some ARCH-type processes give ARCH-type processes. The class of ARCH-type models for which this property holds contains many common ARCH and GARCH models.
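A signed power transform keeps the sign of the observation and raises its magnitude to a power; a minimal sketch (the function name and signature are illustrative, not from the paper):

```python
import numpy as np

# Sketch of a signed power transform: preserve the sign of x and raise
# its magnitude to the power lam. Name and API are illustrative.
def signed_power(x, lam):
    x = np.asarray(x, dtype=float)
    return np.sign(x) * np.abs(x) ** lam
```

For `lam = 1` the transform is the identity, and negative values remain negative for every power, unlike a plain power transform.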

Bayesian mixture of autoregressive models
In this thesis we consider some finite mixture time series models in which each component follows a well-known process, e.g. an AR, ARMA or ARMA-GARCH process, with either normal-type errors or …
Bayesian Mixtures of Autoregressive Models
A class of time-domain models for analyzing possibly nonstationary time series, formed as a mixture of time series models whose mixing weights are a function of time.
On a mixture autoregressive model
The Gaussian mixture transition distribution model is generalized to the mixture autoregressive (MAR) model for the modelling of non-linear time series and appears to capture features of the data better than other competing models do.
Bayesian analysis of mixture of autoregressive components with an application to financial market volatility
In this paper, we present a fully Bayesian analysis of a finite mixture of autoregressive components. Neither the number of mixture components nor the autoregressive order of each component has to …
On Bayesian Analysis of Mixtures with an Unknown Number of Components (with discussion)
New methodology for fully Bayesian mixture analysis is developed, making use of reversible jump Markov chain Monte Carlo methods that are capable of jumping between the parameter subspaces
Estimation of Finite Mixture Distributions Through Bayesian Sampling
A formal Bayesian analysis of a mixture model usually leads to intractable calculations, since the posterior distribution takes into account all the partitions of the sample. We present …
Marginal Likelihood From the Metropolis–Hastings Output
The proposed method is developed in the context of MCMC chains produced by the Metropolis–Hastings algorithm, whose building blocks are used both for sampling and marginal likelihood estimation, thus economizing on prerun tuning effort and programming.
Reversible jump Markov chain Monte Carlo computation and Bayesian model determination
Markov chain Monte Carlo methods for Bayesian computation have until recently been restricted to problems where the joint distribution of all variables has a density with respect to some fixed …
Marginal Likelihood from the Gibbs Output
  • S. Chib
  • 1995
This work exploits the fact that the marginal density can be expressed as the prior times the likelihood function over the posterior density, so that Bayes factors for model comparisons can be routinely computed as a by-product of the simulation.
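The identity the abstract refers to can be written explicitly. Evaluating at any fixed point \(\theta^{*}\) (typically a high-posterior-density point), the marginal likelihood satisfies

```latex
\log m(y) \;=\; \log f(y \mid \theta^{*}) \;+\; \log \pi(\theta^{*}) \;-\; \log \pi(\theta^{*} \mid y),
```

where the only non-trivial term, the posterior ordinate \(\pi(\theta^{*} \mid y)\), is estimated from the Gibbs output; Bayes factors then follow as ratios of the resulting marginal likelihoods.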