Gaussian Process-Mixture Conditional Heteroscedasticity

Emmanouil Antonios Platanios and Sotirios P. Chatzis
IEEE Transactions on Pattern Analysis and Machine Intelligence
Generalized autoregressive conditional heteroscedasticity (GARCH) models have long been considered one of the most successful families of approaches to volatility modeling in financial return series. In this paper, we propose an alternative approach based on methodologies widely used in the field of statistical machine learning. Specifically, we propose a novel nonparametric Bayesian mixture of Gaussian process regression models, each component of which models the noise variance process…
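For reference on the volatility-modeling setting described above, the following minimal sketch (not from the paper; the function name and parameter values are illustrative) simulates the classical GARCH(1,1) baseline that GP-mixture approaches are compared against:

```python
# Illustrative GARCH(1,1) simulation -- the classical volatility baseline.
# sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
import numpy as np

def simulate_garch11(omega, alpha, beta, n, seed=0):
    """Simulate n returns r[t] with GARCH(1,1) conditional variances sigma2[t]."""
    rng = np.random.default_rng(seed)
    sigma2 = np.empty(n)
    r = np.empty(n)
    sigma2[0] = omega / (1.0 - alpha - beta)  # unconditional variance
    r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, sigma2

returns, variances = simulate_garch11(omega=0.05, alpha=0.1, beta=0.85, n=1000)
```

The constraint `alpha + beta < 1` guarantees a finite unconditional variance `omega / (1 - alpha - beta)`, which is used here to initialize the recursion.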

Recurrent latent variable conditional heteroscedasticity

  • S. Chatzis
  • Computer Science
    2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2017
This paper introduces a recurrent latent variable model capable of capturing highly flexible functional relationships for the variances, and derives a fast, scalable Bayesian inference algorithm that is robust to overfitting by relying on amortized variational inference.

Generalized Multi-Output Gaussian Process Censored Regression

Simultaneous clustering and feature selection via nonparametric Pitman–Yor process mixture models

An efficient algorithm for model inference is developed, based on the collapsed variational Bayes framework with a zeroth-order Taylor approximation, and the merits and efficacy of the proposed nonparametric Bayesian model are demonstrated via challenging applications that concern real-world data clustering and 3D object recognition.

A Variable Order Hidden Markov Model with Dependence Jumps

This paper proposes a novel HMM formulation that treats temporal dependencies as latent variables over which inference is performed; it allows for increased modeling and predictive performance compared to alternative methods, while offering a good trade-off between the resulting gains in predictive performance and the computational complexity.

Indian Buffet Process Deep Generative Models for Semi-Supervised Classification

  • S. Chatzis
  • Computer Science
    2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2018
This paper proposes a novel DGM formulation based on the imposition of an IBP prior, devises an efficient black-box variational inference algorithm for this model, and exhibits its efficacy in a number of semi-supervised classification experiments.

Flexible weighted Dirichlet process mixture modelling and evaluation to address the problem of forecasting return distribution

A flexible semiparametric Bayesian framework is introduced to address the problem of forecasting volatility in time series data, based on the weighted Dirichlet process mixture (WDPM).

The Hard-Cut EM Algorithm for Mixture of Sparse Gaussian Processes

This paper refines the MSGP model and develops the hard-cut EM algorithm for MSGP from its original version for MGP, and shows that, with the sparsification technique, parameter learning of the proposed MSGP model is much more efficient than that of the MGP model.



Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation

Traditional econometric models assume a constant one-period forecast variance. To generalize this implausible assumption, a new class of stochastic processes called autoregressive conditional heteroscedastic (ARCH) processes is introduced…
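Engle's key idea can be illustrated with a short sketch (the helper function is hypothetical, not from the paper): under an ARCH(1) model the one-step forecast variance is driven by the previous squared error rather than held constant:

```python
# Illustrative ARCH(1) sketch: sigma2[t] = omega + alpha * eps[t-1]**2,
# so the forecast variance responds to the previous squared residual.
import numpy as np

def arch1_conditional_variances(eps, omega, alpha):
    """Conditional variances implied by an ARCH(1) model for residuals eps."""
    sigma2 = np.empty(len(eps))
    sigma2[0] = omega / (1.0 - alpha)  # initialize at the unconditional variance
    sigma2[1:] = omega + alpha * eps[:-1] ** 2
    return sigma2

eps = np.array([0.1, -0.8, 0.2, 1.5, -0.3])
sigma2 = arch1_conditional_variances(eps, omega=0.2, alpha=0.5)
```

Large residuals (e.g. the 1.5 above) immediately raise the next period's forecast variance, which is exactly the volatility-clustering behavior the model captures.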

Multivariate Mixed Normal Conditional Heteroskedasticity

A new multivariate volatility model where the conditional distribution of a vector time series is given by a mixture of multivariate normal distributions is proposed and some theoretical properties of the model such as the unconditional covariance matrix and autocorrelations of squared returns are derived.

Copula Processes

A stochastic volatility model, Gaussian Copula Process Volatility (GCPV), is developed, which can outperform GARCH on simulated and financial data, and incorporate covariates other than time, and model a rich class of covariance structures.

Normal Mixture Garch(1,1): Applications to Exchange Rate Modelling

Some recent specifications for GARCH error processes explicitly assume a conditional variance that is generated by a mixture of normal components, albeit with some parameter restrictions…

Signal Modeling and Classification Using a Robust Latent Space Model Based on $t$ Distributions

A Bayesian approach to factor analysis modeling based on Student's-t distributions is developed, which provides an efficient and more robust alternative to EM-based methods, resolving their singularity and overfitting proneness problems, while allowing for the automatic determination of the optimal model size.

Variational Heteroscedastic Gaussian Process Regression

This work presents a non-standard variational approximation that allows accurate inference in heteroscedastic GPs (i.e., under input-dependent noise conditions) and its effectiveness is illustrated on several synthetic and real datasets of diverse characteristics.
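For intuition, here is a minimal sketch of the heteroscedastic-GP idea, assuming the input-dependent noise variance is already known (the cited work instead infers it variationally; all function names are illustrative):

```python
# Minimal heteroscedastic-GP sketch: exact GP posterior mean with an
# input-dependent noise variance on the kernel diagonal.
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel matrix between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def hetero_gp_mean(x_train, y_train, x_test, noise_var):
    """GP posterior mean with a per-point noise variance vector noise_var."""
    K = rbf(x_train, x_train) + np.diag(noise_var)
    return rbf(x_test, x_train) @ np.linalg.solve(K, y_train)

x = np.linspace(0.0, 5.0, 20)
y = np.sin(x)
noise = 0.01 + 0.2 * x          # noise variance grows with the input
mu = hetero_gp_mean(x, y, x, noise)
```

The only change from standard GP regression is that the diagonal term varies with the input, so the posterior trusts observations less in high-noise regions.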

Asymmetric multivariate normal mixture GARCH

Variational inference for Dirichlet process mixtures

A variational inference algorithm for DP mixtures is presented, along with experiments comparing the algorithm to Gibbs sampling for DP mixtures of Gaussians and an application to a large-scale image analysis problem.
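The truncated stick-breaking representation that underlies such variational DP-mixture algorithms can be sketched as follows (illustrative names; the truncation level `T` mirrors the truncated variational family):

```python
# Stick-breaking sketch of DP mixture weights:
# pi_k = v_k * prod_{j<k}(1 - v_j), with v_k ~ Beta(1, alpha),
# truncated at T components.
import numpy as np

def stick_breaking_weights(alpha, T, seed=0):
    """Draw a truncated sample of DP mixture weights via stick breaking."""
    rng = np.random.default_rng(seed)
    v = rng.beta(1.0, alpha, size=T)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

pi = stick_breaking_weights(alpha=2.0, T=50)
```

A larger concentration `alpha` spreads mass over more components, while the leftover mass `1 - pi.sum()` shrinks geometrically with `T`.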

An Alternative Infinite Mixture Of Gaussian Process Experts

An infinite mixture model in which each component comprises a multivariate Gaussian distribution over an input space, and a Gaussian Process model over an output space, which leads to a more powerful and consistent Bayesian specification of the effective 'gating network' for the different experts.

Generalized autoregressive conditional heteroskedasticity