Gaussian Process-Mixture Conditional Heteroscedasticity

Emmanouil Antonios Platanios and S. Chatzis, "Gaussian Process-Mixture Conditional Heteroscedasticity," IEEE Transactions on Pattern Analysis and Machine Intelligence.
Generalized autoregressive conditional heteroscedasticity (GARCH) models have long been considered one of the most successful families of approaches for volatility modeling in financial return series. In this paper, we propose an alternative approach based on methodologies widely used in the field of statistical machine learning. Specifically, we propose a novel nonparametric Bayesian mixture of Gaussian process regression models, each component of which models the noise variance process…
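The core ingredient of the proposed model, a Gaussian process regressing an input-dependent (log) noise variance, can be sketched in a few lines of NumPy. This is a deliberate simplification, not the paper's method: it fits a single GP to the log squared returns of a simulated series (the full model mixes several such components under a nonparametric prior), and the kernel length-scale, noise level, and data are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy return series whose volatility varies smoothly with time.
t = np.linspace(0.0, 1.0, 200)
true_sigma = 0.5 + 0.4 * np.sin(2 * np.pi * t)
returns = rng.normal(0.0, true_sigma)

def rbf(a, b, ell=0.1):
    # Squared-exponential kernel on 1-D inputs.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

# GP regression of log squared returns on time; exponentiating half
# the posterior mean yields a smooth, input-dependent volatility proxy.
y = np.log(returns ** 2 + 1e-8)
K = rbf(t, t) + 1.0 * np.eye(len(t))        # kernel + observation noise
alpha = np.linalg.solve(K, y - y.mean())
log_var_hat = y.mean() + rbf(t, t) @ alpha  # posterior mean at the inputs
sigma_hat = np.exp(0.5 * log_var_hat)
```

Note that exponentiating the posterior mean introduces a bias (E[log ε²] ≠ log E[ε²]), which a full treatment would correct; the sketch only shows the regression structure.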
Recurrent latent variable conditional heteroscedasticity
  • S. Chatzis
  • Mathematics, Computer Science
  • 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2017
This paper introduces a recurrent latent variable model capable of capturing highly flexible functional relationships for the variances, and derives a fast, scalable Bayesian inference algorithm that is robust to overfitting by relying on amortized variational inference.
A hidden Markov model with dependence jumps for predictive modeling of multidimensional time-series
A novel HMM formulation treats temporal dependencies as latent variables over which inference is performed, allowing for effective modeling of non-homogeneous observed data, where the patterns of the entailed temporal dynamics may change over time.
Simultaneous clustering and feature selection via nonparametric Pitman–Yor process mixture models
An efficient algorithm for model inference is developed, based on the collapsed variational Bayes framework with 0th-order Taylor approximation, and the merits and efficacy of the proposed nonparametric Bayesian model are demonstrated via challenging applications concerning real-world data clustering and 3D object recognition.
A Variable Order Hidden Markov Model with Dependence Jumps
This paper proposes a novel HMM formulation, treating temporal dependencies as latent variables over which inference is performed, which allows for increased modeling and predictive performance compared to alternative methods, while offering a good trade-off between the resulting gains in predictive performance and computational complexity.
Indian Buffet Process Deep Generative Models for Semi-Supervised Classification
  • S. Chatzis
  • Computer Science
  • 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2018
This paper proposes a novel DGM formulation based on the imposition of an IBP prior, devises an efficient black-box variational inference algorithm for the model, and exhibits its efficacy in a number of semi-supervised classification experiments.
Flexible weighted dirichlet process mixture modelling and evaluation to address the problem of forecasting return distribution
Forecasting volatility has been widely addressed in the fields of finance, environmetrics, and other areas involving massive time series. The important part of addressing this problem is how…
Enhanced Gaussian process mixture model for short-term electric load forecasting
This research used a hard-cut iterative training algorithm to improve a Gaussian process mixture (GPM) model. Our enhanced GPM (EGPM) concisely estimates distribution parameters to the…
Deep learning with t-exponential Bayesian kitchen sinks
This paper examines novel deep network architectures, where each layer comprises a bank of arbitrary nonlinearities, linearly combined using multiple alternative sets of weights, and adopts the t-exponential family of distributions since it can more flexibly accommodate real-world data.
The Hard-Cut EM Algorithm for Mixture of Sparse Gaussian Processes
This paper refines the MSGP model, develops the hard-cut EM algorithm for MSGP from its original version for MGP, and shows that, with the sparse technique, parameter learning of the proposed MSGP model is much more efficient than that of the MGP model.
Remaining Useful Life Prediction for Lithium-Ion Batteries Based on Gaussian Processes Mixture
A novel RUL prediction method based on the Gaussian process mixture that can handle multimodality by fitting different segments of trajectories with separate GPR models, such that the tiny differences among these segments can be revealed.

References
Mixed Normal Conditional Heteroskedasticity
Both unconditional mixed normal distributions and GARCH models with fat-tailed conditional distributions have been employed in the literature for modeling financial data. We consider a mixed normal…
Multivariate mixed normal conditional heteroskedasticity
A new multivariate volatility model where the conditional distribution of a vector time series is given by a mixture of multivariate normal distributions is proposed and some theoretical properties of the model such as the unconditional covariance matrix and autocorrelations of squared returns are derived.
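The unconditional covariance matrix mentioned here follows from the standard mixture identity Cov(x) = Σₖ πₖ (Σₖ + μₖμₖᵀ) − μ̄μ̄ᵀ. A small NumPy sketch with illustrative two-component parameters:

```python
import numpy as np

rng = np.random.default_rng(5)

# Unconditional covariance of a two-component multivariate normal
# mixture via the standard identity
#   Cov(x) = sum_k pi_k * (Sigma_k + mu_k mu_k^T) - mubar mubar^T.
# All parameter values are illustrative.
pi = np.array([0.6, 0.4])
mu = np.array([[0.0, 0.0], [1.0, -1.0]])
Sig = np.array([[[1.0, 0.0], [0.0, 1.0]],
                [[2.0, 0.5], [0.5, 1.0]]])

mubar = pi @ mu
cov = sum(p * (S + np.outer(m, m)) for p, m, S in zip(pi, mu, Sig))
cov = cov - np.outer(mubar, mubar)
```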
Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation
Traditional econometric models assume a constant one-period forecast variance. To generalize this implausible assumption, a new class of stochastic processes called autoregressive conditional heteroscedastic (ARCH) processes is introduced.
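The ARCH idea can be sketched as a one-line variance recursion, with the conditional variance an affine function of the previous squared shock; the parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# ARCH(1): sigma2_t = omega + alpha * eps_{t-1}^2, with Gaussian shocks.
omega, alpha = 0.1, 0.3                       # illustrative values (alpha < 1)
T = 5000
eps = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = omega / (1.0 - alpha)             # unconditional variance
eps[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
for t in range(1, T):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2
    eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
```

The sample variance of `eps` hovers near omega / (1 − alpha), and periods of large shocks mechanically raise next-period variance, producing the volatility clustering the model was designed to capture.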
Copula Processes
A stochastic volatility model, Gaussian Copula Process Volatility (GCPV), is developed that can outperform GARCH on simulated and financial data, incorporate covariates other than time, and model a rich class of covariance structures.
Normal Mixture GARCH(1,1): Applications to Exchange Rate Modelling
Some recent specifications for GARCH error processes explicitly assume a conditional variance that is generated by a mixture of normal components, albeit with some parameter restrictions. This paper…
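A minimal simulation sketch of the two-component case, assuming each mixture component carries its own GARCH(1,1) variance recursion and each shock is drawn from the component selected with probability πₖ (all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-component normal mixture GARCH(1,1), illustrative parameters.
pi = np.array([0.8, 0.2])
omega = np.array([0.05, 0.30])
alpha = np.array([0.05, 0.10])
beta = np.array([0.85, 0.70])

T = 3000
# Start each component at its stationary variance omega / (1 - alpha - beta).
sigma2 = np.tile(omega / (1.0 - alpha - beta), (T, 1))
eps = np.zeros(T)
for t in range(1, T):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    k = rng.choice(2, p=pi)                  # active mixture component
    eps[t] = np.sqrt(sigma2[t, k]) * rng.standard_normal()
```

Mixing two variance regimes produces a fat-tailed unconditional distribution: the sample kurtosis of `eps` exceeds the Gaussian value of 3.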
Signal Modeling and Classification Using a Robust Latent Space Model Based on $t$ Distributions
A Bayesian approach to factor analysis modeling based on Student's t distributions is developed, providing an efficient and more robust alternative to EM-based methods, resolving their singularity and overfitting proneness, while allowing for automatic determination of the optimal model size.
Variational Heteroscedastic Gaussian Process Regression
This work presents a non-standard variational approximation that allows accurate inference in heteroscedastic GPs (i.e., under input-dependent noise conditions) and its effectiveness is illustrated on several synthetic and real datasets of diverse characteristics.
Dynamic Conditional Correlation
Time varying correlations are often estimated with multivariate generalized autoregressive conditional heteroskedasticity (GARCH) models that are linear in squares and cross products of the data. A new class of multivariate models called dynamic conditional correlation models is proposed.
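The DCC(1,1) recursion behind such models can be sketched as follows; a, b, S, and the standardized residuals are all illustrative toy inputs:

```python
import numpy as np

rng = np.random.default_rng(3)

# DCC(1,1) correlation recursion:
#   Q_t = (1 - a - b) * S + a * e_{t-1} e_{t-1}' + b * Q_{t-1}
#   R_t = diag(Q_t)^{-1/2} Q_t diag(Q_t)^{-1/2}
a, b = 0.05, 0.90
S = np.array([[1.0, 0.3], [0.3, 1.0]])    # unconditional correlation
T = 500
e = rng.standard_normal((T, 2))           # standardized residuals (toy)
Q = S.copy()
R = np.empty((T, 2, 2))
R[0] = S
for t in range(1, T):
    Q = (1 - a - b) * S + a * np.outer(e[t - 1], e[t - 1]) + b * Q
    d = 1.0 / np.sqrt(np.diag(Q))
    R[t] = Q * np.outer(d, d)             # rescale Q to unit diagonal
```

Rescaling `Q` to unit diagonal guarantees that each `R[t]` is a valid correlation matrix even though the recursion itself is linear in squares and cross products of the residuals.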
Asymmetric multivariate normal mixture GARCH
In an application to stock market returns, it is shown that the disaggregation of the conditional (co)variance process generated by the model provides substantial intuition and exhibits strong performance in calculating out-of-sample Value-at-Risk measures.
Variational inference for Dirichlet process mixtures
Dirichlet process (DP) mixture models are the cornerstone of nonparametric Bayesian statistics, and the development of Monte Carlo Markov chain (MCMC) sampling methods for DP mixtures has enabled…
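The truncated stick-breaking representation that variational inference for DP mixtures typically builds on can be sketched in a few lines; the concentration and truncation level below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Truncated stick-breaking construction of DP mixture weights:
#   v_k ~ Beta(1, alpha_dp),  pi_k = v_k * prod_{j<k} (1 - v_j).
alpha_dp, K = 2.0, 50        # concentration and truncation level
v = rng.beta(1.0, alpha_dp, size=K)
v[-1] = 1.0                  # close the stick at the truncation level
stick_left = np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
pi = v * stick_left          # mixture weights, sum to 1 by construction
```

Setting the final stick fraction to 1 is what makes the truncation exact: the weights telescope so that the remaining stick mass is assigned to the last component.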