# Learning Predictive Leading Indicators for Forecasting Time Series Systems with Unknown Clusters of Forecast Tasks

```bibtex
@inproceedings{Gregorov2017LearningPL,
  title     = {Learning Predictive Leading Indicators for Forecasting Time Series Systems with Unknown Clusters of Forecast Tasks},
  author    = {Magda Gregorov{\'a} and Alexandros Kalousis and St{\'e}phane Marchand-Maillet},
  booktitle = {ACML},
  year      = {2017}
}
```

We present a new method for forecasting systems of multiple interrelated time series. The method learns the forecast models while simultaneously discovering, from within the system, leading indicators that serve as good predictors and improve forecast accuracy, together with a cluster structure of the forecast tasks around these indicators. The method builds on the classical linear vector autoregressive (VAR) model and links the discovery of the leading indicators to the inference of sparse Granger-causality graphs. We…
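The core idea — that a series whose past values improve another series' forecast is a Granger-causal "leading indicator" — can be illustrated with a minimal pairwise sketch. This is not the paper's actual estimator (which learns the sparse graph and the task clusters jointly); the `granger_gain` function and the toy two-series system below are illustrative assumptions:

```python
import numpy as np

def lagged_design(X, p):
    """Build the lagged regressor matrix for a VAR(p).

    Row t of Z holds [x_1(t-1), ..., x_k(t-1), ..., x_1(t-p), ..., x_k(t-p)],
    aligned with the targets Y = X[p:].
    """
    T, k = X.shape
    Z = np.hstack([X[p - l : T - l, :] for l in range(1, p + 1)])
    return Z, X[p:, :]

def granger_gain(X, target, source, p=2):
    """Relative drop in residual sum of squares for the `target` series
    when the lags of `source` are added to its autoregression.

    A large gain suggests `source` Granger-causes (leads) `target`.
    """
    Z, Y = lagged_design(X, p)
    y = Y[:, target]
    k = X.shape[1]
    src_cols = [l * k + source for l in range(p)]  # source's column per lag block
    restr = np.setdiff1d(np.arange(Z.shape[1]), src_cols)

    def rss(cols):
        beta, *_ = np.linalg.lstsq(Z[:, cols], y, rcond=None)
        resid = y - Z[:, cols] @ beta
        return float(resid @ resid)

    full = rss(np.arange(Z.shape[1]))
    return (rss(restr) - full) / full

# Toy system: series 0 is a leading indicator for series 1, not vice versa.
rng = np.random.default_rng(0)
T = 500
x0 = rng.normal(size=T)
x1 = np.empty(T)
x1[0] = rng.normal()
x1[1:] = 0.9 * x0[:-1] + 0.1 * rng.normal(size=T - 1)
X = np.column_stack([x0, x1])

print(granger_gain(X, target=1, source=0))  # large gain: 0 leads 1
print(granger_gain(X, target=0, source=1))  # near zero: 1 does not lead 0
```

Thresholding such gains over all (source, target) pairs yields a sparse directed graph whose high out-degree nodes play the role of leading indicators; the paper's approach instead obtains the sparsity directly through the structure of the learned VAR coefficients.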


## 3 Citations

seq2graph: Discovering Dynamic Dependencies from Multivariate Time Series with Multi-level Attention

- Computer Science, Mathematics
- ArXiv
- 2018

This work presents a novel deep learning model that uses multiple layers of customized gated recurrent units (GRUs) to discover both time-lagged behaviors and inter-time-series dependencies in the form of directed weighted graphs. It shows how these dependencies can be captured via intuitive, interpretable dependency graphs and used to generate highly accurate forecasts.

seq2graph: Discovering Dynamic Non-linear Dependencies from Multivariate Time Series

- Computer Science
- 2019 IEEE International Conference on Big Data (Big Data)
- 2019

This work presents a novel deep learning model that uses multiple layers of adapted gated recurrent units (GRUs) to discover both time-lagged behaviors and inter-time-series dependencies, representing them as directed weighted graphs. It shows how these dependencies can be captured via intuitive, interpretable dependency graphs and used to generate forecasts.

Accurate Demand Forecasting for Retails with Deep Neural Networks

- Computer Science
- EDBT
- 2020

A deep-learning-based prediction model is proposed that captures inherent inter-dependencies and temporal characteristics among product items, achieving substantially higher accuracy than state-of-the-art methods.

## References

Showing 1-10 of 27 references

Learning Bi-clustered Vector Autoregressive Models

- Computer Science
- ECML/PKDD
- 2012

This work develops a methodology that combines sparse learning and a nonparametric bi-clustered prior over the VAR model, conducting full Bayesian inference via blocked Gibbs sampling and demonstrating improvements in both model estimation and clustering quality over standard alternatives.

Sparse autoregressive model estimation for learning granger causality in time series

- Mathematics, Computer Science
- 2013 IEEE International Conference on Acoustics, Speech and Signal Processing
- 2013

A recent powerful algorithm, the alternating direction method of multipliers (ADMM), is applied to solve topology-selection problems in Granger graphical models of AR processes. The method is applied to Google Flu Trends data to learn a causal structure of flu activity across the 51 states of the USA.

Learning the Dependence Graph of Time Series with Latent Factors

- Computer Science, Mathematics
- ICML
- 2012

A new convex-optimization-based method is developed to find the dependency structure among the observed variables of a system of linear stochastic differential equations when some of the variables are latent, and a high-dimensional scaling result for structure recovery is established theoretically.

Learning multiple granger graphical models via group fused lasso

- Computer Science, Mathematics
- 2015 10th Asian Control Conference (ASCC)
- 2015

This paper considers the problem of simultaneously estimating multiple Granger graphical models that share similar topology structures, from a set of time series belonging to distinct classes, and proposes a fast alternating direction method of multipliers (ADMM) algorithm for solving it.

New Introduction to Multiple Time Series Analysis

- Computer Science
- 2007

This reference work and graduate level textbook considers a wide range of models and methods for analyzing and forecasting multiple time series. The models covered include vector autoregressive,…

High-dimensional Time Series Clustering via Cross-Predictability

- Computer Science
- AISTATS
- 2017

This paper explores a new similarity measure called "cross-predictability": the degree to which a future value in each time series is predicted by past values of the others. It provides a theoretical proof that the proposed algorithm identifies the correct clustering structure with high probability under certain conditions.

Generalized Shrinkage Methods for Forecasting Using Many Predictors

- Mathematics
- 2012

This article provides a simple shrinkage representation that describes the operational characteristics of various forecasting methods designed for a large number of orthogonal predictors (such as…

Forecasting with Medium and Large Bayesian VARs

- Economics
- 2013

This paper is motivated by the recent interest in the use of Bayesian VARs for forecasting, even in cases where the number of dependent variables is large. In such cases, factor methods have been…

Two-step adaptive model selection for vector autoregressive processes

- Computer Science, Mathematics
- J. Multivar. Anal.
- 2013

A two-step shrinkage method for VAR model selection is proposed that outperforms existing alternatives in accuracy of lag-order estimation, forecasting, and impulse-response analysis, and can be implemented through a simple algorithm.

Causal Inference on Time Series using Restricted Structural Equation Models

- Computer Science, Mathematics
- NIPS
- 2013

This work studies a class of restricted Structural Equation Models for time series that require independent residual time series, and shows empirically that when the data are causally insufficient or the model is misspecified, the method avoids incorrect answers.