Corpus ID: 150373780

Think Globally, Act Locally: A Deep Neural Network Approach to High-Dimensional Time Series Forecasting

Rajat Sen, Hsiang-Fu Yu, Inderjit S. Dhillon
Forecasting high-dimensional time series plays a crucial role in many applications such as demand forecasting and financial prediction. The proposed model, DeepGLO, is a hybrid that combines a global matrix factorization model, regularized by a temporal deep network, with a local deep temporal model that captures patterns specific to each dimension. The global and local models are combined via a data-driven attention mechanism for each dimension.
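The per-dimension blending of global and local forecasts can be sketched as a softmax-weighted combination. This is an illustrative assumption about the mechanism, not the paper's actual implementation; the function name, weight shapes, and toy inputs are hypothetical:

```python
import numpy as np

def combine_global_local(global_pred, local_pred, attn_logits):
    """Blend global and local forecasts with a per-dimension softmax
    weight over the two components (hypothetical sketch).

    global_pred, local_pred: (n_dims, horizon) forecast matrices
    attn_logits: (n_dims, 2) unnormalized attention scores
    """
    # Softmax over the two model components, computed per dimension.
    w = np.exp(attn_logits - attn_logits.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)
    # Weighted combination: each dimension mixes its two forecasts.
    return w[:, 0:1] * global_pred + w[:, 1:2] * local_pred

# Toy example: 3 dimensions, horizon 2; zero logits give equal weights,
# so the output is the simple average of the two forecasts.
g = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
l = np.array([[3.0, 3.0], [0.0, 0.0], [3.0, 3.0]])
out = combine_global_local(g, l, np.zeros((3, 2)))
```

In practice the logits would themselves be produced by a learned network from each dimension's features, which is what makes the mixture data-driven.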


Learning from Multiple Time Series: A Deep Disentangled Approach to Diversified Time Series Forecasting
An adaptive parameter generation module, enhanced by contrastive multi-horizon coding (CMC), is proposed to generate the parameters of the local feature encoder for each individual time series, maximizing the mutual information between the series-specific context variable and the long/short-term representations of the corresponding time series.
Meta-Forecasting by combining Global Deep Representations with Local Adaptation
This work introduces a novel forecasting method, called Meta Global-Local Auto-Regression (Meta-GLAR), that adapts to each time series by learning in closed form the mapping from the representations produced by a recurrent neural network (RNN) to one-step-ahead forecasts.
Do We Really Need Deep Learning Models for Time Series Forecasting?
The results demonstrate that the window-based input transformation boosts the performance of a simple GBRT model to levels that outperform all state-of-the-art DNN models evaluated in this paper.
Temporal Latent Auto-Encoder: A Method for Probabilistic Multivariate Time Series Forecasting
A novel temporal latent auto-encoder method which enables nonlinear factorization of multivariate time series, learned end-to-end with a temporal deep learning latent space forecast model, and achieves state-of-the-art performance on many popular multivariate datasets.
Hierarchically Regularized Deep Forecasting
This paper proposes a new approach for hierarchical forecasting based on decomposing the time series along a global set of basis time series and modeling hierarchical constraints using the coefficients of the basis decomposition for each time series.
Parameter Efficient Deep Probabilistic Forecasting
The Effectiveness of Discretization in Forecasting: An Empirical Study on Neural Time Series Models
It is found that binning almost always improves performance (compared to using normalized real-valued inputs), but that the particular type of binning chosen is of lesser importance.
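The simplest binning scheme the finding refers to can be sketched as equal-width discretization of a normalized series; the function below is a hypothetical stand-in for the schemes the paper compares, not its exact procedure:

```python
import numpy as np

def bin_series(x, n_bins=10):
    """Discretize a real-valued series into equal-width bins,
    returning integer bin indices in 0..n_bins-1 (illustrative sketch)."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    # np.digitize maps each value to the index of its bin edge interval;
    # clip handles the maximum value, which would otherwise overflow.
    return np.clip(np.digitize(x, edges), 1, n_bins) - 1

tokens = bin_series(np.array([0.0, 0.5, 1.0]), n_bins=2)
```

The resulting integer tokens can then be fed to a model with an embedding layer, in place of the normalized real-valued inputs.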
GLIMA: Global and Local Time Series Imputation with Multi-directional Attention Learning
An imputation framework is proposed to learn both global and local dependencies of multivariate time series, together with a multi-directional self-attention that captures distant correlations across both time and features.
Spectral Temporal Graph Neural Network for Multivariate Time-series Forecasting
Spectral Temporal Graph Neural Network (StemGNN) is proposed to further improve the accuracy of multivariate time-series forecasting and learns inter-series correlations automatically from the data without using pre-defined priors.
ForecastNet: A Time-Variant Deep Feed-Forward Neural Network Architecture for Multi-Step-Ahead Time-Series Forecasting
It is argued that time-invariance can reduce the capacity to perform multi-step-ahead forecasting, where modelling the dynamics at a range of scales and resolutions is required.
Deep Factors for Forecasting
A hybrid model that incorporates the benefits of both classical and deep neural networks is proposed, which is data-driven and scalable via a latent, global, deep component, and handles uncertainty through a local classical model.
Conditional Time Series Forecasting with Convolutional Neural Networks
This paper compares the performance of the WaveNet model to a state-of-the-art fully convolutional network (FCN) and an autoregressive model popular in econometrics, and shows that the model is much better able to learn important dependencies between financial time series, resulting in a more robust and accurate forecast.
Temporal Regularized Matrix Factorization for High-dimensional Time Series Prediction
This paper develops novel regularization schemes and uses scalable matrix factorization methods that are eminently suited for high-dimensional time series data with many missing values, and makes interesting connections to graph regularization methods in the context of learning the dependencies in an autoregressive framework.
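The core idea of temporally regularized factorization can be written as a reconstruction loss plus an autoregressive penalty on the latent factors. The sketch below uses a simplified AR(1)-style regularizer with a single scalar coefficient; the actual method uses richer lag sets and learned AR weights:

```python
import numpy as np

def trmf_loss(Y, F, X, w, lam_ar=1.0):
    """Loss of a temporally regularized factorization Y ≈ F @ X.

    Y: (n_series, T) data matrix
    F: (n_series, k) per-series loadings
    X: (k, T) latent temporal factors
    w: scalar AR(1) coefficient in x_t ≈ w * x_{t-1} (simplified assumption)
    """
    recon = np.sum((Y - F @ X) ** 2)                # reconstruction error
    ar = np.sum((X[:, 1:] - w * X[:, :-1]) ** 2)    # temporal regularizer
    return recon + lam_ar * ar
```

Minimizing this loss alternately over F and X (e.g. by alternating least squares) yields factors that both fit the observed entries and follow the assumed temporal dynamics, which is what allows forecasting by rolling the AR model forward in latent space.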
Modeling Long- and Short-Term Temporal Patterns with Deep Neural Networks
A novel deep learning framework, the Long- and Short-term Time-series Network (LSTNet), addresses this open challenge of multivariate time series forecasting by using a Convolutional Neural Network to extract short-term local dependency patterns among variables and a Recurrent Neural Network to discover long-term patterns in time series trends.
Deep State Space Models for Time Series Forecasting
A novel approach to probabilistic time series forecasting that combines state space models with deep learning by parametrizing a per-time-series linear state space model with a jointly-learned recurrent neural network, which compares favorably to the state-of-the-art.
A Multi-Horizon Quantile Recurrent Forecaster
We propose a framework for general probabilistic multi-step time series regression. Specifically, we exploit the expressiveness and temporal nature of Recurrent Neural Networks, the nonparametric
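Quantile forecasters of this kind are typically trained with the pinball (quantile) loss, which penalizes over- and under-prediction asymmetrically. A minimal sketch, with the standard formulation rather than anything specific to this paper:

```python
import numpy as np

def pinball_loss(y, y_hat, q):
    """Quantile (pinball) loss for quantile level q in (0, 1).

    Under-prediction (y > y_hat) is weighted by q and over-prediction
    by 1 - q, so minimizing it recovers the q-th conditional quantile.
    """
    diff = y - y_hat
    return np.mean(np.maximum(q * diff, (q - 1) * diff))
```

Training one output head per quantile level and per horizon step gives a multi-horizon probabilistic forecast without assuming a parametric output distribution.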
Spatio-Temporal Graph Convolutional Networks: A Deep Learning Framework for Traffic Forecasting
This paper proposes a novel deep learning framework, Spatio-Temporal Graph Convolutional Networks (STGCN), to tackle the time series prediction problem in traffic domain, and builds the model with complete convolutional structures, which enable much faster training speed with fewer parameters.
An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
A systematic evaluation of generic convolutional and recurrent architectures for sequence modeling concludes that the common association between sequence modeling and recurrent networks should be reconsidered, and that convolutional networks should be regarded as a natural starting point for sequence modeling tasks.
GluonTS: Probabilistic Time Series Models in Python
Gluon Time Series is introduced, a library for deep-learning-based time series modeling that provides all necessary components and tools that scientists need for quickly building new models, for efficiently running and analyzing experiments and for evaluating model accuracy.