Corpus ID: 152282824

Modeling Combinatorial Evolution in Time Series Prediction

@article{Hu2019ModelingCE,
  title={Modeling Combinatorial Evolution in Time Series Prediction},
  author={Wenjie Hu and Yang Yang and Zilong You and Zongtao Liu and Xiang Ren},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.05006}
}
Time series modeling aims to capture the intrinsic factors underpinning observed data and their evolution. However, most existing studies ignore the evolutionary relations among these factors, which drive the combinatorial evolution of a given time series. In this paper, we propose to represent time-varying relations among the intrinsic factors of time series data by means of an evolutionary state graph structure. Accordingly, we propose Evolutionary Graph Recurrent Networks (EGRN) to… 


Markovian RNN: An Adaptive Time Series Prediction Network with HMM-based Switching for Nonstationary Environments
This work introduces a novel recurrent neural network (RNN) architecture, which adaptively switches between internal regimes in a Markovian way to model the nonstationary nature of the given data.
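The switching mechanism described above can be illustrated with a minimal sketch (not the paper's implementation): K regime-specific recurrent cells whose states are mixed by a belief vector over hidden regimes, with the belief propagated through a Markov transition matrix at every step. All sizes and weights here are arbitrary placeholders.

```python
import numpy as np

K, D = 3, 4                                  # regimes, hidden size
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(K, D, D))    # per-regime recurrence weights
U = rng.normal(scale=0.1, size=(K, D))       # per-regime input weights
T = np.full((K, K), 0.1) + 0.7 * np.eye(K)   # sticky transition matrix
T /= T.sum(axis=1, keepdims=True)            # make rows stochastic

def step(h, belief, x):
    """One recurrent step: update each regime's state, then mix by belief."""
    h_k = np.tanh(np.einsum('kij,j->ki', W, h) + U * x)  # (K, D) candidate states
    belief = belief @ T                                  # Markovian prior update
    # The full model would also reweight the belief by each regime's
    # likelihood of the current observation; omitted in this sketch.
    belief /= belief.sum()
    return (belief[:, None] * h_k).sum(axis=0), belief

h, belief = np.zeros(D), np.full(K, 1.0 / K)
for x in [0.1, -0.3, 0.5]:
    h, belief = step(h, belief, x)
```

The sticky diagonal of `T` is what makes regime membership persistent rather than re-chosen independently at each step.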

References

Showing 1–10 of 48 references
Gated Graph Sequence Neural Networks
This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
Dynamic Network Embedding by Modeling Triadic Closure Process
This paper presents a novel representation learning approach, DynamicTriad, that preserves both the structural information and the evolution patterns of a given network; it is applied to identify telephone fraud in a mobile network and to predict whether a user will repay her loans in a loan network.
Multilevel Wavelet Decomposition Network for Interpretable Time Series Analysis
A wavelet-based neural network structure called the multilevel Wavelet Decomposition Network (mWDN) is proposed for building frequency-aware deep learning models for time series analysis, together with an importance analysis method that identifies the time-series elements and mWDN layers crucially important to the analysis.
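The decomposition idea behind mWDN can be sketched with plain Haar-style filters: each level splits a series into a low-frequency approximation and a high-frequency detail, then recurses on the approximation. In mWDN these filters are learnable network weights initialized from wavelet coefficients; this sketch uses fixed averaging/differencing filters only.

```python
import numpy as np

def haar_level(x):
    """One decomposition level: pairwise average (low) and difference (high)."""
    x = np.asarray(x, dtype=float)
    low = (x[0::2] + x[1::2]) / 2.0   # approximation (low-frequency part)
    high = (x[0::2] - x[1::2]) / 2.0  # detail (high-frequency part)
    return low, high

def multilevel(x, levels):
    """Recurse on the approximation, collecting detail series per level."""
    details = []
    for _ in range(levels):
        x, d = haar_level(x)
        details.append(d)
    return x, details

approx, details = multilevel([4., 6., 10., 12., 8., 6., 5., 5.], levels=2)
```

Each level halves the length, so a series of length 8 yields a level-2 approximation of length 2 plus detail series of lengths 4 and 2.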
Time series representation and similarity based on local autopatterns
This work introduces a novel approach to model the dependency structure in time series that generalizes the concept of autoregression to local autopatterns and generates a pattern-based representation along with a similarity measure called learned pattern similarity (LPS).
Time Series Classification Using Multi-Channels Deep Convolutional Neural Networks
A novel deep learning framework for multivariate time series classification is proposed that is not only more efficient than the state of the art but also competitive in accuracy, demonstrating that feature learning is worth investigating for time series classification.
Autoregressive Convolutional Neural Networks for Asynchronous Time Series
The model is inspired by standard autoregressive models and the gating mechanisms used in recurrent neural networks, and involves an AR-like weighting system in which the final predictor is obtained as a weighted sum of adjusted regressors, while the weights are data-dependent functions learnt through a convolutional network.
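The AR-like weighting scheme can be sketched as follows: the prediction is a convex combination of recent observations, with weights that depend on the data itself. Here a single fixed random filter plus a softmax stands in for the paper's convolutional network; the filter values and window size are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)
window = 5
filt = rng.normal(scale=0.5, size=window)  # stand-in for learned conv features

def predict_next(series):
    """Predict the next value as a data-dependent weighted sum of the window."""
    x = np.asarray(series[-window:], dtype=float)
    scores = filt * x                       # scores depend on the data
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                # softmax: weights sum to 1
    return float(weights @ x)               # weighted sum of regressors

y = predict_next([1.0, 1.2, 0.9, 1.1, 1.0, 1.05])
```

Because the softmax weights are non-negative and sum to 1, the prediction always lies within the range of the observations in the window.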
NerveNet: Learning Structured Policy with Graph Neural Networks
NerveNet is proposed to explicitly model the structure of an agent, which naturally takes the form of a graph; the learned policies are demonstrated to be significantly more transferable and generalizable than policies learned by other models, and are able to transfer even in a zero-shot setting.
Computational Capabilities of Graph Neural Networks
The functions that can be approximated by GNNs, in probability, up to any prescribed degree of precision are characterized; this class includes most of the practically useful functions on graphs.
Time-Series Classification with COTE: The Collective of Transformation-Based Ensembles
Through extensive experimentation on 72 datasets, it is demonstrated that the simple collective formed by including all classifiers in one ensemble is significantly more accurate than any of its components and any other previously published TSC algorithm.
High-Order Graph Convolutional Recurrent Neural Network: A Deep Learning Framework for Network-Scale Traffic Learning and Forecasting
A novel deep learning framework is proposed, Traffic Graph Convolutional Long Short-Term Memory Neural Network (TGC-LSTM), to learn the interactions between roadways in the traffic network and forecast the network-wide traffic state.