Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series

@article{Tran2021RadflowAR,
  title={Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series},
  author={Alasdair Tran and A. Mathews and Cheng Soon Ong and Lexing Xie},
  journal={Proceedings of the Web Conference 2021},
  year={2021}
}
We propose a new model for networks of time series that influence each other. Graph structures among time series are found in diverse domains, such as web traffic influenced by hyperlinks, product sales influenced by recommendations, or urban transport volume influenced by road networks and weather. There has been recent progress in graph modeling and in time series forecasting, but an expressive and scalable approach for a network of series does not yet exist. We introduce Radflow…

AttentionFlow: Visualising Influence in Networks of Time Series

AttentionFlow is a new system to visualise networks of time series and the dynamic influence they have on one another. It shows that attention spikes in songs can be explained by external events such as major awards, or by changes in the network such as the release of a new song.

MixSeq: Connecting Macroscopic Time Series Forecasting with Microscopic Time Series Data

Inspired by the power of Seq2seq and its variants in modeling time series data, this work proposes an end-to-end mixture model to cluster microscopic time series, in which all components come from a family of Seq2seq models with different parameters.

Contextually Enhanced ES-dRNN with Dynamic Attention for Short-Term Load Forecasting

Experiments on 35 forecasting problems show that the proposed model outperforms its predecessor, standard statistical models, and state-of-the-art machine learning models in terms of accuracy.

SRI-EEG: State-Based Recurrent Imputation for EEG Artifact Correction

This work presents SRI-EEG, a novel EEG state-based imputation model built upon a recurrent neural network, and demonstrates that it achieves performance comparable to state-of-the-art methods on the EEG artifact correction task.

References

Showing 10 of 41 references

EvolveGCN: Evolving Graph Convolutional Networks for Dynamic Graphs

This work proposes EvolveGCN, which adapts the graph convolutional network (GCN) model along the temporal dimension without resorting to node embeddings, and captures the dynamism of the graph sequence by using an RNN to evolve the GCN parameters.
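The core idea — evolving the GCN's weights over time rather than the node embeddings — can be sketched in miniature. This is a hedged illustration, not the paper's method: the real model evolves full weight matrices with a GRU/LSTM, whereas here a simple exponential-smoothing update (a hypothetical stand-in, with smoothing factor `rho`) evolves a scalar weight over snapshots of the graph:

```python
import math

def evolvegcn_forward(features_seq, adj, w0, rho=0.9):
    """EvolveGCN idea in miniature: a recurrent update evolves the GCN
    weight over time instead of evolving node embeddings.

    features_seq: list of dicts node -> scalar feature, one per snapshot.
    adj:          dict node -> set of neighbours (fixed here for brevity).
    w0:           initial scalar GCN weight; rho: smoothing factor that
                  stands in for the RNN gate in the real model.
    """
    w = w0
    outputs = []
    for x in features_seq:  # one graph snapshot per time step
        # Hypothetical recurrent weight update: smooth the weight toward
        # the snapshot's mean feature (stand-in for the GRU over weights).
        mean_x = sum(x.values()) / len(x)
        w = rho * w + (1 - rho) * mean_x
        # GCN layer with the evolved weight: neighbourhood mean, scaled.
        h = {v: math.tanh(w * (x[v] + sum(x[u] for u in adj[v]))
                          / (1 + len(adj[v])))
             for v in adj}
        outputs.append(h)
    return outputs
```

The point of the sketch is the control flow: the weight is a function of time, so the same layer applied to different snapshots uses different parameters.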

AttentionFlow: Visualising Influence in Networks of Time Series

Dynamic Graph Convolutional Networks

T-GCN: A Temporal Graph Convolutional Network for Traffic Prediction

A novel neural network-based traffic forecasting method, the temporal graph convolutional network (T-GCN) model, combines the graph convolutional network (GCN) and the gated recurrent unit (GRU) to capture spatial and temporal dependencies simultaneously.
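The GCN-then-GRU composition can be illustrated with a deliberately tiny scalar sketch — one smoothing pass over the graph for the spatial step, then a per-node GRU cell for the temporal step. The gate-weight names (`'z'`, `'r'`, `'h'`) are hypothetical scalar stand-ins for the model's weight matrices, assuming nothing about the paper's actual parameterisation:

```python
import math

def tgcn_step(h_prev, x_t, adj, w):
    """One T-GCN-style step (scalar sketch).

    h_prev: dict node -> previous hidden state.
    x_t:    dict node -> input at time t.
    adj:    dict node -> set of neighbours.
    w:      dict of scalar gate weights: 'z' (update), 'r' (reset),
            'h' (candidate) -- stand-ins for weight matrices.
    """
    def sigmoid(a):
        return 1.0 / (1.0 + math.exp(-a))

    # Spatial step ("GCN"): average each node's input with its neighbours'.
    gx = {v: (x_t[v] + sum(x_t[u] for u in adj[v])) / (1 + len(adj[v]))
          for v in adj}

    # Temporal step (GRU cell per node) on the graph-smoothed signal.
    h = {}
    for v in adj:
        z = sigmoid(w['z'] * (gx[v] + h_prev[v]))       # update gate
        r = sigmoid(w['r'] * (gx[v] + h_prev[v]))       # reset gate
        cand = math.tanh(w['h'] * (gx[v] + r * h_prev[v]))
        h[v] = (1 - z) * h_prev[v] + z * cand           # gated blend
    return h
```

Running the step repeatedly over a sequence of inputs gives each node a hidden state shaped by both its own history and its neighbours' traffic.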

node2vec: Scalable Feature Learning for Networks

node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, defines a flexible notion of a node's network neighborhood and designs a biased random walk procedure that efficiently explores diverse neighborhoods.
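The biased walk is the concrete part of the framework: from the current node, each neighbour is weighted by its distance to the *previous* node in the walk, with return parameter `p` and in-out parameter `q` trading off breadth-first against depth-first exploration. A minimal sketch:

```python
import random

def node2vec_walk(adj, start, length, p=1.0, q=1.0, seed=0):
    """One second-order biased random walk in the style of node2vec.

    adj: dict node -> set of neighbour nodes (undirected graph).
    p:   return parameter (weight 1/p for revisiting the previous node).
    q:   in-out parameter (weight 1/q for moving farther away).
    """
    rng = random.Random(seed)
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        nbrs = sorted(adj[cur])
        if not nbrs:
            break
        if len(walk) == 1:
            walk.append(rng.choice(nbrs))   # first hop is unbiased
            continue
        prev = walk[-2]
        # Bias each neighbour by its distance to the previous node.
        weights = []
        for x in nbrs:
            if x == prev:
                weights.append(1.0 / p)     # distance 0: step back
            elif x in adj[prev]:
                weights.append(1.0)         # distance 1: stay close
            else:
                weights.append(1.0 / q)     # distance 2: explore outward
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk
```

With small `p` the walk tends to backtrack (BFS-like, local structure); with small `q` it pushes outward (DFS-like, community structure). The walks are then fed to a skip-gram model to learn the embeddings.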

Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…
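The "masked self-attention" here means attention scores are computed only over a node's graph neighbourhood and softmax-normalised there. A scalar sketch of one attention head (scalar features and scalar attention parameters `a_src`, `a_dst` are simplifications standing in for the feature and attention vectors of the actual architecture):

```python
import math

def gat_attention(h, adj, a_src, a_dst):
    """Masked self-attention coefficients of a single GAT-style head.

    h:     dict node -> scalar feature (scalars keep the sketch tiny).
    adj:   dict node -> set of neighbours; attention is masked to these.
    a_src, a_dst: scalar attention parameters (stand-ins for vector a).
    Returns alpha[i][j], softmax-normalised over j in N(i) plus i itself.
    """
    def leaky_relu(x):
        return x if x > 0 else 0.2 * x

    alpha = {}
    for i in adj:
        nbrs = sorted(adj[i] | {i})          # include the node itself
        # Raw score e_ij, computed only for masked-in neighbours.
        e = {j: leaky_relu(a_src * h[i] + a_dst * h[j]) for j in nbrs}
        z = sum(math.exp(v) for v in e.values())
        alpha[i] = {j: math.exp(v) / z for j, v in e.items()}
    return alpha
```

Each node's updated representation is then a weighted sum of neighbour features using these coefficients; non-neighbours receive exactly zero weight because they never enter the softmax.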

Prediction in Economic Networks

This is the first study of a large-scale dynamic network to show that a product network contains useful distributed information for demand prediction; the economic implications of algorithmically predicting demand for large numbers of products are significant.

Know-Evolve: Deep Temporal Reasoning for Dynamic Knowledge Graphs

Know-Evolve is presented, a novel deep evolutionary knowledge network that learns non-linearly evolving entity representations over time and effectively predicts the occurrence or recurrence time of a fact, which is novel compared to prior reasoning approaches in the multi-relational setting.

N-BEATS: Neural basis expansion analysis for interpretable time series forecasting

The proposed deep neural architecture based on backward and forward residual links and a very deep stack of fully-connected layers has a number of desirable properties, being interpretable, applicable without modification to a wide array of target domains, and fast to train.
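The "backward and forward residual links" admit a compact sketch: each block emits a backcast (what it explains of the input window) and a forecast; the backcast is subtracted from the running residual, and the forecasts are summed. The block functions below are hypothetical linear stand-ins for the model's fully-connected stacks:

```python
def nbeats_stack(x, blocks, horizon):
    """Doubly residual forecasting stack in the style of N-BEATS.

    x:       input window, a list of floats.
    blocks:  list of (backcast_fn, forecast_fn) pairs; each maps the
             current residual window to a backcast (same length as x)
             and a forecast (length `horizon`).
    Returns the summed forecast and the final unexplained residual.
    """
    residual = list(x)
    forecast = [0.0] * horizon
    for backcast_fn, forecast_fn in blocks:
        back = backcast_fn(residual)
        fore = forecast_fn(residual)
        # Backward residual link: peel off what this block explained.
        residual = [r - b for r, b in zip(residual, back)]
        # Forward residual link: accumulate partial forecasts.
        forecast = [f + g for f, g in zip(forecast, fore)]
    return forecast, residual
```

For example, a block whose backcast and forecast are both the window mean removes the level of the series, so the next block sees only the detrended residual — this decomposition into per-block partial forecasts is what makes the architecture interpretable.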

Inductive Representation Learning on Large Graphs

GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
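The inductive trick is that the layer is defined by an aggregation rule over neighbour features rather than by per-node embeddings, so it applies to unseen nodes unchanged. A scalar sketch of one layer with the mean aggregator (scalar weights `w_self`, `w_neigh` stand in for the layer's weight matrices):

```python
def sage_mean_layer(h, adj, w_self, w_neigh):
    """One GraphSAGE-style layer with the mean aggregator (scalar sketch).

    h:   dict node -> scalar feature.
    adj: dict node -> set of neighbours.
    w_self, w_neigh: scalar weights applied to a node's own feature and
    to the aggregated neighbourhood feature, respectively.
    """
    out = {}
    for v, nbrs in adj.items():
        # Aggregate the neighbourhood: mean of neighbour features.
        agg = sum(h[u] for u in nbrs) / len(nbrs) if nbrs else 0.0
        # Combine self and neighbourhood signals, then apply ReLU.
        z = w_self * h[v] + w_neigh * agg
        out[v] = max(z, 0.0)
    return out
```

Because nothing in the layer is indexed by node identity, the same trained weights embed a brand-new node as soon as its features and neighbours are known.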