Corpus ID: 210861210

Deep Transformer Models for Time Series Forecasting: The Influenza Prevalence Case

@article{Wu2020DeepTM,
  title={Deep Transformer Models for Time Series Forecasting: The Influenza Prevalence Case},
  author={Neo Wu and Bradley Green and Xue Ben and Shawn O’Banion},
  journal={ArXiv},
  year={2020},
  volume={abs/2001.08317}
}
In this paper, we present a new approach to time series forecasting. Time series data are prevalent in many scientific and engineering disciplines. Time series forecasting is a crucial task in modeling time series data, and is an important area of machine learning. In this work we developed a novel method that employs Transformer-based machine learning models to forecast time series data. This approach works by leveraging self-attention mechanisms to learn complex patterns and dynamics from…
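To make the idea concrete, below is a minimal sketch of the kind of Transformer-based forecaster the abstract describes, assuming a PyTorch implementation. The `ILIForecaster` name, layer sizes, and one-step-ahead setup are illustrative choices, not the authors' code; positional encodings and a decoder for multi-step forecasting are omitted for brevity.

```python
# Minimal sketch (not the paper's exact model) of a Transformer-based
# one-step-ahead forecaster for a univariate weekly series such as ILI rates.
import torch
import torch.nn as nn


class ILIForecaster(nn.Module):  # illustrative name
    def __init__(self, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)           # scalar -> model dimension
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)                 # back to a scalar forecast

    def forward(self, x):
        # x: (batch, seq_len, 1) window of past observations
        h = self.encoder(self.input_proj(x))              # self-attention over the history
        return self.head(h[:, -1])                        # one-step-ahead prediction


model = ILIForecaster()
history = torch.randn(8, 52, 1)      # e.g. 52 past weekly ILI rates per series
print(model(history).shape)          # torch.Size([8, 1])
```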
A Transformer Self-attention Model for Time Series Forecasting
TLDR: A transformer neural network based on self-attention is presented that is particularly suited to forecasting time series; it was found that with better configuration of the network and better tuning of the attention, more desirable results could be obtained for a given problem.
NAST: Non-Autoregressive Spatial-Temporal Transformer for Time Series Forecasting
TLDR: This work is the first attempt to propose a non-autoregressive Transformer architecture for time series forecasting, aiming to overcome the time-delay and error-accumulation issues of the canonical Transformer.
Long-range forecasting in feature-evolving data streams
TLDR: This paper proposes the OFAT algorithm, a stochastic deep neural network framework that addresses the stated problems collectively, and demonstrates that OFAT is fast, robust, accurate, and superior to state-of-the-art methods.
The predictive skill of convolutional neural networks models for disease forecasting
TLDR: CNNs and RNNs bring the power of nonlinear transformations to purely data-driven epidemiological models, a capability that heretofore has been limited to more elaborate mechanistic/compartmental disease models.
Deep Autoregressive Models with Spectral Attention
TLDR: A forecasting architecture is presented that combines deep autoregressive models with a Spectral Attention (SA) module, merging global and local frequency-domain information in the model's embedded space; it can identify global trends and seasonality patterns and produces interpretable results that improve forecasting accuracy.
Demand Forecasting Intermittent and Lumpy Time Series: Comparing Statistical, Machine Learning and Deep Learning Methods
TLDR: This study compares methods from statistics, machine learning, and deep learning by applying a novel metric called Stock-keeping-oriented Prediction Error Costs (SPEC), which overcomes the drawbacks associated with traditional metrics.
AutoODE: Bridging Physics-based and Data-driven modeling for COVID-19 Forecasting
  • Rui Wang
  • 2020
As COVID-19 continues to spread, accurately forecasting the number of newly infected, removed and death cases has become a crucial task in public health. While mechanistic compartment models are…
Predictive Skill of Deep Learning Models Trained on Limited Sequence Data.
TLDR: CNNs and RNNs bring the power of nonlinear transformations to purely data-driven epidemiological models, a capability that heretofore has been limited to more elaborate mechanistic/compartmental disease models.
Analysis and modeling to forecast in time series: a systematic review
TLDR: This review aims to offer a structured and comprehensive view of the full process flow; it encompasses time series decomposition, stationarity tests, modeling, and forecasting, and spans from well-established conventional approaches to more recent adaptations of deep learning to time series forecasting.
Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series
TLDR: Radflow is introduced, a novel model that embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.

References

SHOWING 1-10 OF 22 REFERENCES
A novel data-driven model for real-time influenza forecasting
TLDR: The proposed data-driven machine learning methods are capable of making real-time influenza forecasts that integrate the impacts of climatic factors and geographical proximity, and they outperform existing influenza forecasting methods.
Attention-based recurrent neural network for influenza epidemic prediction
TLDR: A new deep neural network structure is proposed that forecasts the real-time influenza-like illness rate (ILI%) in Guangzhou, China, using long short-term memory (LSTM) neural networks to forecast accurately given the long-term dependencies and diversity of influenza epidemic data.
LSTM Recurrent Neural Networks for Influenza Trends Prediction
TLDR: This paper is the first to use multiple novel data sources, including virologic surveillance, influenza geographic spread, Google Trends, climate, and air pollution, to predict influenza trends, and it finds that several environmental and climatic factors have a significant correlation with the ILI rate.
Twitter Improves Influenza Forecasting
TLDR: It is shown that data from the microblogging community Twitter significantly improves influenza forecasting, and models incorporating data derived from Twitter can reduce forecasting error by 17-30% over a baseline that only uses historical data.
Sequence to Sequence with Attention for Influenza Prevalence Prediction using Google Trends
TLDR: A sequence-to-sequence (Seq2Seq) model with attention, combined with Google Trends data, is investigated to assess and predict the number of influenza-infected people over the course of multiple weeks, and it achieves state-of-the-art results.
Time Series Analysis by State Space Methods
TLDR: This excellent text provides a comprehensive treatment of the state space approach to time series analysis, in which observations are regarded as made up of distinct components such as trend, seasonal, regression elements, and disturbance terms, each of which is modelled separately.
Improved state-level influenza nowcasting in the United States leveraging Internet-based data and network approaches
TLDR: An approach using Google search and EHR data is combined with an approach leveraging spatiotemporal synchronicities of influenza activity across states to improve state-level influenza activity estimates in the US.
Long Short-Term Memory
TLDR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
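As a concrete illustration, a minimal LSTM one-step forecaster in PyTorch might look like the sketch below; the module name, hidden size, and input shape are assumptions for the sketch, not the cited paper's setup.

```python
# Minimal sketch of an LSTM one-step forecaster in PyTorch; sizes are illustrative.
import torch
import torch.nn as nn


class LSTMForecaster(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, seq_len, 1); the gated cell state carries information across long lags
        out, _ = self.lstm(x)
        return self.head(out[:, -1])        # predict the next value in the series


print(LSTMForecaster()(torch.randn(4, 52, 1)).shape)   # torch.Size([4, 1])
```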
Accurate estimation of influenza epidemics using Google search data via ARGO
TLDR: An influenza tracking model, ARGO (AutoRegression with GOogle search data), is presented that uses publicly available online search data and outperforms all available Google-search-based real-time tracking models for influenza epidemics at the national level of the United States, including Google Flu Trends.
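In the spirit of ARGO, the sketch below fits an L1-regularized regression of the current ILI rate on lagged ILI values plus search-term frequencies using scikit-learn; the random placeholder data, lag length, and penalty strength are illustrative assumptions, not the paper's data or tuning.

```python
# Sketch of an ARGO-style model: L1-regularized regression of this week's ILI rate
# on autoregressive lags plus flu-related search-term frequencies (placeholder data).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
weeks, n_terms, lags = 200, 10, 52
ili = rng.random(weeks)                       # weekly ILI rate (placeholder)
searches = rng.random((weeks, n_terms))       # search-term frequencies (placeholder)

X = np.column_stack(
    [np.stack([ili[t - lags:t] for t in range(lags, weeks)]),   # autoregressive lags
     searches[lags:]]                                           # contemporaneous search terms
)
y = ili[lags:]

model = Lasso(alpha=0.01).fit(X, y)           # L1 penalty selects informative predictors
print(model.predict(X[-1:]))                  # nowcast for the most recent week
```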
Attention is All you Need
TLDR: A new simple network architecture, the Transformer, based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, is proposed; it generalizes well to other tasks, as shown by applying it successfully to English constituency parsing with both large and limited training data.
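The core operation of the Transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V. A minimal NumPy sketch (shapes and the function name are illustrative) is:

```python
# Minimal NumPy sketch of scaled dot-product attention.
import numpy as np


def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                     # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # row-wise softmax
    return weights @ V                                  # weighted sum of values


Q = np.random.rand(5, 8)     # 5 positions, d_k = 8
K = np.random.rand(5, 8)
V = np.random.rand(5, 16)
print(scaled_dot_product_attention(Q, K, V).shape)      # (5, 16)
```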