# Deep and Confident Prediction for Time Series at Uber

```bibtex
@article{Zhu2017DeepAC,
  title   = {Deep and Confident Prediction for Time Series at Uber},
  author  = {Lingxue Zhu and Nikolay Pavlovich Laptev},
  journal = {2017 IEEE International Conference on Data Mining Workshops (ICDMW)},
  year    = {2017},
  pages   = {103-110}
}
```

Reliable uncertainty estimation for time series prediction is critical in many fields, including physics, biology, and manufacturing. At Uber, probabilistic time series forecasting is used for robust prediction of the number of trips during special events, for driver incentive allocation, and for real-time anomaly detection across millions of metrics. Classical time series models are often used in conjunction with a probabilistic formulation for uncertainty estimation. However, such models are hard…

## 257 Citations

### MOrdReD: Memory-based Ordinal Regression Deep Neural Networks for Time Series Forecasting

- Computer Science, ArXiv
- 2018

This work proposes a novel, end-to-end deep learning method for time series forecasting that not only produces accurate forecasts closely tracking true future values, but also allows inference of valuable information, such as the predictive distribution of the occurrence of critical events of interest, accurately and reliably even over long time horizons.

### DeepPIPE: A distribution-free uncertainty quantification approach for time series forecasting

- Computer Science, Neurocomputing
- 2020

### Conformal Time-series Forecasting

- Computer Science, NeurIPS
- 2021

This work extends the inductive conformal prediction framework to the time-series forecasting setup and proposes a lightweight uncertainty estimation procedure that addresses the above limitations, providing uncertainty intervals with theoretical guarantees on frequentist coverage for any multi-horizon forecast predictor and any dataset.
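The split (inductive) conformal recipe that this line of work builds on can be sketched in a few lines: calibrate on held-out absolute residuals, take the appropriate empirical quantile, and widen the point forecast by that amount. A minimal numpy toy, where the Gaussian data and the trivial mean-predictor `predict` are illustrative stand-ins for a real forecaster:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data; any point predictor works for split conformal,
# so we use a deliberately trivial one: the calibration-set mean.
y = rng.normal(loc=10.0, scale=2.0, size=500)
y_cal, y_test = y[:400], y[400:]

def predict(_x=None):
    return y_cal.mean()  # stand-in point forecast

# Calibration: nonconformity scores are absolute residuals
# on the held-out calibration set.
alpha = 0.1                                   # target 90% coverage
scores = np.abs(y_cal - predict())
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))
q = np.sort(scores)[k - 1]                    # conformal quantile

# Interval for a new point: [prediction - q, prediction + q].
lo, hi = predict() - q, predict() + q
coverage = np.mean((y_test >= lo) & (y_test <= hi))
```

The frequentist coverage guarantee holds for any predictor and any exchangeable data; the multi-horizon extension described above applies this idea per forecast horizon.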

### Quantifying Uncertainty in Deep Spatiotemporal Forecasting

- Computer Science, KDD
- 2021

This paper describes two types of spatiotemporal forecasting problems, regular grid-based and graph-based, and analyzes UQ methods from both the Bayesian and the frequentist point of view, casting them in a unified framework via statistical decision theory.

### Probabilistic forecasting for energy time series considering uncertainties based on deep learning algorithms

- Computer Science, Electric Power Systems Research
- 2021

### Deep Extreme Mixture Model for Time Series Forecasting

- Computer Science, CIKM
- 2022

This work develops a novel Deep eXtreme Mixture Model for univariate time series forecasting, which addresses extreme events in time series while remaining comparable with existing baseline methods for normal time-step forecasting.

### Applying SVGD to Bayesian Neural Networks for Cyclical Time-Series Prediction and Inference

- Computer Science, ArXiv
- 2019

A regression-based BNN model is proposed to predict spatiotemporal quantities like hourly rider demand with calibrated uncertainties, capable of producing time series predictions as well as measures of uncertainty surrounding the predictions.

### Fast Memory-efficient Extreme Events Prediction in Complex Time series

- Computer Science
- 2020

This paper proposes a generic memory-efficient framework for real-time stochastic extreme events prediction in complex time series systems such as intrusion detection, Internet of Things (IoT), social…

### Addressing model uncertainty in probabilistic forecasting using Monte Carlo dropout

- Computer Science
- 2020

This work proposes addressing the model uncertainty problem using Monte Carlo dropout, a variational approach that assigns distributions to the weights of a neural network instead of simply using fixed values.
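The Monte Carlo dropout procedure referenced here (and central to the Uber paper itself) is simple to state: keep dropout active at prediction time, run several stochastic forward passes, and read the model uncertainty off the spread of the outputs. A minimal numpy sketch, where the random fixed weights are an illustrative stand-in for a trained network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer network; these fixed random weights stand in
# for a network that has already been trained with dropout.
W1 = rng.normal(size=(1, 32))
b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1))
b2 = np.zeros(1)

def forward(x, p_drop=0.2):
    """One stochastic forward pass with dropout kept ON at test time."""
    h = np.maximum(x @ W1 + b1, 0.0)        # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop     # fresh Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)           # inverted-dropout scaling
    return h @ W2 + b2

def mc_dropout_predict(x, T=200):
    """T stochastic passes -> predictive mean and model-uncertainty std."""
    samples = np.stack([forward(x) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

x = np.array([[0.5]])
mean, std = mc_dropout_predict(x)
```

Because each pass samples a different dropout mask, the T outputs approximate draws from the model's predictive distribution, and `std` grows where the network is less certain.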

### Uncertainty Quantification for Traffic Forecasting: A Unified Approach

- Computer Science, ArXiv
- 2022

This work proposes Deep Spatio-Temporal Uncertainty Quantification (DeepSTUQ), which can estimate both aleatoric and epistemic uncertainty, and combines the merits of variational inference and deep ensembling by integrating the Monte Carlo dropout and Adaptive Weight Averaging re-training methods.

## References

Showing 1–10 of 23 references.

### Time-series Extreme Event Forecasting with Neural Networks at Uber

- Computer Science
- 2017

A novel end-to-end recurrent neural network architecture is proposed that outperforms current state-of-the-art event forecasting methods on Uber data and generalizes well to the public M3 dataset used for time-series forecasting competitions.

### Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

- Computer Science, ICML
- 2016

A new theoretical framework is developed casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.

### Concrete Dropout

- Computer Science, NIPS
- 2017

This work proposes a new dropout variant which gives improved performance and better calibrated uncertainties, and uses a continuous relaxation of dropout’s discrete masks to allow for automatic tuning of the dropout probability in large models, and as a result faster experimentation cycles.

### Long Short-Term Memory

- Computer Science, Neural Computation
- 1997

A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.

### Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks

- Computer Science, ICML
- 2015

This work presents a novel scalable method for learning Bayesian neural networks, called probabilistic backpropagation (PBP), which works by computing a forward propagation of probabilities through the network and then doing a backward computation of gradients.

### A new boosting algorithm for improved time-series forecasting with recurrent neural networks

- Computer Science, Information Fusion
- 2008

### Bayesian Recurrent Neural Networks

- Computer Science, ArXiv
- 2017

This work shows that a simple adaptation of truncated backpropagation through time can yield good quality uncertainty estimates and superior regularisation at only a small extra computational cost during training, and demonstrates how a novel kind of posterior approximation yields further improvements to the performance of Bayesian RNNs.

### A Theoretically Grounded Application of Dropout in Recurrent Neural Networks

- Computer Science, NIPS
- 2016

This work applies a new variational-inference-based dropout technique in LSTM and GRU models, which outperforms existing techniques and, to the best of the authors' knowledge, improves on the single-model state of the art in language modelling on the Penn Treebank.

### Nonlinear Systems Identification Using Deep Dynamic Neural Networks

- Computer Science, ArXiv
- 2016

It is demonstrated that deep neural networks are effective at estimating models, and the associated characteristics of the underlying dynamical systems, from input-output data.

### Auto-Encoding Variational Bayes

- Computer Science, ICLR
- 2014

A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.