• Corpus ID: 244729652

NeuralProphet: Explainable Forecasting at Scale

@article{Triebe2021NeuralProphetEF,
  title={NeuralProphet: Explainable Forecasting at Scale},
  author={Oskar Triebe and Hansika Hewamalage and Polina Pilyugina and Nikolay Pavlovich Laptev and Christoph Bergmeir and Ram Rajagopal},
  journal={ArXiv},
  year={2021},
  volume={abs/2111.15397}
}
We introduce NeuralProphet, a successor to Facebook Prophet, which set an industry standard for explainable, scalable, and user-friendly forecasting frameworks. With the proliferation of time series data, explainable forecasting remains a challenging task for business and operational decision making. Hybrid solutions are needed to bridge the gap between interpretable classical methods and scalable deep learning models. We view Prophet as a precursor to such a solution. However, Prophet lacks… 
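For context, a minimal usage sketch of the open-source neuralprophet Python package described in the paper. This assumes the package is installed from PyPI and uses a toy daily series; it is not drawn from the abstract itself.

```python
# Minimal NeuralProphet usage sketch (assumes `pip install neuralprophet`).
# The input follows the Prophet convention: a "ds" datestamp column and a "y" value column.
import pandas as pd
from neuralprophet import NeuralProphet

df = pd.DataFrame({
    "ds": pd.date_range("2023-01-01", periods=200, freq="D"),  # toy daily timestamps
    "y": range(200),                                            # toy target values
})

m = NeuralProphet()                                 # default model: trend + seasonality components
metrics = m.fit(df, freq="D")                       # returns a DataFrame of training metrics
future = m.make_future_dataframe(df, periods=30)    # extend 30 days past the observed history
forecast = m.predict(future)                        # forecast with interpretable components
```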

Learning Fast and Slow for Online Time Series Forecasting

Fast and Slow learning Networks (FSNet) is proposed, a holistic framework for online time-series forecasting that simultaneously deals with abrupt changes and repeating patterns, improving a slowly-learned backbone by dynamically balancing fast adaptation to recent changes with retrieval of similar old knowledge.

Forecasting Future World Events with Neural Networks

Autocast is introduced, a dataset containing thousands of forecasting questions and an accompanying news corpus; it poses a novel challenge for large language models, and improved performance could bring large practical benefits.

Long-Term Missing Value Imputation for Time Series Data Using Deep Neural Networks

This work presents an approach that uses a deep learning model, in particular, a MultiLayer Perceptron (MLP), for estimating the missing values of a variable in multivariate time series data and indicates that using an MLP for filling a large gap leads to better results, especially when the data behave nonlinearly.
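A hedged sketch of the general idea, not the paper's exact architecture: train an MLP on time steps where the target is observed, using the other variables as inputs, then fill a long gap with the MLP's predictions. Variable names and the scikit-learn model choice are illustrative assumptions.

```python
# Illustrative MLP-based gap filling for a multivariate series (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(1000)
x1 = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)   # covariate 1
x2 = np.cos(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)   # covariate 2
y = 2.0 * x1 - 0.5 * x2 ** 2 + 0.1 * rng.normal(size=t.size)      # target, nonlinear in the covariates

missing = (t >= 600) & (t < 800)          # a long contiguous gap of 200 steps
X = np.column_stack([x1, x2])

mlp = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
mlp.fit(X[~missing], y[~missing])         # learn only from the observed portion
y_imputed = y.copy()
y_imputed[missing] = mlp.predict(X[missing])   # fill the gap with model estimates
```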

Real-Time Massive MIMO Channel Prediction: A Combination of Deep Learning and NeuralProphet

This paper uses NeuralProphet (NP), a recently introduced time-series model, composed of statistical components, e.g., auto-regression (AR) and Fourier terms, for CSI prediction, and develops a novel hybrid framework comprising RNN and NP to achieve better prediction accuracy.
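One common way to build such a hybrid, shown here purely as an illustration (the paper's exact RNN+NP framework may differ), is to fit NeuralProphet first and then train a small recurrent network on its residuals. The synthetic series, GRU size, and epoch counts below are assumptions.

```python
# Illustrative residual-hybrid sketch: NeuralProphet models the structured part,
# a GRU models what NP leaves unexplained, and the final prediction sums the two.
import numpy as np
import pandas as pd
import torch
import torch.nn as nn
from neuralprophet import NeuralProphet

# Synthetic stand-in for a channel-state magnitude series sampled hourly.
ds = pd.date_range("2023-01-01", periods=500, freq="H")
y = np.sin(np.arange(500) * 0.2) + 0.05 * np.random.randn(500)
df = pd.DataFrame({"ds": ds, "y": y})

np_model = NeuralProphet(epochs=10)
np_model.fit(df, freq="H")
resid = df["y"].values - np_model.predict(df)["yhat1"].values   # what NP did not explain

# Build (lag window -> next residual) training pairs for the GRU.
LAGS = 24
X = np.stack([resid[i:i + LAGS] for i in range(len(resid) - LAGS)])
t = resid[LAGS:]
X = torch.tensor(X, dtype=torch.float32).unsqueeze(-1)   # (N, LAGS, 1)
t = torch.tensor(t, dtype=torch.float32).unsqueeze(-1)   # (N, 1)

class ResidualGRU(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.gru = nn.GRU(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):
        out, _ = self.gru(x)
        return self.head(out[:, -1])

rnn = ResidualGRU()
opt = torch.optim.Adam(rnn.parameters(), lr=1e-3)
for _ in range(50):
    opt.zero_grad()
    loss = nn.functional.mse_loss(rnn(X), t)
    loss.backward()
    opt.step()
# Final hybrid prediction = NP component forecast + GRU residual correction.
```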

Short-Term Traffic Prediction Using Fb-PROPHET and Neural-PROPHET

Short-term traffic prediction (STTP) models were developed to predict traffic volume using Fb-PROPHET and Neural-PROPHET, helping traffic management agencies plan and assign routes to avoid congestion.

Faster and Cheaper Energy Demand Forecasting at Scale

  • Computer Science
  • 2022
Transplit, a new lightweight transformer-based model, is introduced; it significantly decreases the cost of energy demand forecasting at scale by exploiting the seasonality property and learning typical days of power demand.

Forecasting of Electric Load Using a Hybrid LSTM-Neural Prophet Model

Load forecasting (LF) is an essential factor in power system management. LF helps the utility maximize the utilization of power-generating plants and schedule them both reliably and economically…

Experimental study of time series forecasting methods for groundwater level prediction

Groundwater level prediction is an applied time series forecasting task with important social impact, helping to optimize water management as well as to prevent natural disasters such as floods…

Homicide forecasting for the state of Guanajuato using LSTM and geospatial information

The results show that the phenomenon is related to the spatial context and encourage the use of geospatial information in forecasting models.

Impact of Clustering Methods on Machine Learning-based Solar Power Prediction Models

It is demonstrated that proper tuning of thresholds for the clearness index improves prediction accuracy by 20.19%, but still performs worse than K-means clustering with all weather features as input, where both Euclidean distance and dynamic time warping are used.

References

SHOWING 1-10 OF 24 REFERENCES

AR-Net: A simple Auto-Regressive Neural Network for time-series

A new framework for time-series modeling that combines the best of traditional statistical models and neural networks is presented, and it is shown that AR-Net is as interpretable as Classic-AR but also scales to long-range dependencies.
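A minimal sketch of the AR-Net idea, not the paper's full implementation: a classic AR(p) model expressed as a single linear layer whose weights are the interpretable AR coefficients, trained by SGD like a neural network. The simulated process, learning rate, and iteration counts are illustrative choices; the paper's sparsity regularization is omitted.

```python
# AR(p) as a bias-free linear layer trained with SGD; learned weights ~ AR coefficients.
import numpy as np
import torch
import torch.nn as nn

p = 3
rng = np.random.default_rng(0)
series = np.zeros(2000)
for i in range(p, 2000):                        # simulate a stationary AR(3) process
    series[i] = 0.5 * series[i-1] - 0.3 * series[i-2] + 0.2 * series[i-3] + rng.normal(scale=1.0)

X = np.stack([series[i:i + p] for i in range(len(series) - p)])   # lag windows
y = series[p:]                                                     # next value
X = torch.tensor(X, dtype=torch.float32)
y = torch.tensor(y, dtype=torch.float32).unsqueeze(-1)

ar_net = nn.Linear(p, 1, bias=False)            # weights are the interpretable AR coefficients
opt = torch.optim.SGD(ar_net.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(ar_net(X), y)
    loss.backward()
    opt.step()
print(ar_net.weight.detach().numpy())           # should approach [0.2, -0.3, 0.5] (lag-3, lag-2, lag-1 order)
```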

Cyclical Learning Rates for Training Neural Networks

  • L. Smith
  • Computer Science
    2017 IEEE Winter Conference on Applications of Computer Vision (WACV)
  • 2017
A new method for setting the learning rate, named cyclical learning rates, is described, which practically eliminates the need to experimentally find the best values and schedule for the global learning rates.
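PyTorch exposes this policy as the built-in CyclicLR scheduler; a minimal wiring sketch, where the model, optimizer, and bound values are placeholders rather than settings from the paper.

```python
# Cyclical learning rate: oscillate the LR between base_lr and max_lr every cycle.
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
scheduler = torch.optim.lr_scheduler.CyclicLR(
    optimizer, base_lr=1e-4, max_lr=1e-1,   # LR bounds of the cycle
    step_size_up=2000, mode="triangular",   # half-cycle length in iterations
)
# Inside the training loop, call per batch: optimizer.step(); scheduler.step()
```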

Forecasting at Scale

A practical approach to forecasting “at scale” that combines configurable models with analyst-in-the-loop performance analysis, and a modular regression model with interpretable parameters that can be intuitively adjusted by analysts with domain knowledge about the time series are described.
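The model described here is released as the open-source Prophet library; a minimal usage sketch, assuming the `prophet` package (formerly `fbprophet`) is installed and using a toy daily series.

```python
# Minimal Prophet usage sketch with the standard "ds"/"y" input convention.
import pandas as pd
from prophet import Prophet

df = pd.DataFrame({
    "ds": pd.date_range("2023-01-01", periods=365, freq="D"),
    "y": range(365),
})
m = Prophet()                                   # piecewise trend + seasonalities (+ optional holidays)
m.fit(df)
future = m.make_future_dataframe(periods=30)    # 30 days beyond the observed history
forecast = m.predict(future)                    # includes interpretable trend/seasonality components
```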

Super-convergence: very fast training of neural networks using large learning rates

A phenomenon is described, where neural networks can be trained an order of magnitude faster than with standard training methods, and it is shown that super-convergence provides a greater boost in performance relative to standard training when the amount of labeled training data is limited.
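Super-convergence is typically realized with a single large learning-rate cycle ("1cycle"), which PyTorch exposes as OneCycleLR; a minimal wiring sketch with placeholder model and step count.

```python
# One large LR cycle over the whole run; the unusually high peak LR is the key ingredient.
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=1.0,      # peak learning rate
    total_steps=10_000,         # one cycle spanning the entire training run
)
# Per batch: optimizer.step(); scheduler.step()
```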

Decoupled Weight Decay Regularization

This work proposes a simple modification to recover the original formulation of weight decay regularization by decoupling the weight decay from the optimization steps taken w.r.t. the loss function, and provides empirical evidence that this modification substantially improves Adam's generalization performance.
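The decoupled formulation is what PyTorch's AdamW optimizer implements; switching from Adam is a one-line change (the model below is a placeholder).

```python
# AdamW: weight decay is applied directly to the parameters, not folded into the gradient.
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
```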

Monash Time Series Forecasting Archive

This paper presents a comprehensive forecasting archive containing 25 publicly available time series datasets from varied domains, with different characteristics in terms of frequency, series lengths, and inclusion of missing values, for the benefit of researchers using the archive to benchmark their forecasting algorithms.

Statistical and Machine Learning forecasting methods: Concerns and ways forward

It is found that the post-sample accuracy of popular ML methods is dominated by that of statistical methods across both accuracy measures used and for all forecasting horizons examined, and that their computational requirements are considerably greater than those of statistical methods.

PyTorch: An Imperative Style, High-Performance Deep Learning Library

This paper details the principles that drove the implementation of PyTorch and how they are reflected in its architecture, and explains how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance.

Out-of-sample tests of forecasting accuracy: an analysis and review

Structural Time Series Models

1 Trend and Cycle Decomposition: $y_t = \tau_t + \psi_t$, where $y_t$ is an $n \times 1$ vector and $\tau_t$ and $\psi_t$ represent the trend and cycle components, respectively. This decomposition into components is not unique. Beveridge and Nelson…
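For reference, one standard fully specified structural form is the local linear trend plus cycle model; this is a textbook specification shown for illustration, not necessarily the exact model this excerpt goes on to define.

```latex
\begin{aligned}
y_t     &= \tau_t + \psi_t + \varepsilon_t, & \varepsilon_t &\sim \mathrm{N}(0, \sigma_\varepsilon^2) && \text{(observation = trend + cycle + noise)}\\
\tau_t  &= \tau_{t-1} + \beta_{t-1} + \eta_t, & \eta_t &\sim \mathrm{N}(0, \sigma_\eta^2) && \text{(stochastic level)}\\
\beta_t &= \beta_{t-1} + \zeta_t, & \zeta_t &\sim \mathrm{N}(0, \sigma_\zeta^2) && \text{(stochastic slope)}
\end{aligned}
```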