Corpus ID: 212657585

Prediction of Bayesian Intervals for Tropical Storms

@article{Chiswick2020PredictionOB,
  title={Prediction of Bayesian Intervals for Tropical Storms},
  author={Max Chiswick and Sam Ganzfried},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.05024}
}
Building on recent research for prediction of hurricane trajectories using recurrent neural networks (RNNs), we have developed improved methods and generalized the approach to predict Bayesian intervals in addition to simple point estimates. Tropical storms are capable of causing severe damage, so accurately predicting their trajectories can bring significant benefits to cities and lives, especially as they grow more intense due to climate change effects. By implementing the Bayesian interval… 
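The abstract's central idea — augmenting RNN point forecasts with Bayesian intervals, using dropout kept active at inference as approximate Bayesian inference (the Gal & Ghahramani line of work cited in the references) — can be sketched minimally. The sketch below is a hypothetical stand-in: a toy feed-forward model with random weights replaces the paper's trained RNN, and all names, shapes, and inputs are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_forward(x, W1, W2, p=0.5):
    """One forward pass with dropout left ON at inference (MC dropout)."""
    h = np.tanh(x @ W1)
    mask = rng.random(h.shape) > p       # drop hidden units with probability p
    h = h * mask / (1.0 - p)             # inverted-dropout rescaling
    return h @ W2

def mc_dropout_interval(x, W1, W2, n_samples=500, alpha=0.1):
    """Approximate a (1 - alpha) credible interval from stochastic passes."""
    preds = np.array([stochastic_forward(x, W1, W2) for _ in range(n_samples)])
    lo, hi = np.percentile(preds, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
    return preds.mean(axis=0), (lo, hi)

# Toy random weights standing in for a trained trajectory model (hypothetical).
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 2))            # predict a (latitude, longitude) offset
x = np.array([25.0, -71.0, 85.0, 950.0]) # lat, lon, wind speed, pressure

mean, (lo, hi) = mc_dropout_interval(x, W1, W2)
```

Repeating the forward pass with fresh dropout masks yields a sample of predictions; the mean recovers the usual point estimate, while empirical percentiles give the interval — which is what distinguishes this approach from a single deterministic forecast.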


References

Predicting Hurricane Trajectories using a Recurrent Neural Network
Uses the latitude, longitude, wind speed, and pressure data publicly provided by the National Hurricane Center to predict a hurricane's trajectory at 6-hour intervals, employing an RNN over a fine grid to reduce typical truncation errors.
Bayesian Recurrent Neural Network Models for Forecasting and Quantifying Uncertainty in Spatial-Temporal Data
Presents a Bayesian RNN model for nonlinear spatio-temporal forecasting that quantifies uncertainty in a formal framework while maintaining the forecast accuracy that makes these models appealing.
A Theoretically Grounded Application of Dropout in Recurrent Neural Networks
Applies a new variational-inference-based dropout technique to LSTM and GRU models that outperforms existing techniques and improves on the single-model state of the art in language modelling on the Penn Treebank.
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Develops a theoretical framework casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, mitigating the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
Dropout: a simple way to prevent neural networks from overfitting
Shows that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification, and computational biology, obtaining state-of-the-art results on many benchmark data sets.
Uncertainty in Deep Learning
Develops tools to obtain practical uncertainty estimates in deep learning, casting recent deep learning tools as Bayesian models without changing either the models or the optimisation, and develops the theory underlying such tools.
Those hurricane maps don't mean what you think they mean. The New York Times, 2019.
Global warming and hurricanes: An overview of current research results. 2019.
SHIPS: Statistical tropical cyclone intensity forecast technique development.
Hurricane Sandy rebuilding strategy. 2013.
...