Corpus ID: 235358468

Accelerating Stochastic Simulation with Interactive Neural Processes

@article{Wu2021AcceleratingSS,
  title={Accelerating Stochastic Simulation with Interactive Neural Processes},
  author={Dongxia Wu and Matteo Chinazzi and Alessandro Vespignani and Yi-An Ma and Rose Yu},
  journal={ArXiv},
  year={2021},
  volume={abs/2106.02770}
}
Stochastic simulations such as large-scale, spatiotemporal, age-structured epidemic models are computationally expensive at fine-grained resolution. We propose Interactive Neural Process (INP), an interactive framework to continuously learn a deep learning surrogate model and accelerate simulation. Our framework is based on the novel integration of Bayesian active learning, stochastic simulation and deep sequence modeling. In particular, we develop a novel spatiotemporal neural process model to…
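The interactive loop the abstract describes is easy to picture in code. Below is a minimal sketch of one plausible reading of it, assuming a generic stochastic `simulator(theta)` and a placeholder `fit_surrogate` that returns a predictive mean and standard deviation; all names are illustrative, and the crude nearest-neighbour surrogate merely stands in for the paper's spatiotemporal neural process.

```python
import numpy as np

def simulator(theta, T=20, rng=None):
    """Hypothetical stochastic simulator mapping parameters to a noisy
    trajectory; stands in for an expensive epidemic simulation."""
    rng = np.random.default_rng() if rng is None else rng
    t = np.arange(T)
    return theta[0] * np.exp(-theta[1] * t) + 0.05 * rng.standard_normal(T)

def fit_surrogate(X, Y):
    """Placeholder for training a (spatiotemporal) neural process on
    (theta, trajectory) pairs. Here: a k-nearest-neighbour ensemble that
    returns a predictive mean and standard deviation per time step."""
    def predict(theta, k=3):
        d = np.linalg.norm(X - theta, axis=1)
        nn = Y[np.argsort(d)[:k]]
        return nn.mean(axis=0), nn.std(axis=0) + 1e-6
    return predict

rng = np.random.default_rng(0)
X = rng.uniform(0.5, 2.0, size=(5, 2))              # seed design
Y = np.stack([simulator(th, rng=rng) for th in X])  # expensive runs

for _ in range(10):
    surrogate = fit_surrogate(X, Y)
    # Bayesian active learning step: score candidate parameters by the
    # surrogate's predictive uncertainty, simulate only the least certain
    # one, then retrain the surrogate on the enlarged dataset.
    candidates = rng.uniform(0.5, 2.0, size=(64, 2))
    scores = [surrogate(c)[1].mean() for c in candidates]
    theta_next = candidates[int(np.argmax(scores))]
    X = np.vstack([X, theta_next])
    Y = np.vstack([Y, simulator(theta_next, rng=rng)])

print(f"acquired {len(X)} simulator runs")
```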


References

SHOWING 1-10 OF 65 REFERENCES
Sequential Neural Processes
TLDR
This paper proposes Sequential Neural Processes (SNP), which incorporate a temporal state-transition model of stochastic processes and thereby extend the modeling capabilities of neural processes to dynamic stochastic processes; it also introduces Temporal Generative Query Networks.
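A minimal sketch of the state-transition idea summarized above, with a GRU cell carrying the latent across time steps so that each prediction depends on past contexts as well as the current one. Module names and shapes are assumptions for illustration, not the SNP reference implementation.

```python
import torch
import torch.nn as nn

class TinySNP(nn.Module):
    """Toy sequential NP: a GRU cell transitions the latent z_t over time,
    and a decoder maps (z_t, x_target) to a prediction."""
    def __init__(self, x_dim=1, y_dim=1, z_dim=16):
        super().__init__()
        self.encoder = nn.Linear(x_dim + y_dim, z_dim)  # per-point context encoder
        self.transition = nn.GRUCell(z_dim, z_dim)      # temporal state transition
        self.decoder = nn.Linear(z_dim + x_dim, y_dim)  # (z_t, x_target) -> y

    def forward(self, contexts, targets):
        # contexts: list over time of (ctx_x, ctx_y); targets: list of tgt_x
        z = torch.zeros(1, self.transition.hidden_size)
        preds = []
        for (cx, cy), tx in zip(contexts, targets):
            r = self.encoder(torch.cat([cx, cy], dim=-1)).mean(0, keepdim=True)  # aggregate set
            z = self.transition(r, z)                                            # z_{t-1} -> z_t
            preds.append(self.decoder(torch.cat([z.expand(tx.size(0), -1), tx], dim=-1)))
        return preds

model = TinySNP()
contexts = [(torch.randn(5, 1), torch.randn(5, 1)) for _ in range(3)]
targets = [torch.randn(4, 1) for _ in range(3)]
print([p.shape for p in model(contexts, targets)])  # 3 predictions of shape (4, 1)
```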
Neural Processes
TLDR
This work introduces a class of neural latent variable models called Neural Processes (NPs), which combine the strengths of neural networks and Gaussian processes: like GPs they are probabilistic, data-efficient, and flexible, but they avoid the computational cost that limits GPs' applicability.
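The NP pipeline (encode context points, aggregate them permutation-invariantly, sample a latent, decode at target inputs) fits in a few lines. A toy sketch with illustrative dimensions, not the paper's architecture:

```python
import torch
import torch.nn as nn

class TinyNP(nn.Module):
    """Toy Neural Process: encode context (x, y) pairs, aggregate to a latent
    distribution, sample z, and decode (z, x_target) into predictions."""
    def __init__(self, x_dim=1, y_dim=1, r_dim=32, z_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim + y_dim, r_dim), nn.ReLU(),
                                     nn.Linear(r_dim, r_dim))
        self.to_mu = nn.Linear(r_dim, z_dim)
        self.to_logvar = nn.Linear(r_dim, z_dim)
        self.decoder = nn.Sequential(nn.Linear(z_dim + x_dim, r_dim), nn.ReLU(),
                                     nn.Linear(r_dim, y_dim))

    def forward(self, ctx_x, ctx_y, tgt_x):
        # Mean over context points makes the model permutation-invariant.
        r = self.encoder(torch.cat([ctx_x, ctx_y], dim=-1)).mean(0)
        mu, logvar = self.to_mu(r), self.to_logvar(r)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterized sample
        z = z.unsqueeze(0).expand(tgt_x.size(0), -1)
        return self.decoder(torch.cat([z, tgt_x], dim=-1))

np_model = TinyNP()
pred = np_model(torch.randn(8, 1), torch.randn(8, 1),
                torch.linspace(-1, 1, 50).unsqueeze(-1))
print(pred.shape)  # (50, 1)
```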
The Functional Neural Process
TLDR
This paper presents a new family of exchangeable stochastic processes, the Functional Neural Processes (FNPs), demonstrates that they scale to large datasets through mini-batch optimization, and describes how they make predictions for new points via their posterior predictive distribution.
Auto-Encoding Variational Bayes
TLDR
A stochastic variational inference and learning algorithm is introduced that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case.
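The key mechanism is the reparameterization trick, which makes the sampling step differentiable so that encoder and decoder can be trained jointly by stochastic gradient descent on the ELBO. A minimal sketch with illustrative sizes:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyVAE(nn.Module):
    """Minimal VAE: z = mu + eps * sigma (reparameterization) keeps the
    sampling step differentiable end to end."""
    def __init__(self, x_dim=784, h_dim=128, z_dim=8):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = torch.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        return self.dec(z), mu, logvar

def neg_elbo(x, logits, mu, logvar):
    # Reconstruction term plus KL(q(z|x) || N(0, I)), both in closed form.
    rec = F.binary_cross_entropy_with_logits(logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl

x = torch.rand(16, 784)  # stand-in for a minibatch of flattened images
model = TinyVAE()
logits, mu, logvar = model(x)
print(neg_elbo(x, logits, mu, logvar).item())
```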
Scalable Bayesian Optimization Using Deep Neural Networks
TLDR
This work shows that adaptive basis function regression with a neural network as the parametric form performs competitively with state-of-the-art GP-based approaches, but scales linearly with the number of data points rather than cubically, allowing a previously intractable degree of parallelism.
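The linear scaling comes from treating the network's hidden layer as basis functions and fitting a Bayesian linear model on top. A sketch of that idea, with an untrained random-feature "network" standing in for the learned basis (the paper trains the network first); all values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy 1-D objective observations.
X = rng.uniform(-3, 3, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)

# Untrained random features phi(x) stand in for the learned basis.
W1, b1 = rng.standard_normal((1, 50)), rng.standard_normal(50)
phi = lambda x: np.tanh(x @ W1 + b1)

# Bayesian linear regression on the features: cost is linear in the number
# of points n (Phi is n x d), unlike the O(n^3) of an exact GP.
alpha, beta = 1.0, 100.0            # prior precision, noise precision
Phi = phi(X)                        # (n, d)
A = beta * Phi.T @ Phi + alpha * np.eye(Phi.shape[1])
m = beta * np.linalg.solve(A, Phi.T @ y)

def predict(x_new):
    p = phi(x_new)
    mean = p @ m
    var = 1.0 / beta + np.einsum("nd,nd->n", p @ np.linalg.inv(A), p)
    return mean, var

mean, var = predict(np.linspace(-3, 3, 5)[:, None])
print(np.round(mean, 2), np.round(np.sqrt(var), 2))
```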
Deep learning to represent subgrid processes in climate models
TLDR
A deep neural network is trained to represent all atmospheric subgrid processes in a climate model by learning from a multiscale model in which convection is treated explicitly, and the results show the feasibility of using deep learning for climate model parameterization.
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
TLDR
A new theoretical framework casts dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, mitigating the problem of representing uncertainty in deep learning without sacrificing computational complexity or test accuracy.
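Operationally the recipe is simple: leave dropout active at prediction time and treat repeated stochastic forward passes as approximate posterior samples. A sketch with an untrained, illustrative network (so the numbers are meaningless, but the mechanics are the point):

```python
import torch
import torch.nn as nn

# A network with dropout layers; under the paper's view, each stochastic
# forward pass with dropout on is one approximate posterior sample.
net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.1),
                    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.1),
                    nn.Linear(64, 1))

def mc_dropout_predict(net, x, n_samples=100):
    net.train()  # keep dropout active (net.eval() would disable it)
    with torch.no_grad():
        samples = torch.stack([net(x) for _ in range(n_samples)])
    return samples.mean(0), samples.std(0)  # predictive mean and uncertainty

x = torch.linspace(-2, 2, 10).unsqueeze(-1)
mean, std = mc_dropout_predict(net, x)
print(mean.squeeze(-1), std.squeeze(-1))
```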
Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles
TLDR
This work proposes an alternative to Bayesian NNs that is simple to implement, readily parallelizable, requires very little hyperparameter tuning, and yields high-quality predictive uncertainty estimates.
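A sketch of the basic recipe: train several independently initialized networks and read uncertainty off their disagreement. Note the paper trains each member with a proper scoring rule (heteroscedastic Gaussian NLL) and optionally adversarial examples; the plain MSE ensemble below only illustrates the mechanics.

```python
import torch
import torch.nn as nn

def make_net():
    return nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))

# Toy 1-D regression data.
x = torch.linspace(-2, 2, 100).unsqueeze(-1)
y = x.sin() + 0.1 * torch.randn_like(x)

# Train M independently initialized networks; random initialization (plus
# SGD noise) is the only source of diversity, which keeps the recipe simple.
ensemble = []
for _ in range(5):
    net = make_net()
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    for _ in range(200):
        opt.zero_grad()
        nn.functional.mse_loss(net(x), y).backward()
        opt.step()
    ensemble.append(net)

with torch.no_grad():
    preds = torch.stack([net(x) for net in ensemble])
mean, std = preds.mean(0), preds.std(0)  # disagreement ~ predictive uncertainty
print(float(std.mean()))
```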
Inferring high-resolution human mixing patterns for disease modeling
TLDR
A data-driven approach generates effective population-level contact matrices for modeling the spread of airborne infectious diseases, using highly detailed macro (census) and micro (survey) data on key socio-demographic features.
A Tutorial on Bayesian Optimization
TLDR
This tutorial describes how Bayesian optimization works, covering Gaussian process regression and three common acquisition functions: expected improvement, entropy search, and knowledge gradient. It also generalizes expected improvement to noisy evaluations, beyond the noise-free setting where it is more commonly applied.
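Of the three acquisition functions, expected improvement has a closed form under a Gaussian posterior. A small sketch for minimization, with made-up posterior values standing in for GP regression output:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_y):
    """Closed-form EI for minimization given a Gaussian posterior at each
    candidate: EI(x) = E[max(best_y - f(x), 0)]."""
    sigma = np.maximum(sigma, 1e-12)
    z = (best_y - mu) / sigma
    return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Illustrative posterior over 5 candidates (stand-in for a fitted GP).
mu = np.array([0.2, 0.0, -0.1, 0.3, 0.05])
sigma = np.array([0.05, 0.2, 0.01, 0.3, 0.1])
best_y = 0.0  # best observation so far

ei = expected_improvement(mu, sigma, best_y)
print("next evaluation at candidate", int(np.argmax(ei)))
```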