# Accelerating Stochastic Simulation with Interactive Neural Processes

@article{Wu2021AcceleratingSS, title={Accelerating Stochastic Simulation with Interactive Neural Processes}, author={Dongxian Wu and Matteo Chinazzi and Alessandro Vespignani and Yi-An Ma and Rose Yu}, journal={ArXiv}, year={2021}, volume={abs/2106.02770} }

Stochastic simulations such as large-scale, spatiotemporal, age-structured epidemic models are computationally expensive at fine-grained resolution. We propose Interactive Neural Process (INP), an interactive framework to continuously learn a deep learning surrogate model and accelerate simulation. Our framework is based on the novel integration of Bayesian active learning, stochastic simulation and deep sequence modeling. In particular, we develop a novel spatiotemporal neural process model to…

## References

Showing references 1-10 of 65.

Sequential Neural Processes

- Computer Science, Mathematics
- NeurIPS
- 2019

This paper proposes Sequential Neural Processes (SNP), which incorporate a temporal state-transition model of stochastic processes, thus extending their modeling capabilities to dynamic stochastic processes, and introduces Temporal Generative Query Networks.

Neural Processes

- Computer Science, Mathematics
- ArXiv
- 2018

This work introduces a class of neural latent variable models called Neural Processes (NPs), combining the best of neural networks and Gaussian processes: GPs are probabilistic, data-efficient, and flexible, but they are also computationally intensive and thus limited in their applicability.
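
The encode-aggregate-decode structure behind NPs can be illustrated with a toy sketch in pure Python. The feature maps here are hypothetical fixed functions standing in for trained networks: each context point is encoded, the encodings are pooled with a permutation-invariant mean, and a decoder maps the pooled representation plus a target input to a predictive mean and variance.

```python
import math

def encode(x, y):
    # toy per-point encoder (hypothetical fixed features, not a trained net)
    return [math.tanh(x + y), math.tanh(x - y), x * y]

def aggregate(reps):
    # permutation-invariant mean pooling over the context set
    dim = len(reps[0])
    return [sum(r[i] for r in reps) / len(reps) for i in range(dim)]

def decode(r, x_target):
    # toy decoder: map (representation, target input) to mean and variance
    mean = sum(r) + 0.1 * x_target
    var = math.exp(-abs(x_target))  # strictly positive by construction
    return mean, var

context = [(0.0, 1.0), (0.5, 0.8), (1.0, 0.2)]
r = aggregate([encode(x, y) for x, y in context])
mu, var = decode(r, 0.25)
```

Because the aggregation is a mean, the prediction is invariant to the ordering of the context points, which is the exchangeability property NPs are built around.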

The Functional Neural Process

- Computer Science, Mathematics
- NeurIPS
- 2019

A new family of exchangeable stochastic processes, the Functional Neural Processes (FNPs), are presented and it is demonstrated that they are scalable to large datasets through mini-batch optimization and described how they can make predictions for new points via their posterior predictive distribution.

Auto-Encoding Variational Bayes

- Mathematics, Computer Science
- ICLR
- 2014

This work introduces a stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case.
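
The core device in that algorithm is the reparameterization trick: the latent sample is written as a deterministic, differentiable function of the variational parameters plus independent noise, so gradients can flow through the sampling step. A minimal sketch (variable names are illustrative):

```python
import math
import random

random.seed(0)

def sample_latent(mu, log_var):
    # reparameterization: z = mu + sigma * eps with eps ~ N(0, 1),
    # so z is differentiable in mu and log_var
    eps = random.gauss(0.0, 1.0)
    return mu + math.exp(0.5 * log_var) * eps

# draw many samples from q(z) = N(1.0, 0.25) via the reparameterization
zs = [sample_latent(1.0, math.log(0.25)) for _ in range(10000)]
mean = sum(zs) / len(zs)
var = sum((z - mean) ** 2 for z in zs) / len(zs)
```

The sample mean and variance recover the parameters of q(z), confirming the transformed noise has the intended distribution.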

Scalable Bayesian Optimization Using Deep Neural Networks

- Computer Science, Mathematics
- ICML
- 2015

This work shows that adaptive basis function regression, with a neural network as the parametric form, performs competitively with state-of-the-art GP-based approaches but scales linearly with the number of data points rather than cubically, which allows a previously intractable degree of parallelism.
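
The linear scaling comes from doing Bayesian linear regression on fixed network features: the sufficient statistics are accumulated in a single O(n) pass over the data, after which only a small feature-dimensional system is solved. A toy sketch with a hypothetical two-dimensional feature map standing in for a trained network's last layer:

```python
import math

def features(x):
    # hypothetical fixed "last layer" features; in the paper these
    # would come from a trained neural network
    return [1.0, math.tanh(x)]

def fit_bayes_linear(xs, ys, alpha=1e-2, noise=0.1):
    # accumulate A = Phi^T Phi / noise^2 + alpha * I and
    # b = Phi^T y / noise^2 in one O(n) pass over the data
    A = [[alpha, 0.0], [0.0, alpha]]
    b = [0.0, 0.0]
    for x, y in zip(xs, ys):
        phi = features(x)
        for i in range(2):
            b[i] += phi[i] * y / noise**2
            for j in range(2):
                A[i][j] += phi[i] * phi[j] / noise**2
    # solve the 2x2 system A w = b for the posterior mean weights
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(A[1][1] * b[0] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [3.0 * math.tanh(x) + 1.0 for x in xs]  # noiseless toy targets
w = fit_bayes_linear(xs, ys)
```

On these toy targets the posterior mean weights recover the generating coefficients (about 1.0 and 3.0), and the per-point cost never depends on the dataset size.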

Deep learning to represent subgrid processes in climate models

- Physics, Computer Science
- Proceedings of the National Academy of Sciences
- 2018

A deep neural network is trained to represent all atmospheric subgrid processes in a climate model by learning from a multiscale model in which convection is treated explicitly, and the results show the feasibility of using deep learning for climate model parameterization.

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

- Mathematics, Computer Science
- ICML
- 2016

A new theoretical framework casts dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, mitigating the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
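
In practice, this amounts to keeping dropout active at test time and averaging several stochastic forward passes; the spread of the predictions serves as the uncertainty estimate. A toy sketch with a hypothetical fixed-weight network:

```python
import random
import statistics

random.seed(0)

def net(x, drop_mask):
    # toy one-hidden-layer ReLU net with fixed (untrained) weights;
    # drop_mask zeroes out individual hidden units
    weights = [0.5, -1.0, 2.0]
    hidden = [max(0.0, w * x) * m for w, m in zip(weights, drop_mask)]
    return sum(hidden)

def mc_dropout_predict(x, p=0.5, n_samples=200):
    # keep dropout ON at test time and average stochastic forward passes
    preds = []
    for _ in range(n_samples):
        # inverted dropout: surviving units are rescaled by 1 / (1 - p)
        mask = [0.0 if random.random() < p else 1.0 / (1.0 - p)
                for _ in range(3)]
        preds.append(net(x, mask))
    return statistics.mean(preds), statistics.stdev(preds)

mu, sigma = mc_dropout_predict(1.5)
```

The sample mean approximates the deterministic prediction while the sample standard deviation is nonzero, reflecting the model uncertainty induced by the random masks.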

Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles

- Mathematics, Computer Science
- NIPS
- 2017

This work proposes an alternative to Bayesian NNs that is simple to implement, readily parallelizable, requires very little hyperparameter tuning, and yields high quality predictive uncertainty estimates.
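
The recipe combines the members' predictions into a single Gaussian: the predictive mean is the average of the member means, and the predictive variance adds the average member variance to the disagreement between members. A sketch with hypothetical member outputs:

```python
import statistics

# hypothetical outputs of M = 5 independently trained networks at one
# input; each network predicts both a mean and a variance
member_means = [2.1, 1.9, 2.3, 2.0, 2.2]
member_vars = [0.10, 0.12, 0.08, 0.11, 0.09]

# ensemble predictive mean: average of the member means
mu = statistics.mean(member_means)

# ensemble predictive variance: average member variance (data noise)
# plus the spread of the member means (model disagreement)
var = statistics.mean(member_vars) + statistics.mean(
    [(m - mu) ** 2 for m in member_means]
)
```

Each member trains independently, so the whole procedure parallelizes trivially, which is the scalability argument the paper makes.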

Inferring high-resolution human mixing patterns for disease modeling

- Medicine, Biology
- Nature Communications
- 2021

A data-driven approach generates effective population-level contact matrices from highly detailed macro (census) and micro (survey) data on key socio-demographic features, in order to model the spread of airborne infectious diseases.

A Tutorial on Bayesian Optimization

- Mathematics, Computer Science
- ArXiv
- 2018

This tutorial describes how Bayesian optimization works, including Gaussian process regression and three common acquisition functions (expected improvement, entropy search, and knowledge gradient), and provides a generalization of expected improvement to noisy evaluations beyond the noise-free setting where it is more commonly applied.
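
Of the three acquisition functions, expected improvement has a simple closed form under a Gaussian posterior. A sketch for the noise-free maximization case, where the inputs are hypothetical posterior values at one candidate point:

```python
import math

def normal_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, f_best):
    # EI for maximization: E[max(f(x) - f_best, 0)] with f(x) ~ N(mu, sigma^2)
    if sigma == 0.0:
        return max(mu - f_best, 0.0)
    z = (mu - f_best) / sigma
    return (mu - f_best) * normal_cdf(z) + sigma * normal_pdf(z)

# hypothetical GP posterior at a candidate point, and the best value so far
ei = expected_improvement(mu=1.2, sigma=0.5, f_best=1.0)
```

EI is positive whenever the posterior has any variance, so it naturally trades off exploiting high-mean points against exploring uncertain ones.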