Estimating Stochastic Poisson Intensities Using Deep Latent Models

@inproceedings{Wang2020EstimatingSP,
  title={Estimating Stochastic Poisson Intensities Using Deep Latent Models},
  author={Ruixin Wang and Prateek Jaiswal and Harsha Honnappa},
  booktitle={2020 Winter Simulation Conference (WSC)},
  year={2020},
  pages={596--607}
}
We present a new method for estimating the stochastic intensity of a doubly stochastic Poisson process. Statistical and theoretical analyses of traffic traces show that these processes are appropriate models of high intensity traffic arriving at an array of service systems. The statistical estimation of the underlying latent stochastic intensity process driving the traffic model involves a rather complicated nonlinear filtering problem. We develop a novel simulation method, using deep neural… 
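The doubly stochastic (Cox) structure described in the abstract can be illustrated with a short, self-contained simulation sketch. The sigmoid-of-Brownian-motion intensity below is a hypothetical choice for illustration only, not the latent model from the paper; events are drawn by Lewis–Shedler thinning against the upper bound `lam_max`.

```python
import numpy as np

def sample_cox_process(T=10.0, lam_max=5.0, seed=0):
    """Simulate a doubly stochastic (Cox) Poisson process on [0, T].

    Hypothetical latent intensity: lambda(t) = lam_max * sigmoid(X_t),
    where X_t is an Euler-discretized Brownian path. Events are drawn
    by Lewis-Shedler thinning against the bound lam_max.
    """
    rng = np.random.default_rng(seed)
    # Latent path on a fine grid (Euler discretization of Brownian motion).
    dt = 0.01
    grid = np.arange(0.0, T, dt)
    X = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=grid.size))
    lam = lam_max / (1.0 + np.exp(-X))  # intensity path, values in (0, lam_max)

    # Thinning: propose homogeneous events at rate lam_max,
    # accept each with probability lam(t) / lam_max.
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t >= T:
            break
        lam_t = lam[min(int(t / dt), grid.size - 1)]
        if rng.uniform() < lam_t / lam_max:
            events.append(t)
    return np.array(events), grid, lam
```

Conditioning on the latent path `X`, the events form an inhomogeneous Poisson process; the filtering problem the paper addresses is recovering the path from the events alone.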


Calibrating Infinite Server Queueing Models Driven by Cox Processes

This paper studies the problem of calibrating a Cox/G/∞ infinite-server queue to a dataset consisting of the number in the system and the age of the jobs currently in service, sampled at discrete

Doubly Stochastic Generative Arrivals Modeling

We propose a new framework named DS-WGAN that integrates the doubly stochastic (DS) structure and the Wasserstein generative adversarial networks (WGAN) to model, estimate, and simulate a wide class

Calibrating Infinite Server Queueing Models Driven By Cox Processes

An approximate inference procedure is derived that maximizes a lower bound to the finite-dimensional distributions (FDDs) using stochastic gradient descent and is tight when the calibrated parameters coincide with those of the ‘true’ model.

Variational Inference for Diffusion Modulated Cox Processes

Computer Science, 2020
This paper parametrizes the class of approximate smoothing posteriors using a neural network, derives a lower bound on the evidence of the observed point-process sample path, and optimizes the lower bound using stochastic gradient descent (SGD).

References

Showing 1–10 of 31 references

Neural Stochastic Differential Equations: Deep Latent Gaussian Models in the Diffusion Limit

This work develops a variational inference framework for deep latent Gaussian models via stochastic automatic differentiation in Wiener space, where the variational approximations to the posterior are obtained by Girsanov (mean-shift) transformation of the standard Wiener process and the computation of gradients is based on the theory of Stochastic flows.

Theoretical guarantees for sampling and inference in generative models with latent diffusions

It is shown that one can efficiently sample from a wide class of terminal target distributions by choosing the drift of the latent diffusion from the class of multilayer feedforward neural nets, with the accuracy of sampling measured by the Kullback-Leibler divergence to the target distribution.

The modeling of randomly modulated jump processes

It turns out that this martingale model is very similar to the well-known signal-in-additive-white-Gaussian-noise model, so that it can be used conveniently in solving problems of detection of signals in jump processes and estimation of signals from jump processes.

Fitting Continuous Piecewise Linear Poisson Intensities via Maximum Likelihood and Least Squares

We investigate maximum likelihood (ML) and ordinary least squares (OLS) methods to fit a continuous piecewise linear (PL) intensity function for non-homogeneous Poisson processes. The estimation

Scaling and modeling of call center arrivals

Overdispersion is studied in the context of “heavy traffic,” and a scaling parameter is identified as the critical factor characterizing the stochastic variability of the arrivals relative to their averages.

Scalable Gradients for Stochastic Differential Equations

The adjoint sensitivity method scalably computes gradients of solutions to ordinary differential equations. We generalize this method to stochastic differential equations, allowing time-efficient and

Modeling Daily Arrivals to a Telephone Call Center

Stochastic models of time-dependent arrivals are developed, with focus on the application to call centers, including the essential features of the arrival process, the goodness of fit of the estimated models, and the sensitivity of various simulated performance measures of the call center to the choice of arrival process model.

Variational Inference: A Review for Statisticians

Variational inference (VI), a method from machine learning that approximates probability densities through optimization, is reviewed and a variant that uses stochastic optimization to scale up to massive data is derived.
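The stochastic-optimization variant of VI that this review covers can be shown on a conjugate toy problem where the exact posterior is known. The model below is an illustrative assumption, not an example from the review: a Gaussian prior and one Gaussian observation, so the exact posterior is N(y/2, 1/2), and a Gaussian approximation is fit by stochastic gradient ascent on a one-sample reparameterized ELBO estimate.

```python
import numpy as np

def fit_gaussian_vi(y=2.0, steps=3000, lr=0.01, seed=0):
    """Stochastic variational inference for a conjugate toy model.

    Model: theta ~ N(0, 1), y | theta ~ N(theta, 1); the exact
    posterior is N(y/2, 1/2). We fit q(theta) = N(mu, exp(2*log_s))
    by SGD on a single-sample reparameterized ELBO gradient.
    """
    rng = np.random.default_rng(seed)
    mu, log_s = 0.0, 0.0
    for _ in range(steps):
        eps = rng.normal()
        s = np.exp(log_s)
        theta = mu + s * eps  # reparameterization trick
        # d/dtheta [log p(y|theta) + log p(theta)] = (y - theta) - theta
        dlogp = (y - theta) - theta
        # Pathwise gradients; the +1 is d/dlog_s of q's entropy term.
        g_mu = dlogp
        g_log_s = dlogp * s * eps + 1.0
        mu += lr * g_mu
        log_s += lr * g_log_s
    return mu, np.exp(2.0 * log_s)  # approximate posterior mean, variance
```

With `y = 2.0` the iterates settle near the exact posterior mean 1.0 and variance 0.5, up to SGD noise.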

Managing uncertainty in call centres using Poisson mixtures

We model a call centre as a queueing model with Poisson arrivals having an unknown varying arrival rate. We show how to compute prediction intervals for the arrival rate, and use the Erlang formula
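The snippet above is cut off before naming which Erlang formula is used; as a hedged illustration, the two standard members of the family are easy to compute, using the classical stable recursion for Erlang-B and the usual conversion to Erlang-C (the delay probability typically used for call-centre staffing).

```python
def erlang_b(c: int, a: float) -> float:
    """Erlang-B blocking probability for c servers and offered load a (Erlangs).

    Standard numerically stable recursion:
    B(0) = 1, B(n) = a*B(n-1) / (n + a*B(n-1)).
    """
    b = 1.0
    for n in range(1, c + 1):
        b = a * b / (n + a * b)
    return b

def erlang_c(c: int, a: float) -> float:
    """Erlang-C probability that an arriving call must wait (requires a < c)."""
    b = erlang_b(c, a)
    return c * b / (c - a * (1.0 - b))
```

For example, with c = 2 servers and offered load a = 1 Erlang, the blocking probability is 0.2 and the delay probability is 1/3.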

Filtering, Stability, and Robustness

The theory of nonlinear filtering concerns the optimal estimation of a Markov signal in noisy observations. Such estimates necessarily depend on the model that is chosen for the signal and