Going off grid: computationally efficient inference for log-Gaussian Cox processes

@article{Simpson2011GoingOG,
  title={Going off grid: computationally efficient inference for log-Gaussian Cox processes},
  author={Daniel P. Simpson and Janine B. Illian and Finn Lindgren and Sigrunn H. S{\o}rbye and H{\aa}vard Rue},
  journal={Biometrika},
  year={2016},
  volume={103},
  pages={49--70}
}
This paper introduces a new method for performing computational inference on log-Gaussian Cox processes. The likelihood is approximated directly by making use of a continuously specified Gaussian random field. We show that for sufficiently smooth Gaussian random field prior distributions, the approximation can converge with arbitrarily high order, whereas an approximation based on a counting process on a partition of the domain achieves only first-order convergence. The results improve upon the… 
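
To make concrete what "approximating the likelihood directly" means relative to a gridded counting-process approximation, the following is a minimal sketch of the quantities involved; the notation (the constant C, basis functions ψ_k, quadrature nodes s̃_j and weights α̃_j) is chosen here for illustration rather than taken verbatim from the paper.

% LGCP log-likelihood for observed points Y = {s_1, ..., s_N} in a window Omega,
% with latent field eta and intensity lambda(s) = exp{eta(s)}.
\begin{align*}
  \log \pi(Y \mid \eta)
    &= C - \int_{\Omega} \exp\{\eta(s)\}\,\mathrm{d}s + \sum_{i=1}^{N} \eta(s_i) \\
  % Gridded (counting-process) approximation: conditionally Poisson cell counts
  y_j \mid \eta
    &\sim \operatorname{Poisson}\!\bigl(|C_j|\,\exp\{\eta(\tilde{s}_j)\}\bigr),
    \qquad \Omega = \textstyle\bigcup_j C_j \\
  % Direct approximation: continuously specified field plus a quadrature rule
  \eta(s) &\approx \sum_{k=1}^{n} \psi_k(s)\, x_k,
    \qquad
    \int_{\Omega} \exp\{\eta(s)\}\,\mathrm{d}s
      \approx \sum_{j=1}^{p} \tilde{\alpha}_j \exp\{\eta(\tilde{s}_j)\}
\end{align*}

The convergence orders quoted in the abstract describe how these two approximations behave as the partition or quadrature rule is refined.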

Inference for log Gaussian Cox processes using an approximate marginal posterior

TLDR
This paper estimates an approximate marginal posterior for parameters of log Gaussian Cox processes and proposes comprehensive model inference strategy, based on a pseudo-marginal Markov chain Monte Carlo approach.

Scalable inference for space‐time Gaussian Cox processes

TLDR
This article addresses computational bottlenecks by combining two recent developments: (i) a data augmentation strategy proposed for space-time Gaussian Cox processes that is based on exact Bayesian inference and does not require fine-grid approximations of infinite-dimensional integrals, and (ii) a recently developed family of sparsity-inducing Gaussian processes, called nearest-neighbor Gaussian processes, to avoid expensive matrix computations.

INLA or MCMC? A tutorial and comparative evaluation for spatial prediction in log-Gaussian Cox processes

TLDR
The results question the notion that INLA is both significantly faster and more robust than MCMC in this setting; 100,000 iterations of the MALA algorithm, running in 20 min on a desktop PC, delivered greater predictive accuracy than the default INLA strategy and comparable performance to the full Laplace approximation, which ran in 39 min.

Exact Bayesian inference in spatiotemporal Cox processes driven by multivariate Gaussian processes

TLDR
A novel methodology is developed to perform Bayesian inference for spatiotemporal Cox processes in which the intensity function depends on a multivariate Gaussian process, sampling from the joint posterior distribution of the parameters and latent variables of the model.

Sparse Approximate Inference for Spatio-Temporal Point Process Models

TLDR
The proposed algorithms provide a flexible and faster alternative both to nonlinear filtering-smoothing type algorithms and to approaches that implement the Laplace method or expectation propagation on (block) sparse latent Gaussian models.

Exact Bayesian inference for level-set Cox processes with piecewise constant intensity function

TLDR
A class of multidimensional Cox processes in which the intensity function is piecewise constant is proposed, and a methodology is developed to perform Bayesian inference without resorting to discretisation-based approximations.

Investigating mesh‐based approximation methods for the normalization constant in the log Gaussian Cox process likelihood

TLDR
This work carefully describes several alternative variants of the mesh-based approximate integration method and derives an analytical formula for the integral in question, which is exact under the triangular mesh assumption used by SPDE-INLA.
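
As an illustration of why exactness under a triangular mesh is plausible, the following gives the standard closed form for integrating the exponential of a field that is linear on a single mesh triangle T with distinct vertex values η_1, η_2, η_3. It is stated here as a generic divided-difference identity for illustration, not as the specific formula derived in the cited work.

% Exact integral of exp of a piecewise-linear function over one triangle T,
% written via the second divided difference of exp at the vertex values.
\begin{equation*}
  \int_{T} \exp\{\eta(s)\}\,\mathrm{d}s
  = 2\,|T| \left(
      \frac{e^{\eta_1}}{(\eta_1-\eta_2)(\eta_1-\eta_3)}
    + \frac{e^{\eta_2}}{(\eta_2-\eta_1)(\eta_2-\eta_3)}
    + \frac{e^{\eta_3}}{(\eta_3-\eta_1)(\eta_3-\eta_2)}
    \right)
\end{equation*}

Summing such terms over all mesh triangles yields the normalization constant without further quadrature error for a piecewise-linear latent field.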

Fast, Scalable Approximations to Posterior Distributions in Extended Latent Gaussian Models

TLDR
A fast, scalable approximate Bayesian inference methodology is developed for a novel class of additive models, called Extended Latent Gaussian Models, which allow for a wide range of response distributions and flexible relationships between the additive predictor and the mean response.

Level set Cox processes

Spatial Models with the Integrated Nested Laplace Approximation within Markov Chain Monte Carlo

TLDR
This paper describes how to use INLA within the Metropolis-Hastings algorithm to fit spatial models and estimate the joint posterior distribution of a reduced number of parameters.
...

References

SHOWING 1-10 OF 63 REFERENCES

An explicit link between Gaussian fields and Gaussian Markov random fields: the stochastic partial differential equation approach

TLDR
It is shown that, using an approximate stochastic weak solution to (linear) stochastic partial differential equations, some Gaussian fields in the Matérn class can provide an explicit link, for any triangulation of ℝ^d, between GFs and GMRFs, formulated as a basis function representation.
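
A compact restatement of the construction summarized above, with illustrative notation: the Matérn-like Gaussian field is characterized as the solution of an SPDE and then expanded in a finite basis whose weights form a GMRF with sparse precision matrix.

% SPDE characterization and finite-dimensional basis representation
% (kappa: spatial scale, alpha: smoothness, W: Gaussian white noise, Q: sparse precision).
\begin{align*}
  (\kappa^2 - \Delta)^{\alpha/2}\, x(s) &= \mathcal{W}(s), \qquad s \in \mathbb{R}^d,
    \qquad \nu = \alpha - d/2, \\
  x(s) &\approx \sum_{k=1}^{n} \psi_k(s)\, w_k,
    \qquad \mathbf{w} \sim \mathcal{N}\!\bigl(\mathbf{0},\, \mathbf{Q}^{-1}\bigr),
    \quad \mathbf{Q} \text{ sparse}.
\end{align*}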

Practical Maximum Pseudolikelihood for Spatial Point Patterns

This paper describes a technique for computing approximate maximum pseudolikelihood estimates of the parameters of a spatial point process. The method is an extension of Berman & Turner's (1992)…

Exploring a New Class of Non-stationary Spatial Gaussian Random Fields with Varying Local Anisotropy

TLDR
The results show that the use of an SPDE with non-constant coefficients is a promising way of creating non-stationary spatial GMRFs that allow for physical interpretability of the parameters, although there are several remaining challenges that would need to be solved before these models can be put to general practical use.

Convergence of posteriors for discretized log Gaussian Cox processes

Approximate Bayesian Inference for Latent Gaussian Models

TLDR
An approximation tool for latent GMRF models is introduced, and the approximation to the posterior of the hyperparameters θ is shown to give extremely accurate results in a fraction of the computing time used by MCMC algorithms.

Riemann manifold Langevin and Hamiltonian Monte Carlo methods

TLDR
The methodology proposed automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density, and substantial improvements in the time‐normalized effective sample size are reported when compared with alternative sampling approaches.

In order to make spatial statistics computationally feasible, we need to forget about the covariance function

TLDR
This paper compares two approximations to GRFs with Matérn covariance functions: the kernel convolution approximation and the Gaussian Markov random field representation of an associated stochastic partial differential equation.

Gaussian predictive process models for large spatial data sets

TLDR
This work proposes a computational template flexible enough to accommodate non-stationary, non-Gaussian, possibly multivariate, and possibly spatiotemporal processes in the context of large data sets.

Stationary Process Approximation for the Analysis of Large Spatial Datasets

TLDR
This work proposes a finite sum process approximation model which is conceptually simple and routine to implement and gives real data examples to illustrate the method.

Fast approximate inference with INLA: the past, the present and the future

TLDR
This talk outlines the integrated nested Laplace approximation (INLA) method and its related R package, focusing on the use of INLA for survival and point process models and demonstrating some of the new features.
...