• Corpus ID: 4626119

# Large-Scale Cox Process Inference using Variational Fourier Features

```bibtex
@article{John2018LargeScaleCP,
  title={Large-Scale Cox Process Inference using Variational Fourier Features},
  author={S. T. John and James Hensman},
  journal={ArXiv},
  year={2018},
  volume={abs/1804.01016}
}
```
• Published 3 April 2018
• Computer Science, Mathematics
• ArXiv
Gaussian process modulated Poisson processes provide a flexible framework for modelling spatiotemporal point patterns. So far this has been restricted to one dimension, to binning onto a pre-determined grid, or to small data sets of up to a few thousand data points. Here we introduce Cox process inference based on Fourier features. This sparse representation induces global rather than local constraints on the function space and is computationally efficient. This allows us to formulate a grid-free…
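As a toy illustration of the idea, a latent function can be represented in a truncated Fourier basis and pushed through a positive link to give a grid-free Poisson intensity. The sketch below uses random basis weights as a stand-in for a variational posterior and an exponential link; these are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10.0                                # observation window [0, T]
M = 8                                   # number of Fourier frequencies
freqs = 2 * np.pi * np.arange(1, M + 1) / T

def fourier_features(t):
    """Feature map phi(t): a constant plus cosines and sines up to frequency M."""
    t = np.atleast_1d(t)
    return np.column_stack([np.ones_like(t),
                            np.cos(np.outer(t, freqs)),
                            np.sin(np.outer(t, freqs))])

# Random weights stand in for a posterior sample of the latent function f.
w = rng.normal(scale=0.3, size=2 * M + 1)

def intensity(t):
    # Exponential link keeps the intensity positive everywhere.
    return np.exp(fourier_features(t) @ w)

# The Poisson-process log likelihood is sum_n log lambda(t_n) minus the
# integral of lambda over the window; with a parametric basis the integral
# can be estimated on a fine quadrature grid instead of binning the events.
events = np.array([1.2, 3.4, 3.5, 7.9])
grid = np.linspace(0.0, T, 1001)
integral = np.sum(intensity(grid)) * (grid[1] - grid[0])
log_lik = np.sum(np.log(intensity(events))) - integral
```

The intensity is evaluated only at the observed events and quadrature nodes, so no pre-determined binning of the domain is needed.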
Fast Bayesian Inference for Gaussian Cox Processes via Path Integral Formulation
Gaussian Cox processes are widely used point process models that use a Gaussian process to describe the Bayesian a priori uncertainty present in latent intensity functions. In this paper, we propose…
Structured Variational Inference in Continuous Cox Process Models
• Computer Science, Mathematics
NeurIPS
• 2019
We propose a scalable framework for inference in an inhomogeneous Poisson process modeled by a continuous sigmoidal Cox process that assumes the corresponding intensity function is given by a…
Fast and scalable non-parametric Bayesian inference for Poisson point processes
• Mathematics
• 2018
We study the problem of non-parametric Bayesian estimation of the intensity function of a Poisson point process. The observations are assumed to be $n$ independent realisations of a Poisson point…
Efficient Inference in Multi-task Cox Process Models
• Computer Science, Mathematics
AISTATS
• 2019
This work generalizes the log Gaussian Cox process framework to model multiple correlated point data jointly and develops an efficient variational inference algorithm that is orders of magnitude faster than competing deterministic and stochastic approximations of multivariate LGCPs, coregionalization models, and multi-task permanental processes.
Generic Inference in Latent Gaussian Process Models
• Computer Science, Mathematics
J. Mach. Learn. Res.
• 2019
An automated variational method for inference in models with Gaussian process (GP) priors and general likelihoods is presented; it is scalable to large datasets by using an augmented prior via the inducing-variable approach underpinning most sparse GP approximations, along with parallel computation and stochastic optimization.
Posterior Contraction Rates for Gaussian Cox Processes with Non-identically Distributed Data
• Mathematics
• 2019
This paper considers the posterior contraction of non-parametric Bayesian inference on non-homogeneous Poisson processes. We consider the quality of inference on a rate function $\lambda$, given…
Finite-dimensional Gaussian approximation with linear inequality constraints
• Mathematics, Computer Science
SIAM/ASA J. Uncertain. Quantification
• 2018
The finite-dimensional Gaussian approach from Maatouk and Bay (2017), which can satisfy inequality conditions everywhere, is considered; its full framework, together with a Hamiltonian Monte Carlo-based sampler, provides efficient results on both data fitting and uncertainty quantification.
Gaussian Process Modulated Cox Processes under Linear Inequality Constraints
• Mathematics, Computer Science
AISTATS
• 2019
A novel finite approximation of GP-modulated Cox processes is introduced, in which positivity conditions can be imposed directly on the GP with no restrictions on the covariance function.
Know Your Boundaries: Constraining Gaussian Processes by Variational Harmonic Features
• Computer Science, Mathematics
AISTATS
• 2019
A Fourier-like generalised harmonic feature representation of the GP prior in the domain of interest is derived, which both constrains the GP and attains a low-rank representation that is used for speeding up inference.
Sensing Cox Processes via Posterior Sampling and Positive Bases
• Computer Science, Mathematics
ArXiv
• 2021
This work models the intensity function as a sample from a truncated Gaussian process, represented in a specially constructed positive basis, and shows how a minimal-description positive basis can be adapted to the covariance kernel and non-stationarity, making connections to common positive bases from prior work.

## References

SHOWING 1-10 OF 20 REFERENCES
Tractable nonparametric Bayesian inference in Poisson processes with Gaussian process intensities
• Mathematics, Computer Science
ICML '09
• 2009
This paper presents the first approach to Gaussian Cox processes in which it is possible to perform inference without introducing approximations or finite-dimensional proxy distributions, and uses a generative model for Poisson data to enable tractable inference via Markov chain Monte Carlo.
Variational Fourier Features for Gaussian Processes
• Computer Science, Mathematics
J. Mach. Learn. Res.
• 2017
This work hinges on a key result that there exist spectral features related to a finite domain of the Gaussian process which exhibit almost-independent covariances; it derives these expressions for Matérn kernels in one dimension and generalises to more dimensions using kernels with specific structures.
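For context on the Matérn family mentioned here: the Matérn-1/2 (exponential) kernel $k(r) = \sigma^2 e^{-|r|/\ell}$ has the closed-form spectral density $S(\omega) = 2\sigma^2\ell / (1 + \ell^2\omega^2)$, and the Wiener–Khinchin relation recovers the kernel from it. A quick numerical check of that relation (a stand-alone sketch, not the variational feature construction itself):

```python
import numpy as np

sigma2, ell = 1.5, 0.7                 # kernel variance and lengthscale

def k(r):
    # Matern-1/2 (exponential) covariance function.
    return sigma2 * np.exp(-np.abs(r) / ell)

def S(w):
    # Its spectral density (Fourier transform of the covariance).
    return 2.0 * sigma2 * ell / (1.0 + (ell * w) ** 2)

# Inverse Fourier transform of S should recover the kernel:
# k(r) = (1 / 2pi) * integral of S(w) cos(w r) dw over the real line,
# approximated here on a wide truncated grid.
w_grid = np.linspace(-2000.0, 2000.0, 400_001)
dw = w_grid[1] - w_grid[0]
r = 0.3
k_recovered = np.sum(S(w_grid) * np.cos(w_grid * r)) * dw / (2 * np.pi)
```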
MCMC for Variationally Sparse Gaussian Processes
• Computer Science, Mathematics
NIPS
• 2015
A hybrid Monte Carlo sampling scheme that allows for a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs.
Variational Inference for Gaussian Process Modulated Poisson Processes
• Computer Science, Mathematics
ICML
• 2015
This work presents the first fully variational Bayesian inference scheme for continuous Gaussian-process-modulated Poisson processes, which requires no discretisation of the domain; scales linearly in the number of observed events; and is many orders of magnitude faster than previous sampling based approaches.
Bayesian Inference and Data Augmentation Schemes for Spatial, Spatiotemporal and Multivariate Log-Gaussian Cox Processes in R
• Computer Science
• 2015
A suite of R functions provides an extensible framework for inferring covariate effects as well as the parameters of the latent field in log-Gaussian Cox processes; methods are also presented for Bayesian inference in two further classes of model based on the log-Gaussian Cox process.
Sparse Log Gaussian Processes via MCMC for Spatial Epidemiology
• Mathematics, Computer Science
Gaussian Processes in Practice
• 2007
The posterior inference is conducted using Markov chain Monte Carlo simulations and the sampling of the latent values is sped up by a transformation taking into account their posterior covariance.
Poisson intensity estimation with reproducing kernels
• Mathematics, Computer Science
AISTATS
• 2017
A new, computationally tractable reproducing kernel Hilbert space (RKHS) formulation for the inhomogeneous Poisson process is presented, and it is proven that the representer theorem does hold in an appropriately transformed RKHS, guaranteeing that the optimization of the penalized likelihood can be cast as a tractable finite-dimensional problem.
Scalable Gaussian process inference using variational methods
Various theoretical issues arising from the application of variational inference to the infinite-dimensional Gaussian process setting are settled decisively, and a new argument is given for existing approaches to variational regression that settles debate about their applicability.
Fast Bayesian Intensity Estimation for the Permanental Process
• Mathematics, Computer Science
ICML
• 2017
This paper presents a fast Bayesian inference scheme for the permanental process, a Cox process under which the square root of the intensity is a Gaussian process, and exploits connections with reproducing kernel Hilbert spaces to derive efficient approximate Bayes inference algorithms.
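To unpack the squared link used by the permanental process: if the square root of the intensity is linear in some features, $f(t) = w^\top\phi(t)$, then $\lambda(t) = f(t)^2$ is nonnegative by construction, and its integral reduces to the quadratic form $w^\top\Psi w$ with $\Psi = \int \phi(t)\phi(t)^\top\,dt$ — this closed-form integral term is what makes such schemes fast. A minimal sketch with a hypothetical monomial basis (not the paper's kernel-derived features):

```python
import numpy as np

def phi(t):
    # Illustrative basis {1, t, t^2}; any linear basis works the same way.
    return np.stack([np.ones_like(t), t, t ** 2])

# Psi = integral over [0, 1] of phi(t) phi(t)^T dt, estimated numerically.
grid = np.linspace(0.0, 1.0, 2001)
dt = grid[1] - grid[0]
features = phi(grid)                      # shape (3, len(grid))
Psi = (features * dt) @ features.T

w = np.array([0.5, -1.0, 2.0])
integral_quadform = w @ Psi @ w           # closed-form quadratic-form route

f = w @ phi(grid)                         # f(t) on the grid
integral_direct = np.sum(f ** 2) * dt     # brute-force check of the same integral
```

Both routes agree, but the quadratic form needs $\Psi$ only once, independently of where the events fall.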
Bayesian Gaussian Process Latent Variable Model
• Computer Science, Mathematics
AISTATS
• 2010
A variational inference framework is presented for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction; the maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space.