Corpus ID: 88518414

Gaussian Probabilities and Expectation Propagation

@article{CunninghamHennigLacosteJulien,
  title={Gaussian Probabilities and Expectation Propagation},
  author={John P. Cunningham and Philipp Hennig and Simon Lacoste-Julien},
  journal={arXiv: Machine Learning}
}
While Gaussian probability densities are omnipresent in applied mathematics, Gaussian cumulative probabilities are hard to calculate in any but the univariate case. We study the utility of Expectation Propagation (EP) as an approximate integration method for this problem. For rectangular integration regions, the approximation is highly accurate. We also extend the derivations to the more general case of polyhedral integration regions. However, we find that in this polyhedral case, EP's answer… 
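As a minimal illustration of the quantity the paper approximates, the sketch below (assuming NumPy; the mean, covariance, and rectangle are arbitrary illustrative choices) estimates a rectangular Gaussian probability by plain Monte Carlo, the kind of sampling baseline that EP aims to replace with a fast deterministic approximation.

```python
import numpy as np

rng = np.random.default_rng(0)

# The hard quantity from the abstract: P(a <= X <= b) for X ~ N(mu, Sigma),
# which has no closed form beyond the univariate case.
mu = np.zeros(3)
Sigma = np.array([[1.0, 0.5, 0.2],
                  [0.5, 1.0, 0.3],
                  [0.2, 0.3, 1.0]])
a = np.array([-1.0, -1.0, -1.0])
b = np.array([ 1.0,  1.0,  1.0])

# Simple Monte Carlo estimate: fraction of samples inside the rectangle.
x = rng.multivariate_normal(mu, Sigma, size=200_000)
inside = np.all((x >= a) & (x <= b), axis=1)
p_mc = inside.mean()
print(f"Monte Carlo estimate: {p_mc:.4f}")
```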

Figures from this paper (not reproduced here)

Citations

Filter-Based Abstractions for Safe Planning of Partially Observable Dynamical Systems
A novel planning scheme that employs Kalman filtering as a state estimator to obtain a finite-state abstraction of the dynamical system, formalized as a Markov decision process (MDP) with probability intervals, which makes the model robust to numerical imprecision in approximating the transition probabilities.
Approximate Bayesian inference in multivariate Gaussian process regression and applications to species distribution models
This thesis develops hierarchical models, where the vector of predictor functions is assumed to follow a multivariate Gaussian process, and develops new parametrisations for the statistical models in order to improve the performance of the computations related to the inferential task.
Advances in Bayesian inference and stable optimization for large-scale machine learning problems
Understanding Expectation Propagation
This report continues work on characterising Expectation Propagation (EP), an approximate Bayesian inference scheme, looking at four toy cases of interest, and finds that EP’s uncertainty estimates do not collapse pathologically as they do for mean field VI.
Expectation Propagation as a Way of Life: A Framework for Bayesian Inference on Partitioned Data
The central idea is to factor the likelihood according to the data partitions, and to iteratively combine each factor with an approximate model of the prior and all other parts of the data thus producing an overall approximation to the global posterior at convergence.
Classified Regression for Bayesian Optimization: Robot Learning with Unknown Penalties
A Bayesian model is proposed that captures exactly what is known about the cost of unstable controllers prior to data collection (namely, only that it should be a somewhat large number) and predicts high cost values in regions where failures are likely to occur.
Efficient CDF Approximations for Normalizing Flows
This work builds upon the diffeomorphic properties of normalizing flows and leverages the divergence theorem to estimate the CDF over a closed region in target space in terms of the flux across its boundary, as induced by the normalizing flow.
Patch-Based Image Restoration using Expectation Propagation
Experiments conducted for denoising, inpainting and deconvolution problems with Gaussian and Poisson noise illustrate the potential benefits of such a flexible approximate Bayesian method for uncertainty quantification in imaging problems, at a reduced computational cost compared to sampling techniques.
Spooky Coordinated Tasking and Estimation on Uninformative Priors
This paper explores methodologies for utilizing negative information in this context, with a special focus on the well-known admissible region and a novel methodology for splitting mixands in an arbitrary measurement space.
Two-step Lookahead Bayesian Optimization with Inequality Constraints
It is argued here that being non-myopic is even more important in constrained problems because fear of violating constraints pushes myopic methods away from sampling the boundary between feasible and infeasible regions, slowing the discovery of optimal solutions with tight constraints.

References

Learning Kernel Classifiers
Computation of Multivariate Normal and t Probabilities
This book describes recently developed methods for accurate and efficient computation of the required probability values for problems with two or more variables.
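SciPy's multivariate normal CDF is based on this family of Genz-style numerical integration methods; a small sketch (the covariance, evaluation point, and use of the bivariate orthant identity are illustrative choices):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Orthant probability of a bivariate normal with correlation 0.5,
# computed by SciPy's quadrature-based multivariate CDF.
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
p = multivariate_normal(mean=np.zeros(2), cov=cov).cdf(np.zeros(2))

# Known closed form for this special case: 1/4 + arcsin(rho) / (2*pi).
exact = 0.25 + np.arcsin(0.5) / (2 * np.pi)
print(f"numeric {p:.5f} vs exact {exact:.5f}")
```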
Power EP
This note describes power EP, an extension of Expectation Propagation that makes the computations more tractable and allows tackling problems which are intractable under regular EP.
An Analysis of the Health and Retirement Status of the Elderly
In this paper we specify and estimate a structural limited dependent variable model with which we study both the health and retirement status of the elderly. Standard linear estimators, which assume…
Likelihood inference in a correlated probit regression model
Correlated binary observations arise in a variety of applications. For example, in animal studies the term 'litter effect' is used to describe the greater alikeness of responses within a litter as…
Computing Multivariate Normal Probabilities: A New Look
This article describes and compares several numerical methods for finding multivariate probabilities over a rectangle. A large computational study shows how the computation times depend on the…
Approximations to Multivariate Normal Rectangle Probabilities Based on Conditional Expectations
Two new approximations for multivariate normal probabilities for rectangular regions, based on conditional expectations and regression with binary variables, are proposed. One is a…
Randomization of Number Theoretic Methods for Multiple Integration
A procedure is discussed for randomization of the number theoretic methods of the Korobov type producing stochastic families of multi-dimensional integration rules. These randomized rules have the…
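As a sketch of the idea (the generating vector and test integrand are illustrative choices, not the rules from the paper), a randomly shifted rank-1 lattice rule of Korobov type in Python:

```python
import numpy as np

# Randomly shifted rank-1 lattice rule. The generating vector z is an
# illustrative (not optimized) choice and covers up to three dimensions.
def shifted_lattice_estimate(f, dim, n=1009, shifts=8, seed=0):
    rng = np.random.default_rng(seed)
    z = np.array([1, 393, 551])[:dim]
    i = np.arange(n)[:, None]
    base = (i * z % n) / n            # deterministic lattice points in [0,1)^dim
    ests = []
    for _ in range(shifts):
        u = rng.random(dim)           # Cranley-Patterson random shift
        pts = (base + u) % 1.0
        ests.append(f(pts).mean())
    ests = np.array(ests)
    # Averaging over independent shifts gives an unbiased estimate with an error bar.
    return ests.mean(), ests.std(ddof=1) / np.sqrt(shifts)

# Test integrand: f(x) = x1*x2*x3 on [0,1]^3, exact integral 1/8.
est, err = shifted_lattice_estimate(lambda x: x.prod(axis=1), dim=3)
print(f"estimate {est:.5f} +/- {err:.5f}")
```

Randomizing the shift keeps the low-discrepancy structure of the lattice while making the estimator unbiased, so repeated shifts yield a usable error estimate, which is the point of the randomization discussed above.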