# BayesCG As An Uncertainty Aware Version of CG

```bibtex
@inproceedings{Reid2020BayesCGAA,
  title  = {BayesCG As An Uncertainty Aware Version of CG},
  author = {Tim W. Reid and Ilse C. F. Ipsen and Jon Cockayne and Chris. J. Oates},
  year   = {2020}
}
```

The Bayesian Conjugate Gradient method (BayesCG) is a probabilistic generalization of the Conjugate Gradient method (CG) for solving linear systems with real symmetric positive definite coefficient matrices. Our CG-based implementation of BayesCG under a structure-exploiting prior distribution represents an 'uncertainty-aware' version of CG. Its output consists of CG iterates and posterior covariances that can be propagated to subsequent computations. The covariances have low rank and are…
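Since BayesCG's iterates coincide with those of plain CG (the posterior covariances are what it adds on top), a standard CG loop shows the computational core. This is an illustrative sketch in NumPy, not the authors' implementation; the posterior-covariance bookkeeping of BayesCG is omitted.

```python
import numpy as np

def cg(A, b, x0=None, tol=1e-10, maxiter=None):
    """Conjugate Gradient for a symmetric positive definite matrix A.

    BayesCG under the structure-exploiting prior returns these same
    iterates as posterior means (sketch only; covariance updates omitted).
    """
    n = b.size
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs = r @ r
    for _ in range(maxiter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap            # updated residual
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:  # converged
            break
        p = r + (rs_new / rs) * p  # A-conjugate direction update
        rs = rs_new
    return x
```

In exact arithmetic the loop terminates in at most `n` iterations; in BayesCG each iteration would additionally downdate the prior covariance by a rank-one term, which is why the final posterior covariances are low-rank.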

## 4 Citations

### Statistical Properties of the Probabilistic Numeric Linear Solver BayesCG

- Computer Science, arXiv
- 2022

This work analyses the calibration of BayesCG under the Krylov prior, a probabilistic numeric extension of the Conjugate Gradient method for solving systems of linear equations with symmetric positive definite coefficient matrix, and proposes two test statistics that are necessary but not sufficient for calibration: the Z-statistic and the new S-statistic.

### Posterior and Computational Uncertainty in Gaussian Processes

- Computer Science, arXiv
- 2022

A new class of methods is developed that provides consistent estimation of the combined uncertainty arising from both the finite number of data observed and the finite amount of computation expended, and the consequences of ignoring computational uncertainty are demonstrated.

### ProbNum: Probabilistic Numerics in Python

- Computer Science, arXiv
- 2021

Probabilistic numerical methods (PNMs) solve numerical problems via probabilistic inference. They have been developed for linear algebra, optimization, integration and differential equation…

### Probabilistic Iterative Methods for Linear Systems

- Computer Science, J. Mach. Learn. Res.
- 2021

This paper presents a probabilistic perspective on iterative methods for approximating the solution x ∈ R^d of a nonsingular linear system Ax = b, and characterises both the rate of contraction of μ_m to an atomic measure on x and the nature of the uncertainty quantification being provided.

## References

Showing 1–10 of 68 references

### A Bayesian Conjugate Gradient Method (with Discussion)

- Computer Science
- 2019

This paper proposes a novel statistical model for the error in the solution estimate, set in a Bayesian framework; the resulting method is a strict generalisation of the conjugate gradient method, which is recovered as the posterior mean for a particular choice of prior.
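The conditioning step behind this posterior-mean view can be written out explicitly. Under a Gaussian prior $x \sim \mathcal{N}(x_0, \Sigma_0)$ and search directions collected as columns of $S_m$, standard Gaussian conditioning gives the posterior after $m$ iterations (notation here is ours, chosen for illustration, and may differ from the paper's):

$$x_m = x_0 + \Sigma_0 A^{\top} S_m \Lambda_m^{-1} S_m^{\top} (b - A x_0), \qquad \Sigma_m = \Sigma_0 - \Sigma_0 A^{\top} S_m \Lambda_m^{-1} S_m^{\top} A \Sigma_0,$$

with $\Lambda_m = S_m^{\top} A \Sigma_0 A^{\top} S_m$. With the prior covariance $\Sigma_0 = A^{-1}$, the posterior mean reduces to $x_m = x_0 + S_m (S_m^{\top} A S_m)^{-1} S_m^{\top} (b - A x_0)$, which is the CG iterate when $S_m$ spans the Krylov space; this is one way to see CG recovered as a posterior mean.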

### Probabilistic Iterative Methods for Linear Systems

- Computer Science, J. Mach. Learn. Res.
- 2021

This paper presents a probabilistic perspective on iterative methods for approximating the solution x ∈ R^d of a nonsingular linear system Ax = b, and characterises both the rate of contraction of μ_m to an atomic measure on x and the nature of the uncertainty quantification being provided.

### Testing whether a Learning Procedure is Calibrated

- Computer Science
- 2020

This paper studies conditions for a learning procedure to be considered calibrated, in the sense that the true data-generating parameters are plausible as samples from its distributional output.

### Probabilistic Linear Solvers for Machine Learning

- Computer Science, NeurIPS
- 2020

The paper demonstrates how to incorporate prior spectral information in order to calibrate uncertainty, and experimentally showcases the potential of probabilistic linear solvers for machine learning.

### Randomized algorithms for generalized singular value decomposition with application to sensitivity analysis

- Computer Science, Numer. Linear Algebra Appl.
- 2021

New randomized algorithms for computing the GSVD, based on randomized subspace iteration and weighted QR factorization, are proposed, motivated by applications in hyper-differential sensitivity analysis (HDSA).

### Integrals over Gaussians under Linear Domain Constraints

- Computer Science, Mathematics, AISTATS
- 2020

An efficient black-box algorithm is proposed that exploits geometry to estimate integrals over a small, truncated Gaussian volume, and to simulate from it, using the Holmes-Diaconis-Ross (HDR) method combined with an analytic version of elliptical slice sampling (ESS).

### Hyperdifferential Sensitivity Analysis of Uncertain Parameters in PDE-Constrained Optimization

- Computer Science, International Journal for Uncertainty Quantification
- 2020

This article introduces "hyper-differential sensitivity analysis", a goal-oriented analysis which considers the sensitivity of the solution of a PDE-constrained optimization problem to uncertain parameters, and formally defines hyper-differential sensitivity indices.