Corpus ID: 219558976

Bayesian Probabilistic Numerical Integration with Tree-Based Models

@article{Zhu2020BayesianPN,
  title={Bayesian Probabilistic Numerical Integration with Tree-Based Models},
  author={Harrison Zhu and Xing Liu and Ruya Kang and Zhichao Shen and Seth Flaxman and F. Briol},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.05371}
}
Bayesian quadrature (BQ) is a method for solving numerical integration problems in a Bayesian manner, allowing users to quantify their uncertainty about the solution. The standard approach to BQ is based on a Gaussian process (GP) approximation of the integrand. As a result, BQ is inherently limited to cases where the GP approximation can be computed efficiently, which often rules out high-dimensional or non-smooth target functions. This paper proposes to tackle this issue with a…
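The standard GP-based BQ described above can be sketched in a few lines. This is a minimal illustration, not the paper's tree-based method: it assumes an RBF kernel and the uniform measure on [0, 1], for which the kernel mean embedding has a closed form in terms of the error function. The BQ posterior over the integral is Gaussian with mean z^T K^{-1} f and variance z0 - z^T K^{-1} z.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf(x, y, ell):
    # RBF (squared-exponential) kernel matrix k(x_i, y_j)
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * ell ** 2))

def kernel_mean(points, ell):
    # closed form for z_i = \int_0^1 k(x, x_i) dx under the uniform measure
    c = ell * sqrt(pi / 2)
    s = ell * sqrt(2)
    return np.array([c * (erf((1 - xi) / s) - erf(-xi / s)) for xi in points])

def bayesian_quadrature(nodes, fvals, ell=0.3, jitter=1e-10):
    K = rbf(nodes, nodes, ell) + jitter * np.eye(len(nodes))
    z = kernel_mean(nodes, ell)
    w = np.linalg.solve(K, z)   # BQ weights: K^{-1} z
    post_mean = w @ fvals
    # initial variance z0 = \int\int k(x, x') dx dx', estimated here on a fine grid
    grid = np.linspace(0.0, 1.0, 400)
    z0 = kernel_mean(grid, ell).mean()
    post_var = max(z0 - z @ w, 0.0)
    return post_mean, post_var

# toy example: integrate sin(x) over [0, 1]; exact value is 1 - cos(1) ≈ 0.4597
nodes = np.linspace(0.0, 1.0, 15)
mean, var = bayesian_quadrature(nodes, np.sin(nodes))
```

The posterior variance shrinks as nodes are added, which is the uncertainty quantification the abstract refers to; the GP-specific ingredient is the closed-form kernel mean, which is exactly what becomes hard for high-dimensional or non-smooth integrands.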


References

Showing 1–10 of 112 references

Testing multidimensional integration routines

Information Rates of Nonparametric Gaussian Process Methods

TLDR
The results show that for good performance, the regularity of the GP prior should match the regularity of the unknown response function, and this can be expressed in terms of a certain concentration function.

Convergence Guarantees for Gaussian Process Approximations Under Several Observation Models

TLDR
The main novelty in this paper is that the results cover a wide range of observation models including interpolation, approximation with deterministic corruption and regression with Gaussian noise.

BART: Bayesian Additive Regression Trees

We develop a Bayesian "sum-of-trees" model where each tree is constrained by a regularization prior to be a weak learner, and fitting and inference are accomplished via an iterative Bayesian…

Convergence Guarantees for Gaussian Process Means With Misspecified Likelihoods and Smoothness

TLDR
This paper describes how the experimental design and choice of kernel and kernel hyperparameters can be adapted to alleviate model misspecification.

Adaptive multidimensional integration: vegas enhanced

G. Lepage, J. Comput. Phys., 2021

The art of BART: On flexibility of Bayesian forests

TLDR
This work provides a comprehensive study of asymptotic optimality and posterior contraction of Bayesian forests when the regression function has anisotropic smoothness that possibly varies over the function domain.

Adaptive Quadrature Schemes for Bayesian Inference via Active Learning

TLDR
This work considers an interpolative approach for building a surrogate posterior density, combining it with Monte Carlo sampling methods and other quadrature rules, and introduces two specific schemes based on Gaussian and Nearest Neighbors bases.

Model Evidence with Fast Tree Based Quadrature

TLDR
Tree Quadrature (TQ) is presented, which places no qualifications on how the samples provided to it are obtained, allowing it to use state-of-the-art sampling algorithms that are largely ignored by existing integration algorithms.

Uncertainty Quantification for Sparse Deep Learning

TLDR
This paper provides semi-parametric Bernstein-von Mises theorems for linear and quadratic functionals, which guarantee that implied Bayesian credible regions have valid frequentist coverage and provides new theoretical justifications for (Bayesian) deep learning with ReLU activation functions.
...