• Corpus ID: 249461946

Bayesian additive regression trees for probabilistic programming

  Miriana Quiroga, Pablo G. Garay, Juan M. Alonso, Juan Martín Loyola, and Osvaldo A. Martin
Bayesian additive regression trees (BART) is a non-parametric method for approximating functions. It is a black-box method based on a sum of many trees, where priors are used to regularize inference, mainly by restricting each tree's learning capacity so that no individual tree is able to explain the data on its own; only the sum of trees can. We discuss BART in the context of probabilistic programming languages (PPLs), i.e. we present BART as a primitive that can be used as a component of a probabilistic… 
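The regularization idea described above can be illustrated with a minimal, self-contained sketch (not the paper's implementation): each tree is a shallow "weak learner" whose leaf values are kept small by the prior, so a useful signal only emerges from the sum. All function names and the stump-only tree shape are simplifying assumptions for illustration.

```python
import random
import math

def sample_stump(num_features, leaf_scale):
    """Draw one weak tree: a single random split with small Gaussian leaf
    values. The small leaf_scale plays the role of BART's regularizing
    prior, preventing any individual tree from explaining the data."""
    feature = random.randrange(num_features)
    threshold = random.uniform(0.0, 1.0)
    left = random.gauss(0.0, leaf_scale)
    right = random.gauss(0.0, leaf_scale)

    def predict(x):
        return left if x[feature] <= threshold else right

    return predict

def sample_sum_of_trees(num_trees=50, num_features=3, total_scale=0.5):
    """Draw a random function from a toy sum-of-trees prior. Scaling each
    leaf by total_scale/sqrt(num_trees) keeps the variance of the sum
    roughly constant as more trees are added."""
    leaf_scale = total_scale / math.sqrt(num_trees)
    trees = [sample_stump(num_features, leaf_scale) for _ in range(num_trees)]

    def f(x):
        return sum(tree(x) for tree in trees)

    return f

random.seed(0)
f = sample_sum_of_trees()
print(f([0.2, 0.7, 0.5]))  # one prior draw evaluated at a point
```

In a PPL such a sampler would be wrapped as a distribution-like primitive, so the random function `f` can appear inside a larger probabilistic model alongside ordinary random variables.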

BART: Bayesian Additive Regression Trees

We develop a Bayesian "sum-of-trees" model where each tree is constrained by a regularization prior to be a weak learner, and fitting and inference are accomplished via an iterative Bayesian backfitting MCMC algorithm.

Particle Gibbs for Bayesian Additive Regression Trees

A novel sampler for BART is presented, based on the Particle Gibbs (PG) algorithm and a top-down particle filtering algorithm for Bayesian decision trees (Lakshminarayanan et al., 2013); it outperforms existing samplers in many settings.

GP-BART: a novel Bayesian additive regression trees approach using Gaussian processes

Gaussian processes Bayesian additive regression trees (GP-BART) is proposed as an extension of BART that assumes Gaussian process (GP) priors for the predictions of each terminal node across all trees.

Generalized Bayesian Additive Regression Trees Models: Beyond Conditional Conjugacy

This article greatly expands the domain of applicability of BART to arbitrary generalized BART models by introducing a very simple, tuning-parameter-free, reversible jump Markov chain Monte Carlo algorithm.

Bayesian additive regression trees and the General BART model

A framework, the General BART model, is discussed that unifies some of the recent BART extensions, including semiparametric models, correlated outcomes, statistical matching problems in surveys, and models with weaker distributional assumptions.

Bayesian Regression Trees for High-Dimensional Prediction and Variable Selection

A Bayesian point of view is taken, showing how to construct priors on decision-tree ensembles that adapt to sparsity in the predictors by placing a sparsity-inducing Dirichlet hyperprior on the splitting proportions of the regression-tree prior.
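The sparsity-inducing Dirichlet hyperprior on splitting proportions can be sketched in a few lines (an illustrative toy, not the paper's code; the function names and the concentration value are assumptions). With concentration parameters below one, a Dirichlet draw piles most of its mass on a few predictors, so trees rarely split on irrelevant variables.

```python
import random

def sample_split_proportions(num_predictors, alpha=0.5):
    """Draw splitting proportions s ~ Dirichlet(alpha/P, ..., alpha/P)
    via normalized Gamma draws. With alpha/P < 1 the draw concentrates
    mass on a few predictors, which is what induces sparsity."""
    gammas = [random.gammavariate(alpha / num_predictors, 1.0)
              for _ in range(num_predictors)]
    total = sum(gammas)
    return [g / total for g in gammas]

def sample_split_variable(proportions):
    """Each split in a tree then chooses its predictor according to s,
    instead of uniformly at random as in standard BART."""
    return random.choices(range(len(proportions)), weights=proportions)[0]

random.seed(1)
s = sample_split_proportions(num_predictors=10, alpha=0.5)
print(s)  # most of the mass typically lands on one or two coordinates
```

Because the same proportions are shared by every split in every tree, predictors that the data favor are reinforced across the whole ensemble during posterior sampling.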

Bayesian Additive Regression Trees: A Review and Look Forward

The basic approach to BART is presented, and further developments of the original algorithm that support a variety of data structures and assumptions are discussed, including augmentations of the prior specification to accommodate higher-dimensional data and smoother functions.

Bayesian regression tree ensembles that adapt to smoothness and sparsity

  • A. Linero, Yun Yang
  • Computer Science
    Journal of the Royal Statistical Society: Series B (Statistical Methodology)
  • 2018
This work implements sparsity-inducing soft decision trees in which the decisions are treated as probabilistic; the resulting model adapts to the unknown smoothness and sparsity levels and can be implemented by making minimal modifications to existing Bayesian additive regression tree algorithms.
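A "soft" decision can be sketched as a logistic gate: instead of sending an observation entirely left or right, the split assigns it left with a probability that depends smoothly on the covariate. This is a minimal illustration under assumed names and a logistic gating function; as the bandwidth shrinks, the hard decision rule is recovered.

```python
import math

def soft_split(x_value, threshold, bandwidth):
    """Probability of routing an observation to the left child.
    Small x_value relative to threshold -> probability near 1;
    bandwidth -> 0 recovers the usual hard split."""
    return 1.0 / (1.0 + math.exp((x_value - threshold) / bandwidth))

def soft_stump_predict(x_value, threshold, bandwidth, left_leaf, right_leaf):
    """A depth-one soft tree: the prediction is a weighted average of
    the two leaves, so the fitted function is smooth in x, not a step."""
    p_left = soft_split(x_value, threshold, bandwidth)
    return p_left * left_leaf + (1.0 - p_left) * right_leaf

# Exactly at the threshold, both leaves are weighted equally:
print(soft_stump_predict(0.5, 0.5, 0.1, -1.0, 1.0))  # 0.0
```

Summing many such soft trees yields an ensemble whose predictions vary smoothly in the covariates, which is what lets the model adapt to unknown smoothness.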

Efficient Metropolis–Hastings Proposal Mechanisms for Bayesian Regression Tree Models

This paper develops novel proposal mechanisms for efficient sampling in the Bayesian additive regression tree (BART) model, implements the resulting sampling algorithm, and demonstrates its effectiveness on a prediction problem from computer experiments and on a test function where structural tree variability is needed to fully explore the posterior.