Gradient Boosting With Piece-Wise Linear Regression Trees

Yu Shi, J. Li, Z. Li
Gradient Boosted Decision Trees (GBDT) is a highly successful ensemble learning algorithm widely used across a variety of applications. Recently, several variants of GBDT training algorithms and implementations have been designed and heavily optimized in popular open-source toolkits, including XGBoost, LightGBM, and CatBoost. In this paper, we show that both the accuracy and efficiency of GBDT can be further enhanced by using more complex base learners. Specifically, we extend gradient…
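The idea of "more complex base learners" can be illustrated with a minimal sketch: instead of constant-valued leaves, each tree leaf fits a linear model on the data that falls into it. The snippet below is a toy one-feature illustration of that idea, not the paper's algorithm; all helper names (`fit_linear`, `fit_stump`, `boost`) are hypothetical.

```python
import numpy as np

def fit_linear(x, y):
    # Least-squares fit y ≈ a*x + b on a single feature (toy helper).
    A = np.vstack([x, np.ones_like(x)]).T
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef  # (a, b)

def fit_stump(x, y):
    # Depth-1 tree whose two leaves hold linear models instead of constants.
    best = None
    for t in np.unique(x)[1:]:
        left, right = x < t, x >= t
        err = 0.0
        for mask in (left, right):
            a, b = fit_linear(x[mask], y[mask])
            err += np.sum((y[mask] - (a * x[mask] + b)) ** 2)
        if best is None or err < best[0]:
            best = (err, t, fit_linear(x[left], y[left]),
                    fit_linear(x[right], y[right]))
    _, t, (al, bl), (ar, br) = best
    return lambda z: np.where(z < t, al * z + bl, ar * z + br)

def boost(x, y, rounds=20, lr=0.5):
    # Standard gradient boosting under squared error: each round fits
    # the current residuals with a piecewise-linear stump.
    pred = np.zeros_like(y)
    learners = []
    for _ in range(rounds):
        h = fit_stump(x, y - pred)
        pred += lr * h(x)
        learners.append(h)
    return lambda z: lr * sum(h(z) for h in learners)

x = np.linspace(0.0, 1.0, 50)
y = np.abs(x - 0.5)        # piecewise-linear target
model = boost(x, y)
```

Because each leaf is itself linear, a single stump can already capture this kink-shaped target almost exactly, whereas constant-leaf trees would need many more splits; this is the intuition behind the accuracy and efficiency gains claimed above.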
4 Citations
Gradient Boosted Trees with Extrapolation
  • Alexey Malistov, A. Trushin
  • 2019 18th IEEE International Conference on Machine Learning and Applications (ICMLA), 2019
An Extension of Gradient Boosted Decision Tree Incorporating Statistical Tests
StackPDB: predicting DNA-binding proteins based on XGB-RFE feature optimization and stacked ensemble classifier


References

LightGBM: A Highly Efficient Gradient Boosting Decision Tree
Additive logistic regression: A statistical view of boosting
GPU-acceleration for Large-scale Tree Boosting
XGBoost: A Scalable Tree Boosting System
Deep Neural Decision Forests
Scalable look-ahead linear regression trees
Parallel boosted regression trees for web search ranking
Greedy function approximation: A gradient boosting machine
CatBoost: unbiased boosting with categorical features