Corpus ID: 231632729

Yet Another Representation of Binary Decision Trees: A Mathematical Demonstration

@article{Zhang2021YetAR,
  title={Yet Another Representation of Binary Decision Trees: A Mathematical Demonstration},
  author={Jinxiong Zhang},
  journal={ArXiv},
  year={2021},
  volume={abs/2101.07077}
}
A decision tree looks like a simple computational graph without cycles, where only the leaf nodes specify output values and the non-terminal nodes specify tests or split conditions. From a numerical perspective, we express decision trees in the language of computational graphs. We explicitly parameterize the test phase, traversal phase, and prediction phase of a decision tree based on the bitvectors of its non-terminal nodes. As shown, the decision tree is, in some sense, a shallow binary network…
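The bitvector parameterization sketched in the abstract can be illustrated with a small example in the style of QuickScorer (cited below): each internal node stores a mask over the leaves that remain reachable when its test evaluates to False, all tests are evaluated, the failing nodes' masks are ANDed together, and the lowest set bit identifies the exit leaf. The tree shape, feature indices, and thresholds here are illustrative assumptions, not taken from the paper:

```python
# A minimal sketch of bitvector-based decision tree evaluation.
# Complete depth-2 tree: 3 internal nodes, 4 leaves (indexed 0..3 left to right).
# Internal node i tests x[feature[i]] <= threshold[i]; True means "go left".
feature = [0, 1, 1]            # root, left child, right child (assumed values)
threshold = [0.5, 0.3, 0.7]
leaf_value = [10.0, 20.0, 30.0, 40.0]

# If a node's test is False we go right, so the leaves of its *left* subtree
# become unreachable. Bit k corresponds to leaf k.
# node 0 (root): left subtree = leaves {0,1} -> mask keeps {2,3}
# node 1:        left subtree = leaf {0}    -> mask keeps {1,2,3}
# node 2:        left subtree = leaf {2}    -> mask keeps {0,1,3}
false_mask = [0b1100, 0b1110, 0b1011]

def predict(x):
    reachable = 0b1111                      # every leaf reachable initially
    for i in range(3):                      # visit every internal node
        if not (x[feature[i]] <= threshold[i]):
            reachable &= false_mask[i]      # test failed: drop left-subtree leaves
    # The exit leaf is the lowest-index leaf still marked reachable.
    leaf = (reachable & -reachable).bit_length() - 1
    return leaf_value[leaf]
```

For instance, `predict([0.4, 0.5])` fails only node 1's test, leaving mask `0b1110`, whose lowest set bit selects leaf 1. Note that the loop needs no branching on tree structure at all, which is what makes this representation amenable to the computational-graph view taken in the paper.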


References

SHOWING 1-10 OF 46 REFERENCES
QuickScorer: Efficient Traversal of Large Ensembles of Decision Trees
TLDR
QuickScorer is presented, a novel algorithm for the traversal of huge ensembles of decision trees that, thanks to a cache- and CPU-aware design, provides a speedup over the best competitors.
Efficient Non-greedy Optimization of Decision Trees
TLDR
It is shown that the problem of finding optimal linear-combination splits for decision trees is related to structured prediction with latent variables; a convex-concave upper bound on the tree's empirical loss is formed, and stochastic gradient descent enables effective training with large datasets.
The Tree Ensemble Layer: Differentiability meets Conditional Computation
TLDR
This work introduces a new layer for neural networks, composed of an ensemble of differentiable decision trees, and provides an open-source TensorFlow implementation with a Keras API.
Deep Neural Decision Trees
TLDR
This work presents Deep Neural Decision Trees (DNDT) -- tree models realised by neural networks, which can be easily implemented in NN toolkits, and trained with gradient descent rather than greedy splitting.
Deep Neural Decision Forests
TLDR
A novel approach that unifies classification trees with the representation learning functionality known from deep convolutional networks by introducing a stochastic and differentiable decision tree model, trained in an end-to-end manner.
End-to-End Learning of Decision Trees and Forests
TLDR
This work presents an end-to-end learning scheme for deterministic decision trees and decision forests that can learn more complex split functions than common oblique ones, and facilitates interpretability through spatial regularization.
Applications of forbidden 0-1 matrices to search tree and path compression-based data structures
TLDR
This paper improves, reproves, and simplifies several theorems on the performance of data structures based on path compression and search trees, and presents the first asymptotically sharp bound on the length of arbitrary path compressions on arbitrary trees.
Entropy Nets: From Decision Trees to Neural Networks
TLDR
This paper shows how the mapping of decision trees into a multilayer neural network structure can be exploited for the systematic design of a class of layered neural networks, called entropy nets, that have far fewer connections.
Learning Certifiably Optimal Rule Lists for Categorical Data
TLDR
The results indicate that it is possible to construct optimal sparse rule lists that are approximately as accurate as the COMPAS proprietary risk prediction tool on data from Broward County, Florida, but that are completely interpretable.
Hybrid decision tree
...