Logistic Model Trees

@article{Landwehr2005LogisticMT,
  title={Logistic Model Trees},
  author={Niels Landwehr and Mark A. Hall and Eibe Frank},
  journal={Machine Learning},
  year={2005},
  volume={59},
  pages={161--205}
}
Tree induction methods and linear models are popular techniques for supervised learning tasks, both for the prediction of nominal classes and numeric values. For predicting numeric quantities, there has been work on combining these two schemes into ‘model trees’, i.e. trees that contain linear regression functions at the leaves. In this paper, we present an algorithm that adapts this idea for classification problems, using logistic regression instead of linear regression. We use a stagewise…
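The abstract above describes trees with logistic regression functions at the leaves. A minimal sketch of that idea on synthetic data — a single hand-chosen split with a separate logistic regression fitted in each leaf — can illustrate it; this is an illustration only, not the paper's LMT algorithm, which grows the tree with stagewise LogitBoost fitting and CART-style pruning:

```python
# Depth-1 "logistic model tree" sketch: one split on a feature, with a
# separate logistic regression in each leaf. The split point is hand-chosen
# for the illustration rather than learned.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic 2-class data whose linear decision boundary differs on each
# side of x0 = 0, so one global logistic regression cannot fit it well.
X = rng.normal(size=(400, 2))
y = np.where(X[:, 0] < 0, (X[:, 1] > 0).astype(int), (X[:, 1] < 0).astype(int))

split_feature, split_value = 0, 0.0  # hand-chosen split for the sketch
left = X[:, split_feature] < split_value

# One logistic regression per leaf.
leaf_models = {
    True: LogisticRegression().fit(X[left], y[left]),
    False: LogisticRegression().fit(X[~left], y[~left]),
}

def predict(X_new):
    """Route each row to its leaf and apply that leaf's model."""
    go_left = X_new[:, split_feature] < split_value
    out = np.empty(len(X_new), dtype=int)
    for side in (True, False):
        mask = go_left == side
        if mask.any():
            out[mask] = leaf_models[side].predict(X_new[mask])
    return out

accuracy = (predict(X) == y).mean()
```

Within each leaf the classes are linearly separable, so the two leaf models together recover the piecewise-linear boundary that a single logistic regression would miss.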
Logistic Model Tree: A Survey
Tree induction methods and linear models are popular techniques for supervised learning tasks, both for the prediction of nominal classes and numeric values. For predicting numeric quantities, there…
Speeding Up Logistic Model Tree Induction
This work addresses overfitting in logistic model trees by using the AIC criterion instead of cross-validation, and applies a weight-trimming heuristic that produces a significant speedup.
Fast incremental learning of logistic model tree using least angle regression
The proposed algorithm is not only accurate and intuitively interpretable but also computationally efficient, helping users make the best possible use of the data included in expert and intelligent systems.
Rotation-based model trees for classification
  • S. Kotsiantis
  • Mathematics, Computer Science
  • Int. J. Data Anal. Tech. Strateg.
  • 2010
A comparison with other well-known ensembles of decision trees on standard benchmark data sets shows that the proposed technique performs better in most cases.
Learning Random Model Trees for Regression
An improved model tree algorithm, RMT, introduces randomness into the process of building model trees, providing an effective data mining algorithm for applications where high-accuracy regression is required.
Model tree pruning
Experimental results and algorithmic analysis show that, for the ELM model tree, post-pruning achieves better performance than pre-pruning, which has previously been widely regarded as one of the most popular decision tree generation strategies.
Logistic Model Tree With Modified AIC
Logistic Model Trees have been shown to be very accurate and compact classifiers. Their greatest disadvantage is the computational complexity of inducing the logistic regression models in the tree…
Stepwise Induction of Logistic Model Trees
A novel logistic model tree induction system, SILoRT, induces trees with two types of nodes: regression nodes, which perform only univariate logistic regression, and splitting nodes, which partition the feature space.
Measure Inducing Classification and Regression Trees for Functional Data
We propose a tree-based algorithm for classification and regression problems in the context of functional data analysis, which allows leveraging representation learning and multiple splitting rules…
SMT: Sparse multivariate tree
The recursive partitioning of a simple decision tree combined with the intrinsic feature selection of L1-regularized logistic regression (LR) at each node is a natural choice for a multivariate tree model that is simple but broadly applicable.

References

Showing 1–10 of 40 references
Logistic Model Trees
This paper uses a stagewise fitting process to construct the logistic regression models, which can select relevant attributes in the data in a natural way, and shows how the logistic regression models at the leaves can be built by incrementally refining those constructed at higher levels in the tree.
Tree Induction Vs Logistic Regression: A Learning Curve Analysis
A large-scale experimental comparison of logistic regression and tree induction, assessing classification accuracy and the quality of rankings based on class-membership probabilities, with a learning-curve analysis used to examine how these measures relate to training-set size.
Using Model Trees for Classification
Surprisingly, using this simple transformation the model tree inducer M5′, based on Quinlan's M5, generates more accurate classifiers than the state-of-the-art decision tree learner C5.0, particularly when most of the attributes are numeric.
Functional Trees
This work introduces a simple unifying framework for multivariate tree learning that combines a univariate decision tree with a linear function by means of constructive induction, and uses the bias-variance decomposition of the error, cluster analysis, and learning curves as analysis tools.
Trading-Off Local versus Global Effects of Regression Nodes in Model Trees
A method for the top-down induction of model trees, the Stepwise Model Tree Induction (SMOTI) method, whose main characteristic is the induction of trees with two types of nodes: regression nodes, which perform only straight-line regression, and split nodes, which partition the sample space.
LOTUS: An Algorithm for Building Accurate and Comprehensible Logistic Regression Trees
Logistic regression is a powerful technique for fitting models to data with a binary response variable, but the models are difficult to interpret if collinearity, nonlinearity, or interactions are…
Scaling Up the Accuracy of Naive-Bayes Classifiers: A Decision-Tree Hybrid
  • R. Kohavi
  • Mathematics, Computer Science
  • KDD
  • 1996
A new algorithm, NBTree, induces a hybrid of decision-tree and Naive-Bayes classifiers: the decision-tree nodes contain univariate splits as in regular decision trees, but the leaves contain Naive-Bayes classifiers.
Additive Logistic Regression: A Statistical View of Boosting
This work develops more direct approximations of boosting that exhibit performance comparable to other recently proposed multi-class generalizations of boosting, and suggests a minor modification to boosting that can reduce computation, often by factors of 10 to 50.
Tree Structured Interpretable Regression
The method can naturally be applied to very large datasets in which only a small proportion of the predictors are useful; the resulting regression rules are more easily interpreted and applied, and may be more accurate in application.
A Comparison of Prediction Accuracy, Complexity, and Training Time of Thirty-Three Old and New Classification Algorithms
Among decision tree algorithms with univariate splits, C4.5, IND-CART, and QUEST have the best combinations of error rate and speed, but C4.5 tends to produce trees with twice as many leaves as those from IND-CART and QUEST.