A Further Comparison of Splitting Rules for Decision-Tree Induction

@article{Buntine1992AFC,
  title={A Further Comparison of Splitting Rules for Decision-Tree Induction},
  author={Wray L. Buntine and Tim Niblett},
  journal={Machine Learning},
  year={1992},
  volume={8},
  pages={75--85}
}
One approach to learning classification rules from examples is to build decision trees. A review and comparison paper by Mingers (Mingers, 1989) looked at the first stage of tree building, which uses a "splitting rule" to grow trees with a greedy recursive partitioning algorithm. That paper considered a number of different measures and experimentally examined their behavior on four domains. The main conclusion was that a random splitting rule does not significantly decrease classificational…
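To make the greedy recursive-partitioning idea concrete, here is a minimal sketch of how a splitting rule scores candidate attributes. The information-gain measure shown is one of the standard rules compared in this literature, not necessarily the one the paper recommends; all function names are illustrative.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, attribute_values):
    """Reduction in entropy from partitioning `labels` by an attribute's values."""
    n = len(labels)
    partitions = {}
    for label, value in zip(labels, attribute_values):
        partitions.setdefault(value, []).append(label)
    remainder = sum(len(part) / n * entropy(part) for part in partitions.values())
    return entropy(labels) - remainder

def best_split(labels, attributes):
    """Greedy step: pick the attribute whose split scores highest.
    `attributes` maps attribute name -> list of values aligned with `labels`."""
    return max(attributes, key=lambda a: information_gain(labels, attributes[a]))
```

A tree grower would call `best_split` at each node, partition the examples by the chosen attribute, and recurse on each partition; swapping in a different scoring function (gain ratio, Gini, or even a random score) changes only `information_gain`.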


Citations

Publications citing this paper.
170 citations in total; estimated 97% coverage.

Very Simple Classification Rules Perform Well on Most Commonly Used Datasets

  • Machine Learning
  • 1993

Principles of Data Mining and Knowledge Discovery

  • Lecture Notes in Computer Science
  • 1999

A Comparative Analysis of Methods for Pruning Decision Trees

  • IEEE Trans. Pattern Anal. Mach. Intell.
  • 1997

The Importance of Attribute Selection Measures in Decision Tree Induction

  • Machine Learning
  • 1994


CITATION STATISTICS

  • 13 highly influenced citations

  • An average of 4 citations per year over the last 3 years