Improved Class Probability Estimates from Decision Tree Models

@inproceedings{Margineantu2001IC,
  title={Improved Class Probability Estimates from Decision Tree Models},
  author={Dragos D. Margineantu and Thomas G. Dietterich},
  year={2001}
}
Decision tree models typically give good classification decisions but poor probability estimates. In many applications, it is important to have good probability estimates as well. This paper introduces a new algorithm, Bagged Lazy Option Trees (B-LOTs), for constructing decision trees and compares it to an alternative, Bagged Probability Estimation Trees (B-PETs). The quality of the class probability estimates produced by the two methods is evaluated in two ways. First, we compare the ability…
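To make the comparison concrete, here is a minimal sketch of the bagged probability-estimation-tree idea the abstract describes: grow unpruned trees on bootstrap samples, apply a Laplace correction to each leaf's class frequencies, and average the per-tree estimates. The function name, parameters, and use of scikit-learn are illustrative assumptions, not the authors' exact B-PET procedure.

```python
# Sketch of bagged probability estimation trees (B-PET-style), assuming
# scikit-learn is available. This is NOT the paper's exact algorithm.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagged_pet_proba(X_train, y_train, X_test, n_trees=25, seed=0):
    """Average Laplace-corrected leaf probabilities over bootstrapped trees."""
    rng = np.random.default_rng(seed)
    n = len(X_train)
    k = len(np.unique(y_train))
    probs = np.zeros((len(X_test), k))
    for _ in range(n_trees):
        idx = rng.integers(0, n, size=n)              # bootstrap sample
        tree = DecisionTreeClassifier().fit(X_train[idx], y_train[idx])
        leaves_train = tree.apply(X_train[idx])       # leaf id per training point
        leaves_test = tree.apply(X_test)
        for i, leaf in enumerate(leaves_test):
            in_leaf = y_train[idx][leaves_train == leaf]
            counts = np.bincount(in_leaf, minlength=k)
            # Laplace correction: (n_c + 1) / (n_leaf + K) keeps estimates
            # away from the degenerate 0/1 frequencies of raw leaves
            probs[i] += (counts + 1) / (counts.sum() + k)
    return probs / n_trees
```

Averaging over many trees smooths the piecewise-constant estimates of a single tree, which is the main reason bagged variants produce better-calibrated class probabilities than one pruned tree.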
This paper has 39 citations.

References

Showing 1-10 of 27 references

Well-Trained PETs: Improving Probability Estimation Trees

Foster Provost, Pedro Domingos
New York University • 2000

Random Forests

Leo Breiman
Machine Learning • 2001

Lazy decision trees

Jerome H. Friedman, Ron Kohavi, Yeogirl Yun
2000

A Bayesian CART algorithm

D. G. T. Denison, B. K. Mallick, A. F. M. Smith
Biometrika, 85:363–377 • 1998

Bayesian CART model search (with discussion)

H. Chipman, E. George, R. McCulloch
Journal of the American Statistical Association, 93:935–960 • 1998

An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization

Thomas G. Dietterich
Machine Learning • 1998
