Is Combining Classifiers Better than Selecting the Best One

@inproceedings{Dzeroski2002IsCC,
  title={Is Combining Classifiers Better than Selecting the Best One},
  author={Saso Dzeroski and Bernard Zenko},
  booktitle={ICML},
  year={2002}
}
We empirically evaluate several state-of-the-art methods for constructing ensembles of heterogeneous classifiers with stacking and show that they perform (at best) comparably to selecting the best classifier from the ensemble by cross-validation. We then propose a new method for stacking that uses multi-response model trees at the meta-level, and show that it clearly outperforms existing stacking approaches and selecting the best classifier by cross-validation.
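To make the comparison in the abstract concrete, the sketch below contrasts the two strategies on an illustrative dataset, assuming scikit-learn is available (the dataset and learners are stand-ins, not the paper's experimental setup): it selects the single best base classifier by cross-validation, then evaluates a stacked ensemble of the same heterogeneous classifiers. Logistic regression serves as the meta-level learner purely as a placeholder, since scikit-learn provides no multi-response model trees.

# Minimal sketch, not the paper's method: stacking heterogeneous classifiers
# versus selecting the best single classifier by cross-validation.
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)  # illustrative dataset only

# Heterogeneous base-level classifiers.
base_learners = [
    ("tree", DecisionTreeClassifier(random_state=0)),
    ("knn", KNeighborsClassifier()),
    ("nb", GaussianNB()),
]

# Baseline: pick the single best base classifier by 10-fold cross-validation.
cv_scores = {name: cross_val_score(clf, X, y, cv=10).mean()
             for name, clf in base_learners}
best = max(cv_scores, key=cv_scores.get)
print(f"best single classifier: {best} ({cv_scores[best]:.3f})")

# Stacking: cross-validated base-level predictions become features for a
# meta-level learner. Logistic regression stands in for the paper's
# multi-response model trees, which scikit-learn does not implement.
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(max_iter=1000),
    cv=10,
)
print(f"stacking: {cross_val_score(stack, X, y, cv=10).mean():.3f}")

Note that StackingClassifier trains the meta-level learner on cross-validated predictions of the base classifiers, which mirrors the general stacking setup the abstract evaluates; the paper's specific contribution is the choice of meta-level learner.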
This paper has 161 citations.

Citations

Publications citing this paper.
Showing 1 of 69 extracted citations:

GA-stacking: Evolutionary stacked generalization

