Corpus ID: 14669208

Arcing Classifiers

@inproceedings{Breiman1998ArcingC,
  title={Arcing Classifiers},
  author={L. Breiman},
  year={1998}
}
Recent work has shown that combining multiple versions of unstable classifiers, such as trees or neural nets, results in reduced test set error. One of the more effective methods is bagging (Breiman [1996a]). Here, modified training sets are formed by resampling from the original training set, classifiers are constructed using these training sets, and the classifiers are then combined by voting. Freund and Schapire [1995, 1996] propose an algorithm the basis of which is to adaptively resample and combine (hence the acronym "arcing")…
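The bagging procedure the abstract describes can be sketched in a few lines: bootstrap-resample the training set, fit one classifier per resample, and combine the fitted classifiers by majority vote. The sketch below is illustrative only, not Breiman's implementation; the 1-D threshold "stump" base classifier and all names (`train_stump`, `bagged_classifier`) are assumptions chosen to keep the example self-contained.

```python
import random
from collections import Counter

def train_stump(data):
    """Fit a 1-D threshold stump on (x, y) pairs with y in {0, 1}:
    predict `polarity` if x >= t, else 1 - polarity, choosing the
    threshold/polarity pair that minimises training error."""
    best = (None, None, len(data) + 1)  # (threshold, polarity, errors)
    for t in sorted({x for x, _ in data}):
        for polarity in (1, 0):
            errors = sum(1 for x, y in data
                         if (polarity if x >= t else 1 - polarity) != y)
            if errors < best[2]:
                best = (t, polarity, errors)
    t, polarity, _ = best
    return lambda x: polarity if x >= t else 1 - polarity

def bagged_classifier(data, n_rounds=25, seed=0):
    """Bagging: form modified training sets by resampling with
    replacement, train one classifier per resample, and combine
    the resulting classifiers by majority vote."""
    rng = random.Random(seed)
    members = []
    for _ in range(n_rounds):
        resample = [rng.choice(data) for _ in data]  # bootstrap sample
        members.append(train_stump(resample))
    def vote(x):
        return Counter(c(x) for c in members).most_common(1)[0][0]
    return vote

# Toy training set: label is 1 iff x >= 0.5.
data = [(x / 10, int(x >= 5)) for x in range(10)]
clf = bagged_classifier(data)
```

The arcing algorithms the paper compares this against differ in the resampling step: instead of uniform bootstrap sampling, examples are resampled with probabilities that adapt round by round to favour previously misclassified cases.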
709 Citations
  • An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants (2,337 citations)
  • A new approach of Boosting using decision tree classifier for classifying noisy data (5 citations)
  • A variant of Rotation Forest for constructing ensemble classifiers (26 citations)
  • A new ensemble classifier creation method by creating new training set for each base classifier (4 citations)
  • Boosting Neural Networks (251 citations)
  • A Diversity Production Approach in Ensemble of Base Classifiers (1 citation)
  • Improving the Predictive Power of AdaBoost: A Case Study in Classifying Borrowers (6 citations)
  …

References

Showing 1–10 of 16 references
  • Experiments with a New Boosting Algorithm (8,046 citations)
  • Boosting Decision Trees (252 citations)
  • The strength of weak learnability (1,367 citations)
  • Bias Plus Variance Decomposition for Zero-One Loss Functions (680 citations)
  • Efficient Pattern Recognition Using a New Transformation Distance (578 citations)
  • Neural Networks and the Bias/Variance Dilemma (3,264 citations)
  • Cryptographic Limitations on Learning Boolean Formulae and Finite Automata (603 citations)
  • Classification and Regression Trees (17,534 citations)
  …