Boosting a weak learning algorithm by majority

@article{Freund1990BoostingAW,
  title={Boosting a weak learning algorithm by majority},
  author={Yoav Freund},
  journal={Inf. Comput.},
  year={1995},
  volume={121},
  pages={256--285}
}
  • Y. Freund
  • Published 1 September 1995
  • Computer Science
  • Inf. Comput.
Abstract: We present an algorithm for improving the accuracy of algorithms for learning binary concepts. The improvement is achieved by combining a large number of hypotheses, each of which is generated by training the given learning algorithm on a different set of examples. Our algorithm is based on ideas presented by Schapire and represents an improvement over his results. The analysis of our algorithm provides general upper bounds on the resources required for learning in Valiant's polynomial…
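As a rough illustration of the combining step the abstract describes (many hypotheses, each trained on a different set of examples, merged into one predictor), here is a minimal Python sketch of unweighted majority voting. It is not Freund's boost-by-majority algorithm, which selects training examples and weights votes far more carefully; the random subsampling, the decision-stump weak learner from scikit-learn, and all parameter values are illustrative assumptions.

import numpy as np
from sklearn.tree import DecisionTreeClassifier  # assumed weak learner

def train_hypotheses(X, y, n_hypotheses=25, sample_frac=0.5, seed=None):
    # Train each weak hypothesis on a different random subset of the examples.
    rng = np.random.default_rng(seed)
    n = len(X)
    hypotheses = []
    for _ in range(n_hypotheses):
        idx = rng.choice(n, size=int(sample_frac * n), replace=False)
        stump = DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx])
        hypotheses.append(stump)
    return hypotheses

def predict_majority(hypotheses, X):
    # Combine binary {0, 1} predictions by an unweighted majority vote.
    votes = np.stack([h.predict(X) for h in hypotheses])  # shape (T, n_samples)
    return (votes.mean(axis=0) >= 0.5).astype(int)

The point of the sketch is only the structure: the final hypothesis is a vote over many runs of the same base learner on different example sets, which is the mechanism the paper analyzes (with a specific example-selection scheme and error bound).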
Connecting Interpretability and Robustness in Decision Trees through Separation
TLDR
This paper rigorously investigates the connection between robustness and interpretability using decision trees and robustness to l1-perturbation, and provides the first algorithm with provable guarantees on robustness, interpretability, and accuracy in the context of decision trees.
Contribution to the modeling and inference of gene regulatory networks
TLDR
The components are estimated via a hierarchical Gram-Schmidt orthogonalization procedure, aimed at constructing an approximation of the analytic basis, and an L2-Boosting procedure to reconstruct a sparse approximation of the signal.
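For orientation, the generic form of the L2-Boosting step mentioned above is simply repeated fitting of a base learner to the current residuals, adding a shrunken copy of each fit. The sketch below shows only that generic loop; it is not the thesis's Gram-Schmidt-based procedure, and the regression-tree base learner and shrinkage value nu are assumptions.

import numpy as np
from sklearn.tree import DecisionTreeRegressor  # assumed base learner

def l2_boost(X, y, n_steps=100, nu=0.1):
    # Greedy stagewise fitting under squared loss: each step fits the residuals.
    prediction = np.zeros(len(y))
    learners = []
    for _ in range(n_steps):
        residual = y - prediction                  # negative gradient of squared loss
        base = DecisionTreeRegressor(max_depth=2).fit(X, residual)
        prediction += nu * base.predict(X)         # shrunken update
        learners.append(base)
    return learners

def l2_boost_predict(learners, X, nu=0.1):
    # Sum the shrunken contributions of all fitted base learners.
    return nu * sum(base.predict(X) for base in learners)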
Combining Different Approaches to Improve Arabic Text Documents Classification
TLDR
The results of all models showed that combining classifiers can effectively improve the accuracy of Arabic text documents classification.
Evolvability from learning algorithms
TLDR
It is shown that evolvability is equivalent to learnability by a restricted form of statistical queries, and it is proved that for any fixed distribution D over the instance space, every class of functions learnable by SQs over D is evolvable over D.
On the active use of diversity in the construction of classifier ensembles. Application to the detection of harmful smoke at an industrial site
The influence of diversity in the construction of classifier ensembles has raised many discussions within the Machine Learning community in recent years. …
Robust Boosting via Convex Optimization: Theory and Applications
TLDR
It is shown that boosting can be used to solve large-scale constrained optimization problems whose solutions are well characterizable, and convergence guarantees are derived for a quite general family of boosting algorithms.
On the Existence of Linear Weak Learners and Applications to Boosting
TLDR
This work shows that, under certain natural conditions on the data set, a linear classifier is indeed a weak learner; this result can be directly applied to generalization error bounds for boosting, leading to closed-form bounds.
Boosting with Diverse Base Classifiers
TLDR
Cross-validation experiments are described that suggest Boost-by-Majority can be the basis of a practically useful learning method, often improving on the generalization of AdaBoost on large datasets.
Boosted Histogram Transform for Regression
TLDR
A boosting algorithm for regression problems, called boosted histogram transform for regression (BHTR), is proposed; it is based on histogram transforms composed of random rotations, stretchings, and translations, and shows promising performance on both synthetic and real datasets.
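To make the ingredients named in this summary concrete, one random histogram transform can be read as a random rotation, a per-coordinate stretching, and a translation, followed by integer binning, with each cell predicting the mean response of its training points. The class below is only that illustrative reading, not the BHTR algorithm or its boosting stage; the parameter ranges and the empty-cell fallback are assumptions.

import numpy as np
from collections import defaultdict
from scipy.stats import ortho_group  # random orthogonal (rotation) matrices

class RandomHistogramRegressor:
    # One random histogram transform: rotate, stretch, translate, then bin.
    def __init__(self, dim, seed=None):
        rng = np.random.default_rng(seed)
        self.R = ortho_group.rvs(dim, random_state=seed)  # random rotation
        self.s = rng.uniform(0.5, 2.0, size=dim)          # random stretching
        self.b = rng.uniform(0.0, 1.0, size=dim)          # random translation

    def _cells(self, X):
        # Integer cell index of each point after the affine transform.
        return [tuple(r) for r in np.floor((X @ self.R.T) * self.s + self.b).astype(int)]

    def fit(self, X, y):
        sums, counts = defaultdict(float), defaultdict(int)
        for key, yi in zip(self._cells(X), y):
            sums[key] += float(yi)
            counts[key] += 1
        self.cell_mean_ = {k: sums[k] / counts[k] for k in sums}
        self.fallback_ = float(np.mean(y))                # used for empty cells
        return self

    def predict(self, X):
        return np.array([self.cell_mean_.get(k, self.fallback_) for k in self._cells(X)])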
Towards a combinatorial characterization of bounded memory learning
TLDR
This paper proposes a candidate solution for the case of realizable strong learning under a known distribution, based on the SQ dimension of neighboring distributions, with bounds that match in some regime of parameters.