Bagging is an ensemble method that uses random resampling of a dataset to construct models. In classification scenarios, the random resampling procedure in bagging induces a classification margin over the dataset. In addition, when bagging is performed in different feature subspaces, the resulting classification margins are likely to be diverse. We take into…
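The core mechanism the abstract describes — training each model on a bootstrap resample and combining by majority vote — can be sketched in a few lines. This is a minimal illustration, not the paper's method: the 1-D dataset, the stump base learner, and all function names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D dataset: class 1 when x > 0 (hypothetical, for illustration only).
X = rng.normal(size=200)
y = (X > 0).astype(int)

def fit_stump(Xb, yb):
    # Base learner: pick the threshold minimizing training error.
    thresholds = np.unique(Xb)
    errs = [np.mean((Xb > t).astype(int) != yb) for t in thresholds]
    return thresholds[int(np.argmin(errs))]

def bagging(X, y, n_models=25):
    # Each model is fit on a bootstrap resample (sampling with replacement).
    stumps = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))
        stumps.append(fit_stump(X[idx], y[idx]))
    return stumps

def predict(stumps, X):
    # Aggregate by majority vote across the ensemble.
    votes = np.stack([(X > t).astype(int) for t in stumps])
    return (votes.mean(axis=0) >= 0.5).astype(int)

stumps = bagging(X, y)
acc = np.mean(predict(stumps, X) == y)
```

The vote margin — how far the fraction of agreeing models is from 1/2 — is one way to make concrete the "classification margin" the abstract refers to.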
This paper studies whether the smallest eigenvalue of constrained linear combinations of symmetric matrices can reach a desired value, which extends the mathematical problem of finding a positive definite linear combination of symmetric matrices (PDLC), and provides a universal framework for maximizing the minimal eigenvalue of linear…
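The objective being maximized can be illustrated numerically: for symmetric matrices A_i and coefficients c, evaluate the smallest eigenvalue of the combination and search over unit-norm c. This is a hedged sketch of the problem statement only, assuming two random 4×4 matrices; it is a brute-force grid search, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def sym(n):
    # Random symmetric matrix (stand-in for the A_i in the paper).
    M = rng.normal(size=(n, n))
    return (M + M.T) / 2

A1, A2 = sym(4), sym(4)

def lam_min(c):
    # Smallest eigenvalue of the linear combination c[0]*A1 + c[1]*A2.
    # eigvalsh returns eigenvalues in ascending order.
    return np.linalg.eigvalsh(c[0] * A1 + c[1] * A2)[0]

# Coarse grid search over unit-norm coefficient vectors: the max-min
# eigenvalue objective the abstract refers to, solved naively.
best_val, best_t = max(
    ((lam_min((np.cos(t), np.sin(t))), t)
     for t in np.linspace(0.0, 2 * np.pi, 360)),
    key=lambda p: p[0],
)
```

If `best_val > 0`, the chosen combination is positive definite, which connects this max-min formulation back to the PDLC problem the abstract mentions.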
AdaBoost is a well-known boosting method for generating a strong ensemble from weak base learners. The procedure of AdaBoost can be fitted into a gradient descent optimization framework, which is important for analyzing and devising its procedure. Cost-sensitive boosting (CSB) is an emerging subject extending boosting methods to cost-sensitive classification…
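The gradient-descent view mentioned in the abstract reads AdaBoost as stagewise minimization of the exponential loss: each round fits a weak learner to reweighted data and takes a step of size alpha along it. Below is a standard discrete AdaBoost with decision stumps as a sketch of that procedure; the 2-D toy dataset and all names are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)  # labels in {-1, +1}

def best_stump(X, y, w):
    # Exhaustive search for the axis-aligned stump with minimal weighted error.
    best = (np.inf, 0, 0.0, 1)  # (error, feature, threshold, sign)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = s * np.where(X[:, j] > t, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, j, t, s)
    return best

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        err, j, t, s = best_stump(X, y, w)
        err = max(err, 1e-12)
        # Step size minimizing the exponential loss along this weak learner.
        alpha = 0.5 * np.log((1 - err) / err)
        pred = s * np.where(X[:, j] > t, 1, -1)
        # Reweighting = exponentiated functional gradient of the exp loss.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    F = sum(a * s * np.where(X[:, j] > t, 1, -1) for a, j, t, s in ensemble)
    return np.sign(F)

model = adaboost(X, y)
acc = np.mean(predict(model, X) == y)
```

Cost-sensitive variants of the kind the abstract discusses typically modify the sample reweighting step or the loss so that misclassification costs differ per class.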