Among all adaptive filtering algorithms, Widrow and Hoff's Least Mean Square (LMS) algorithm has probably become the most popular because of its robustness, good tracking properties, and simplicity. A drawback of LMS is that the step size implies a compromise between speed of convergence and final misadjustment. Combining LMS filters with different step sizes serves to…
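A minimal sketch of the idea referred to above: two LMS filters with different step sizes run independently, and a convex combination with an adaptively learned mixing parameter fuses their outputs. The step sizes, sigmoid parameterization, and filter length here are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lms_combination(x, d, mu_fast=0.05, mu_slow=0.005, mu_a=1.0, M=8):
    """Convex combination of a fast and a slow LMS filter (illustrative sketch).

    x : input signal, d : desired signal, M : filter length.
    The mixing parameter lam = sigmoid(a) is adapted by gradient descent
    on the combined error, a standard convex-combination scheme.
    """
    w1 = np.zeros(M)          # fast filter (large step size: quick convergence)
    w2 = np.zeros(M)          # slow filter (small step size: low misadjustment)
    a = 0.0                   # mixing pre-parameter
    y = np.zeros(len(d))
    for n in range(M, len(d)):
        u = x[n - M:n][::-1]              # regressor, most recent sample first
        y1, y2 = w1 @ u, w2 @ u
        lam = 1.0 / (1.0 + np.exp(-a))    # mixing weight in (0, 1)
        y[n] = lam * y1 + (1 - lam) * y2
        e1, e2, e = d[n] - y1, d[n] - y2, d[n] - y[n]
        w1 += mu_fast * e1 * u            # independent LMS updates
        w2 += mu_slow * e2 * u
        a += mu_a * e * (y1 - y2) * lam * (1 - lam)  # gradient step on mixing
    return y
```

The combination tends to follow the fast filter during transients and the slow one near convergence, which is how it sidesteps the step-size compromise.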
In this correspondence, we derive an online adaptive one-class support vector machine. The machine structure is updated via growing and pruning mechanisms and the weights are updated using structural risk minimization principles underlying support vector machines. Our approach leads to very compact machines compared to other online kernel methods, whose…
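The abstract does not specify the exact growing and pruning rules, so the following is only a generic NORMA-style online one-class sketch with a crude size budget; the shrinkage, rho update, and smallest-coefficient pruning are illustrative assumptions, not the paper's machine.

```python
import numpy as np

def rbf(x, z, gamma=0.5):
    return np.exp(-gamma * np.sum((x - z) ** 2))

def online_one_class(stream, nu=0.2, eta=0.1, lam=0.1, budget=50, gamma=0.5):
    """NORMA-style online one-class SVM sketch with a size budget."""
    centers, alphas, rho = [], [], 0.0
    for x in stream:
        f = sum(a * rbf(c, x, gamma) for a, c in zip(alphas, centers))
        alphas = [(1 - eta * lam) * a for a in alphas]  # regularization shrinkage
        if f < rho:                     # margin violation: grow the expansion
            centers.append(x)
            alphas.append(eta)
            rho -= eta * (1 - nu)       # subgradient step on rho
        else:
            rho += eta * nu
        if len(centers) > budget:       # prune the smallest-coefficient center
            j = int(np.argmin(np.abs(alphas)))
            centers.pop(j)
            alphas.pop(j)
    return centers, alphas, rho
```

The budget keeps the expansion compact, which is the property the abstract emphasizes relative to other online kernel methods.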
Real AdaBoost is a well-known, well-performing boosting method used to build machine ensembles for classification. Considering that its emphasis function can be decomposed into two factors that pay separate attention to sample errors and to their proximity to the classification border, a generalized emphasis function that combines both components by…
Progressively emphasizing samples that are difficult to classify correctly is the basis for the recognized high performance of Real AdaBoost (RA) ensembles. The corresponding emphasis function can be written as the product of a factor that measures the quadratic error and a factor related to the proximity to the classification border; this fact opens the door…
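The product form mentioned in the two abstracts above can be made explicit. With labels y_i in {-1, +1} (so y_i^2 = 1), completing the square in the exponent of the standard RA emphasis gives the decomposition, and a mixing parameter then generalizes it; the exact parameterization used in the papers may differ from this sketch.

```latex
% Real AdaBoost emphasis on sample i after round t (labels y_i in {-1,+1}):
%   D_{t+1}(i) \propto \exp(-y_i f_t(x_i)).
% Since -2 y_i f_t(x_i) = (y_i - f_t(x_i))^2 - y_i^2 - f_t(x_i)^2 and y_i^2 = 1,
\[
\exp\bigl(-y_i f_t(x_i)\bigr)
  \;\propto\;
  \underbrace{\exp\!\Bigl(\tfrac{1}{2}\bigl(y_i - f_t(x_i)\bigr)^{2}\Bigr)}_{\text{quadratic error}}
  \cdot
  \underbrace{\exp\!\Bigl(-\tfrac{1}{2}\, f_t(x_i)^{2}\Bigr)}_{\text{proximity to border}}
\]
% A lambda-weighted generalization combining both components (illustrative):
\[
D_{\lambda}(i) \;\propto\;
  \exp\!\Bigl(\lambda\,\bigl(y_i - f_t(x_i)\bigr)^{2}\Bigr)\,
  \exp\!\Bigl(-(1-\lambda)\, f_t(x_i)^{2}\Bigr),
  \qquad \lambda \in [0,1],
\]
% which recovers the standard Real AdaBoost emphasis at lambda = 1/2.
```

The second factor peaks where f_t(x_i) is near zero, i.e., for samples close to the classification border, which is why the two components can be traded off against each other.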
The classification of functional or high-dimensional data requires selecting a reduced subset of features from the initial set, both to help fight the curse of dimensionality and to help interpret the problem and the model. The mutual information criterion may be used in that context, but it suffers from the difficulty of its estimation through a…
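To illustrate the criterion in practice, the sketch below ranks features by an estimate of their mutual information with the class label, using scikit-learn's k-nearest-neighbor-based estimator. The estimator choice and the simple ranking scheme are assumptions for illustration, not the paper's procedure; with functional or high-dimensional data this estimate is noisy, which is exactly the difficulty the abstract refers to.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def rank_features_by_mi(X, y, k=10, n_neighbors=3):
    """Rank features by estimated mutual information with the class label.

    X : (n_samples, n_features) data matrix, y : class labels.
    Returns the indices of the k features with the highest estimated MI
    and their MI estimates.
    """
    mi = mutual_info_classif(X, y, n_neighbors=n_neighbors)
    order = np.argsort(mi)[::-1]          # highest estimated MI first
    return order[:k], mi[order[:k]]
```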
This paper introduces a new support vector machine (SVM) formulation to obtain sparse solutions in the primal SVM parameters, providing a new method for feature selection based on SVMs. This new approach adds constraints to the classical ones in order to drop the weights associated with those features that are likely to be irrelevant. A ν-SVM…
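For context only, a common alternative route to sparsity in the primal weights is an L1-penalized linear SVM; this is not the ν-SVM formulation with additional constraints that the paper proposes, just a related baseline that shows how zeroed weights translate into feature selection.

```python
import numpy as np
from sklearn.svm import LinearSVC

def l1_svm_feature_selection(X, y, C=0.1):
    """Feature selection via an L1-penalized linear SVM (related baseline).

    NOT the paper's formulation: merely illustrates primal sparsity.
    Features whose weight is driven to (numerically) zero are treated
    as irrelevant and dropped.
    """
    clf = LinearSVC(penalty="l1", dual=False, C=C).fit(X, y)
    selected = np.flatnonzero(np.abs(clf.coef_).ravel() > 1e-8)
    return selected, clf
```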
This paper shows that new and flexible criteria for resampling populations in boosting algorithms can lead to performance improvements. The Real AdaBoost emphasis function can be divided into two different terms: the first pays attention only to the quadratic error of each pattern, and the second takes into account only the "proximity" of each pattern to the…
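A minimal sketch of resampling a training population under the λ-mixed emphasis shown in the decomposition above; the weighting scheme and λ default are illustrative assumptions rather than the paper's exact criteria.

```python
import numpy as np

def resample_with_mixed_emphasis(X, y, f, lam=0.5, rng=None):
    """Resample the training population with a lambda-mixed emphasis.

    f : current ensemble outputs f_t(x_i); y : labels in {-1, +1}.
    lam=0.5 reproduces the standard Real AdaBoost emphasis; other values
    trade error emphasis against border proximity.
    """
    rng = rng if rng is not None else np.random.default_rng()
    log_w = lam * (y - f) ** 2 - (1 - lam) * f ** 2
    w = np.exp(log_w - log_w.max())       # stabilize before normalizing
    w /= w.sum()
    idx = rng.choice(len(y), size=len(y), replace=True, p=w)
    return X[idx], y[idx]
```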
In this brief, we propose to increase the capabilities of standard Real AdaBoost (RAB) architectures by replacing their linear combinations with a fusion controlled by a gate with fixed kernels. Experimental results on a series of well-known benchmark problems support the effectiveness of this approach in improving classification performance. Although the…
This work explores the possibility of improving the performance of Real AdaBoost ensemble classifiers by replacing their standard linear combination of learners with a gating scheme. This more powerful fusion method is defined following the epoch-by-epoch construction of boosting ensembles. Preliminary experimental results support the potential of this new…
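The two abstracts above replace the fixed linear combination f(x) = sum_t alpha_t h_t(x) with a data-dependent gate. The sketch below modulates each alpha_t by a softmax gate built on fixed RBF kernels; the gate form, kernel placement, and the (untrained, random) gate parameters V are all illustrative assumptions, not the papers' construction.

```python
import numpy as np

def gated_fusion(X, learner_outputs, alphas, centers, gamma=1.0, V=None):
    """Gate-controlled fusion of boosted learners (illustrative sketch).

    X               : (n_samples, d) inputs;
    learner_outputs : (n_samples, T) matrix with h_t(x_i);
    alphas          : (T,) per-learner weights from boosting;
    centers         : (m, d) fixed kernel centers defining gate features;
    V               : (m, T) gate parameters (random here for brevity).
    """
    # RBF kernel activations between inputs and fixed centers: (n, m)
    K = np.exp(-gamma * ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1))
    if V is None:
        V = np.random.default_rng(0).normal(
            size=(centers.shape[0], learner_outputs.shape[1]))
    G = K @ V                                   # gate logits, (n, T)
    G = np.exp(G - G.max(axis=1, keepdims=True))
    G /= G.sum(axis=1, keepdims=True)           # softmax gates g_t(x)
    # standard RAB would return (alphas * learner_outputs).sum(1);
    # here each alpha_t is modulated by its data-dependent gate
    return (G * alphas * learner_outputs).sum(axis=1)
```

Because the gates depend on x, different learners can dominate in different regions of the input space, which is the extra flexibility the gating scheme buys over a single global linear combination.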