Vanessa Gómez-Verdejo

Among all adaptive filtering algorithms, Widrow and Hoff’s Least Mean Square (LMS) has probably become the most popular because of its robustness, good tracking properties and simplicity. A drawback of LMS is that its step size imposes a compromise between convergence speed and final misadjustment. Combining LMS filters with different step sizes serves to…
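The LMS recursion referred to above can be sketched for plant identification; the step size, tap count, and signal setup below are illustrative choices, not values from the paper:

```python
import numpy as np

def lms_identify(x, d, n_taps=4, mu=0.01):
    """Identify an FIR plant with the LMS rule w <- w + mu * e * u.
    x: input signal, d: desired (plant output) signal, both 1-D arrays."""
    w = np.zeros(n_taps)
    errors = []
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # regressor, most recent sample first
        e = d[n] - w @ u                    # a priori error
        w = w + mu * e * u                  # stochastic-gradient step
        errors.append(e)
    return w, errors
```

A larger `mu` makes the early errors shrink faster but raises the steady-state misadjustment, which is exactly the compromise the abstract describes.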
FMRI data are acquired as complex-valued spatiotemporal images. Although several studies have identified the presence of novel information in the phase images, they are usually discarded due to their noisy nature. Several approaches have been devised to incorporate magnitude and phase data, but none of them has performed between-group inference…
The classification of functional or high-dimensional data requires selecting a reduced subset of features from the initial set, both to fight the curse of dimensionality and to help interpret the problem and the model. The mutual information criterion may be used in that context, but it suffers from the difficulty of its estimation through a…
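The estimation difficulty alluded to above can be seen even in the simplest plug-in estimator of mutual information; the histogram approach below is a generic illustration (bin count chosen arbitrarily), not the estimator studied in the paper:

```python
import numpy as np

def mutual_info(x, y, bins=8):
    """Plug-in (histogram) estimate of I(X; Y) in nats.
    Simple, but biased upward for finite samples and quickly unreliable
    as dimensionality grows -- the core difficulty for MI-based feature
    selection on high-dimensional data."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                               # joint distribution
    px = pxy.sum(axis=1, keepdims=True)            # marginal of x
    py = pxy.sum(axis=0, keepdims=True)            # marginal of y
    nz = pxy > 0                                   # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

Even for independent variables this estimate is slightly positive (the plug-in bias), and the bias grows with the number of bins relative to the sample size.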
Progressively emphasizing samples that are difficult to classify correctly is the basis of the recognized high performance of Real Adaboost (RA) ensembles. The corresponding emphasis function can be written as the product of a factor that measures the quadratic error and a factor related to the proximity to the classification border; this fact opens the door…
The Least Mean Square (LMS) algorithm has become a very popular algorithm for adaptive filtering due to its robustness and simplicity. An adaptive convex combination of one fast and one slow LMS filter has been previously proposed for plant identification, as a way to break the speed vs. precision compromise inherent to LMS filters. In this paper, an improved…
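The convex-combination scheme mentioned above can be sketched as follows: two LMS filters adapt independently on their own errors, and a sigmoid-parametrized mixing weight is adapted to minimize the combined output error. The step sizes and the clipping range for the mixing parameter are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def combined_lms(x, d, n_taps=4, mu_fast=0.1, mu_slow=0.005, mu_a=100.0):
    """Adaptive convex combination of a fast and a slow LMS filter
    for plant identification (sketch; parameter values are assumptions)."""
    w1 = np.zeros(n_taps)                 # fast filter
    w2 = np.zeros(n_taps)                 # slow filter
    a = 0.0                               # mixing parameter (pre-sigmoid)
    out = []
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]
        y1, y2 = w1 @ u, w2 @ u
        lam = 1.0 / (1.0 + np.exp(-a))    # lambda in (0, 1)
        y = lam * y1 + (1 - lam) * y2     # convex combination
        e, e1, e2 = d[n] - y, d[n] - y1, d[n] - y2
        w1 = w1 + mu_fast * e1 * u        # each filter adapts on its own error
        w2 = w2 + mu_slow * e2 * u
        a += mu_a * e * (y1 - y2) * lam * (1 - lam)   # combination layer
        a = float(np.clip(a, -4.0, 4.0))  # keep lambda away from saturation
        out.append(y)
    return w1, w2, out
```

Early in adaptation the fast filter dominates the mixture; near convergence the slow, low-misadjustment filter takes over, which is how the scheme breaks the speed/precision trade-off.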
Real Adaboost ensembles with weighted emphasis (RA-we) on erroneous and critical (near the classification boundary) samples have recently been proposed, leading to improved performance when an adequate combination of these terms is selected. However, finding the optimal emphasis adjustment is not an easy task. In this paper, we propose a fusion of…
Real Adaboost is a well-known, high-performing boosting method used to build machine ensembles for classification. Since its emphasis function can be decomposed into two factors that pay separate attention to sample errors and to their proximity to the classification border, a generalized emphasis function that combines both components by…
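The decomposition mentioned above follows from the identity -y f = (f - y)^2/2 - (f^2 + 1)/2 for labels y in {-1, +1}: the first term is the quadratic error and the second rewards proximity to the border (small |f|). A generalized emphasis that mixes the two with a parameter lam can be sketched as below; the specific parametrization is an illustration, and lam = 0.5 recovers the standard RA weighting exp(-y f) up to normalization:

```python
import math

def ra_emphasis(f, y, lam=0.5):
    """Generalized Real Adaboost emphasis mixing a quadratic-error factor
    and a boundary-proximity factor. f: real-valued ensemble outputs,
    y: labels in {-1, +1}. lam = 0.5 reproduces exp(-y*f) weighting
    after normalization."""
    w = [math.exp(lam * (fi - yi) ** 2 - (1 - lam) * fi ** 2)
         for fi, yi in zip(f, y)]
    s = sum(w)
    return [wi / s for wi in w]    # normalize to a distribution
```

Raising lam above 0.5 concentrates the emphasis on large-error samples; lowering it shifts attention toward samples near the classification border.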
In this correspondence, we derive an online adaptive one-class support vector machine. The machine structure is updated via growing and pruning mechanisms, and the weights are updated using the structural risk minimization principles underlying support vector machines. Our approach leads to very compact machines compared to other online kernel methods, whose…
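The growing-and-pruning idea can be illustrated with a NORMA-style online one-class update: a margin violation adds a new kernel center, weight decay shrinks old coefficients, and coefficients below a tolerance are pruned to keep the machine compact. This is a generic sketch under those assumptions, not the authors' algorithm:

```python
import math

def rbf(a, b, gamma=1.0):
    """Gaussian kernel between two points given as sequences."""
    return math.exp(-gamma * sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

class OnlineOneClass:
    """Sketch of an online one-class kernel machine with growing/pruning
    (NORMA-style stochastic updates; all parameter values are illustrative)."""
    def __init__(self, nu=0.2, eta=0.1, prune_tol=1e-3, gamma=1.0):
        self.centers, self.alphas = [], []
        self.rho, self.nu, self.eta = 0.0, nu, eta
        self.prune_tol, self.gamma = prune_tol, gamma

    def score(self, x):
        return sum(a * rbf(c, x, self.gamma)
                   for c, a in zip(self.centers, self.alphas))

    def partial_fit(self, x):
        self.alphas = [a * (1.0 - self.eta) for a in self.alphas]  # decay
        if self.score(x) < self.rho:          # margin violation: grow
            self.centers.append(list(x))
            self.alphas.append(self.eta)
            self.rho += self.eta * (self.nu - 1.0)
        else:
            self.rho += self.eta * self.nu
        keep = [(c, a) for c, a in zip(self.centers, self.alphas)
                if abs(a) > self.prune_tol]   # prune: keep machine compact
        self.centers = [c for c, _ in keep]
        self.alphas = [a for _, a in keep]
```

Because the decay geometrically shrinks every coefficient, only recently added centers survive pruning, so the machine size stays bounded regardless of the stream length.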
This paper shows that new and flexible criteria for resampling populations in boosting algorithms can lead to performance improvements. The Real Adaboost emphasis function can be divided into two terms: the first pays attention only to the quadratic error of each pattern, and the second takes into account only the “proximity” of each pattern to the…
In the present study, we applied a multivariate feature selection method based on analyzing the sign consistency of voxel weights across bagged linear Support Vector Machines (SVMs), with the aim of detecting brain regions relevant for discriminating subjects with obsessive-compulsive disorder (OCD, n=86) from healthy controls (n=86). Each…
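The sign-consistency idea can be sketched as follows: fit a linear model on many bootstrap resamples, record the sign of each feature's weight, and keep the features whose sign agrees across almost all bags. For self-containment, a ridge-regularized least-squares classifier stands in for the linear SVMs of the study, and the bag count and consistency threshold are illustrative assumptions:

```python
import numpy as np

def sign_consistent_features(X, y, n_bags=50, thresh=0.95, seed=0):
    """Select features whose linear-model weight keeps the same sign
    across bootstrap resamples (sketch of sign-consistency bagging).
    X: (n_samples, n_features), y: labels in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    signs = np.zeros((n_bags, d))
    for b in range(n_bags):
        idx = rng.integers(0, n, size=n)               # bootstrap resample
        Xb, yb = X[idx], y[idx]
        # ridge-regularized least squares as a stand-in linear classifier
        w = np.linalg.solve(Xb.T @ Xb + 1e-3 * np.eye(d), Xb.T @ yb)
        signs[b] = np.sign(w)
    pos = (signs > 0).mean(axis=0)                     # fraction of + signs
    return np.where((pos >= thresh) | (pos <= 1 - thresh))[0]
```

Features carrying genuine discriminative signal keep a stable weight sign under resampling, while purely noisy voxels tend to flip sign from bag to bag and are filtered out.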