Feature selection for classification in high-dimensional spaces can improve generalization, reduce classifier complexity, and identify important, discriminating feature "markers." For support vector machine (SVM) classification, a widely used technique is recursive feature elimination (RFE). We demonstrate that RFE is not consistent with margin …
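The standard SVM-RFE procedure referenced above can be sketched briefly. This is an illustrative reconstruction, not code from the paper: it repeatedly trains a linear SVM and eliminates the feature with the smallest squared weight, which is the usual RFE ranking criterion; the dataset and target feature count are arbitrary choices for the example.

```python
# Hedged sketch of SVM recursive feature elimination (RFE):
# at each step, train a linear SVM and drop the feature whose
# squared weight w_j^2 is smallest (the standard RFE criterion).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Synthetic data standing in for a real high-dimensional problem.
X, y = make_classification(n_samples=100, n_features=20,
                           n_informative=5, random_state=0)

remaining = list(range(X.shape[1]))
while len(remaining) > 5:                         # keep 5 features
    svm = LinearSVC(C=1.0, dual=False).fit(X[:, remaining], y)
    w2 = svm.coef_.ravel() ** 2                   # ranking scores
    remaining.pop(int(np.argmin(w2)))             # eliminate weakest

print(sorted(remaining))                          # surviving feature indices
```

Eliminating one feature per iteration is the original formulation; in practice a fixed fraction is often removed per step to reduce the number of retrainings.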
Alzheimer's disease (AD) and mild cognitive impairment (MCI) are of great current research interest. While there is no consensus on whether MCIs actually "convert" to AD, this concept is widely applied. Thus, the more important question is not whether MCIs convert, but what is the best such definition. We focus on automatic prognostication, nominally using …
Feature selection for classification working in high-dimensional feature spaces can improve generalization accuracy, reduce classifier complexity, and is also useful for identifying the important feature "markers", e.g., biomarkers in a bioinformatics or biomedical context. For support vector machine (SVM) classification, a widely used feature …
This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing …
We address feature selection for support vector machines for the scenario in which the feature space is huge, i.e., 10^5 - 10^6 or more features, as may occur e.g. in a biomedical context working with 3-D (or 4-D) brain images. Feature selection in this case may be needed to improve the classifier's generalization performance (given …
In this paper we investigate application of the recently developed margin-based feature elimination (MFE) method for feature selection in support vector machines to high-dimensional, small sample size data from the DNA microarray domain. We compared the performance of MFE to the well-known recursive feature elimination (RFE) method. Our results show that …
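The RFE baseline used in this comparison is available off the shelf. The sketch below is only the baseline side of the experiment (MFE itself is not implemented in scikit-learn), run on synthetic "microarray-like" data with many features and few samples; the sample sizes, feature counts, and `step` fraction are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: the RFE baseline on microarray-like data
# (far more features than samples), via scikit-learn's RFE wrapper.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import LinearSVC

# 40 samples x 500 features mimics the small-sample, high-dimensional regime.
X, y = make_classification(n_samples=40, n_features=500,
                           n_informative=10, random_state=0)

# step=0.1 removes 10% of remaining features per iteration.
selector = RFE(LinearSVC(dual=False), n_features_to_select=10, step=0.1)
selector.fit(X, y)
print(int(selector.support_.sum()))   # number of features retained
```

In this regime the number of eliminated features per step matters: aggressive elimination is faster but can discard features whose weights are unstable under the small sample size.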
Margin maximization in the hard-margin sense, proposed as a feature elimination criterion by the MFE-LO method, is combined here with use of the data radius, aiming to further lower generalization error: several published bounds and bound-related formulations for misclassification risk (or error) involve the radius, e.g., the product of squared …