Feature Selection for Automatic Classification of Non-Gaussian Data
@article{Foroutan1987FeatureSF,
  title={Feature Selection for Automatic Classification of Non-Gaussian Data},
  author={Iman Foroutan and Jack Sklansky},
  journal={IEEE Transactions on Systems, Man, and Cybernetics},
  year={1987},
  volume={17},
  pages={187--198}
}
A computer-based technique for automatic selection of features for the classification of non-Gaussian data is presented. The selection technique exploits interactive cluster finding and a modified branch and bound optimization of piecewise linear classifiers. The technique first finds an efficient set of pairs of oppositely classified clusters to represent the data. Then a zero-one implicit enumeration implements a branch and bound search for a good subset of features. A test of the feature…
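The abstract's zero-one implicit enumeration amounts to a branch and bound search over feature subsets that prunes branches using a monotone selection criterion. The following is an illustrative sketch only, not the authors' algorithm: it assumes a criterion that never increases when a feature is removed, and the function name `branch_and_bound_select` and its interface are hypothetical.

```python
def branch_and_bound_select(n_features, m, criterion):
    """Select the best m of n features under a monotone criterion.

    Assumes criterion(subset) never increases when a feature is removed,
    so any branch whose current (larger) subset already scores no better
    than the best complete subset found so far can be pruned.
    """
    best_score = float("-inf")
    best_subset = None

    def recurse(subset, next_removable):
        nonlocal best_score, best_subset
        score = criterion(subset)
        # Monotonicity bound: no sub-subset of `subset` can beat best_score.
        if score <= best_score:
            return
        if len(subset) == m:
            best_score, best_subset = score, subset
            return
        # Remove features in increasing position order so each subset
        # is generated exactly once (implicit enumeration).
        for i in range(next_removable, len(subset)):
            recurse(subset[:i] + subset[i + 1:], i)

    recurse(tuple(range(n_features)), 0)
    return best_subset, best_score
```

For example, with an additive criterion over per-feature scores `[3, 1, 4, 1, 5]` and `m = 2`, the search returns the two highest-scoring features while pruning most of the subset tree.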
96 Citations
Computational discovery for nonlinear classifiers
- Computer Science, [Proceedings] 1992 IEEE International Conference on Systems, Man, and Cybernetics, 1992
The author describes an approach to computer-aided discovery of nonlinear classifiers and cluster finders for complex distributions of multidimensional data by combining algebraic transformations, hill climbing, and genetic search.
Comparison of algorithms that select features for pattern classifiers
- Computer Science, Pattern Recognit., 2000
Feature Selection Algorithm for Multiple Classifier Systems: A Hybrid Approach
- Computer Science, Fundam. Informaticae, 2008
The main aim of this paper is to present ways to improve the RBFS feature selection algorithm and to propose a new algorithm, ARS, which decreases the number of decision-relative reducts for a decision table.
Optimum feature selection for decision functions
- Computer Science, 1990
An optimum feature selection method which is applicable to arbitrary (nonlinear) decision functions is presented and numerical examples of feature selection for a linear and a quadratic decision function are presented.
Designing Relevant Features for Continuous Data Sets Using ICA
- Computer Science, Int. J. Comput. Intell. Appl., 2008
A novel approach to reducing dimensionality of the feature space by employing independent component analysis (ICA) is introduced, which efficiently builds a reduced set of features without loss in accuracy and also has a fast incremental version.
Pattern classification in dynamic environments: tagged feature-class representation and the classifiers
- Computer Science, IEEE Trans. Syst. Man Cybern., 1989
The author discusses: a tagged feature and class representation of the pattern recognition problem in a dynamic environment; univariate cooperative classifiers that are based on statistical feature…
A Framework for Categorize Feature Selection Algorithms for Classification and Clustering
- Computer Science, Bulletin de la Société Royale des Sciences de Liège, 2016
A categorizing framework is developed that organizes feature selection algorithms by their subset-finding procedures (search-based and non-search-based), evaluation criteria, and data mining tasks.
Genetic Algorithms for Classification and Feature Extraction
- Computer Science, 2016
Two GA-based approaches are developed, both utilizing a feedback linkage between feature evaluation and classification; they combine a GA with two different approaches: the K-Nearest-Neighbor decision rule and a production decision rule.
FEATURE SELECTION FOR CLASSIFICATION BY USING A GA-BASED NEURAL NETWORK APPROACH
- Computer Science, 2006
The genetic algorithm optimizes a feature vector by removing both irrelevant and redundant features and finding optimal ones. The results suggest that GA-based neural classifiers are robust and effective at finding optimal subsets of features from large data sets.
Selection of the optimal prototype subset for 1-NN classification
- Computer Science, Pattern Recognit. Lett., 1998
References
Showing 1-10 of 38 references
A Nonparametric Partitioning Procedure for Pattern Classification
- Computer Science, IEEE Transactions on Computers, 1969
The algorithm gives an indication as to the effectiveness of various transgeneration units and hence can also be used in an interactive manner if so desired for the actual design of a classification structure.
Locally Trained Piecewise Linear Classifiers
- Computer Science, IEEE Transactions on Pattern Analysis and Machine Intelligence, 1980
A versatile technique for designing computer algorithms for separating multiple-dimensional data (feature vectors) into two classes, referred to as classifiers, that achieve nearly Bayes-minimum error rates while requiring relatively small amounts of memory.
Training a One-Dimensional Classifier to Minimize the Probability of Error
- Mathematics, Computer Science, IEEE Trans. Syst. Man Cybern., 1972
This work derives a modification of the Robbins-Monro method of stochastic approximation, and shows how this modification leads to training procedures that minimize the probability of error of a one-dimensional two-category pattern classifier.
The Detection and Segmentation of Blobs in Infrared Images
- Physics, IEEE Transactions on Systems, Man, and Cybernetics, 1981
A computer procedure for detecting and finding the boundaries of blobs in noisy infrared images is described, which resulted in only two false negatives and no false detections on a data base of 81 targets.
A Classifier Design Technique for Discrete Variable Pattern Recognition Problems
- Computer Science, IEEE Transactions on Computers, 1974
A new computerized technique to aid the designers of pattern classifiers when the measurement variables are discrete and the values form a simple nominal scale (no inherent metric).
A Branch and Bound Algorithm for Feature Subset Selection
- Computer Science, IEEE Transactions on Computers, 1977
A feature subset selection algorithm based on branch and bound techniques is developed to select the best subset of m features from an n-feature set with the computational effort of evaluating only 6000 subsets.
Considerations of sample and feature size
- Computer Science, IEEE Trans. Inf. Theory, 1972
The design-set error rate for a two-class problem with multivariate normal distributions is derived as a function of the sample size per class (N) and dimensionality (L), and is demonstrated to be an extremely biased estimate of either the Bayes or test-set error rate.
A Recursive Partitioning Decision Rule for Nonparametric Classification
- Computer Science, IEEE Transactions on Computers, 1977
A new criterion for deriving a recursive partitioning decision rule for nonparametric classification is presented and the resulting decision rule is asymptotically Bayes' risk efficient.
Pattern classification and scene analysis
- Computer Science, A Wiley-Interscience publication, 1973
The topics treated include Bayesian decision theory, supervised and unsupervised learning, nonparametric techniques, discriminant analysis, clustering, preprocessing of pictorial data, spatial filtering, shape description techniques, perspective transformations, projective invariants, linguistic procedures, and artificial intelligence techniques for scene analysis.
On dimensionality and sample size in statistical pattern classification
- Computer Science, Pattern Recognit., 1971