Feature Selection for Automatic Classification of Non-Gaussian Data

@article{Foroutan1987FeatureSF,
  title={Feature Selection for Automatic Classification of Non-Gaussian Data},
  author={Iman Foroutan and Jack Sklansky},
  journal={IEEE Transactions on Systems, Man, and Cybernetics},
  year={1987},
  volume={17},
  pages={187--198}
}
A computer-based technique for automatic selection of features for the classification of non-Gaussian data is presented. The selection technique exploits interactive cluster finding and a modified branch and bound optimization of piecewise linear classifiers. The technique first finds an efficient set of pairs of oppositely classified clusters to represent the data. Then a zero-one implicit enumeration implements a branch and bound search for a good subset of features. A test of the feature… 
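The zero-one implicit enumeration mentioned in the abstract can be illustrated with a small sketch. This is not the paper's algorithm (which evaluates piecewise linear classifiers over pairs of oppositely classified clusters); it is a minimal branch-and-bound over feature-inclusion decisions, assuming a hypothetical additive per-feature score so the example stays self-contained.

```python
def branch_and_bound_select(scores, k):
    """Pick the k features maximizing total score, pruning any subtree whose
    optimistic bound cannot beat the best complete subset found so far.
    `scores` is a hypothetical per-feature criterion; the paper's criterion
    (classifier performance on cluster pairs) is not additive like this."""
    n = len(scores)
    best = {"value": float("-inf"), "subset": None}

    def bound(chosen_value, next_idx, remaining):
        # Optimistic bound: take the best `remaining` scores still available.
        return chosen_value + sum(sorted(scores[next_idx:], reverse=True)[:remaining])

    def search(idx, chosen, value):
        if len(chosen) == k:
            if value > best["value"]:
                best["value"], best["subset"] = value, tuple(chosen)
            return
        if idx == n or bound(value, idx, k - len(chosen)) <= best["value"]:
            return  # prune: this branch cannot improve on the incumbent
        # Branch 1: include feature idx
        chosen.append(idx)
        search(idx + 1, chosen, value + scores[idx])
        chosen.pop()
        # Branch 2: exclude feature idx (only if enough features remain)
        if n - idx - 1 >= k - len(chosen):
            search(idx + 1, chosen, value)

    search(0, [], 0.0)
    return best["subset"], best["value"]
```

For example, `branch_and_bound_select([0.9, 0.1, 0.5, 0.7], 2)` explores the inclusion/exclusion tree and prunes the branch that skips feature 0, since its bound cannot exceed the incumbent.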
Computational discovery for nonlinear classifiers
  • J. Sklansky
  • Computer Science
    [Proceedings] 1992 IEEE International Conference on Systems, Man, and Cybernetics
  • 1992
TLDR
The author describes an approach to computer-aided discovery of nonlinear classifiers and cluster finders for complex distributions of multidimensional data by combining algebraic transformations, hill climbing, and genetic search.
Feature Selection Algorithm for Multiple Classifier Systems: A Hybrid Approach
TLDR
The main aim of this paper is to present ways to improve the RBFS feature selection algorithm and to propose a new algorithm, ARS, which decreases the number of decision-relative reducts for a decision table.
Optimum feature selection for decision functions
TLDR
An optimum feature selection method applicable to arbitrary (nonlinear) decision functions is presented, with numerical examples of feature selection for a linear and a quadratic decision function.
Designing Relevant Features for Continuous Data Sets Using ICA
TLDR
A novel approach to reducing dimensionality of the feature space by employing independent component analysis (ICA) is introduced, which efficiently builds a reduced set of features without loss in accuracy and also has a fast incremental version.
Pattern classification in dynamic environments: tagged feature-class representation and the classifiers
  • Qiuming Zhu
  • Computer Science
    IEEE Trans. Syst. Man Cybern.
  • 1989
The author discusses: a tagged feature and class representation of the pattern recognition problem in a dynamic environment; univariate cooperative classifiers that are based on statistical feature…
A Framework for Categorize Feature Selection Algorithms for Classification and Clustering
TLDR
A categorizing framework is developed, consisting of procedures for finding feature subsets (search-based and non-search-based), evaluation criteria, and data mining tasks.
Genetic Algorithms for Classification and Feature Extraction
TLDR
Two GA-based approaches are developed, both utilizing a feedback linkage between feature evaluation and classification, combining a GA with either the K-Nearest-Neighbor decision rule or a production decision rule.
FEATURE SELECTION FOR CLASSIFICATION BY USING A GA-BASED NEURAL NETWORK APPROACH
TLDR
The genetic algorithm optimizes a feature vector by removing both irrelevant and redundant features to find an optimal subset. The results suggest that GA-based neural classifiers are robust and effective in finding optimal subsets of features from large data sets.
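The GA-based selection described in the two entries above can be sketched as a bit-string search. This is an illustrative toy, not any of the cited methods: the fitness function below uses made-up per-feature relevance scores with a size penalty, where a real wrapper would instead score each subset by training a classifier (a neural network or k-NN) on it.

```python
import random

random.seed(0)

# Hypothetical per-feature relevance scores (an assumption for this sketch);
# a real GA wrapper would evaluate each subset with a trained classifier.
RELEVANCE = [0.8, 0.05, 0.6, 0.02, 0.9, 0.1]

def fitness(mask):
    # Reward included relevant features, penalize subset size
    # (a crude stand-in for penalizing redundant features).
    return sum(r for r, m in zip(RELEVANCE, mask) if m) - 0.15 * sum(mask)

def evolve(n_features=6, pop_size=20, generations=40, p_mut=0.1):
    pop = [[random.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # binary tournament selection
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = random.randrange(1, n_features)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - b if random.random() < p_mut else b
                     for b in child]                   # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

Under these scores, the search favors a mask selecting the three features whose relevance outweighs the size penalty, dropping the weak ones.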
Selection of the optimal prototype subset for 1-NN classification

References

Showing 1–10 of 38 references
A Nonparametric Partitioning Procedure for Pattern Classification
TLDR
The algorithm gives an indication as to the effectiveness of various transgeneration units and hence can also be used in an interactive manner if so desired for the actual design of a classification structure.
Locally Trained Piecewise Linear Classifiers
TLDR
A versatile technique is presented for designing computer algorithms, referred to as classifiers, that separate multidimensional data (feature vectors) into two classes, achieving nearly Bayes-minimum error rates while requiring relatively small amounts of memory.
Training a One-Dimensional Classifier to Minimize the Probability of Error
TLDR
This work derives a modification of the Robbins-Monro method of stochastic approximation, and shows how this modification leads to training procedures that minimize the probability of error of a one-dimensional two-category pattern classifier.
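The idea behind the reference above can be sketched as a stochastic-approximation update of a single decision threshold. This is a simplified Robbins-Monro-style error-correction rule under assumed Gaussian class distributions, not the paper's particular modification: each misclassified sample nudges the threshold with a decreasing step size.

```python
import random

random.seed(1)

def train_threshold(n_steps=5000, t0=0.0):
    """One-dimensional two-class threshold trained by a Robbins-Monro-style
    error-correction rule (an illustrative sketch, not the paper's scheme).
    Assumed data model: class 0 ~ N(0, 1), class 1 ~ N(2, 1), equal priors;
    predict class 1 when x > t. The error-minimizing threshold is t = 1."""
    t = t0
    for n in range(1, n_steps + 1):
        # Step sizes satisfy the Robbins-Monro conditions:
        # sum(a_n) diverges, sum(a_n**2) converges.
        a_n = 1.0 / n ** 0.6
        y = random.randint(0, 1)           # draw a labeled sample
        x = random.gauss(2.0 * y, 1.0)
        if y == 1 and x <= t:              # class-1 sample misclassified: lower t
            t -= a_n
        elif y == 0 and x > t:             # class-0 sample misclassified: raise t
            t += a_n
    return t
```

At equilibrium the two correction rates balance, which for these symmetric Gaussians happens near the Bayes threshold t = 1.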
The Detection and Segmentation of Blobs in Infrared Images
TLDR
A computer procedure for detecting and finding the boundaries of blobs in noisy infrared images is described, which resulted in only two false negatives and no false detections on a data base of 81 targets.
A Classifier Design Technique for Discrete Variable Pattern Recognition Problems
  • J. Stoffel
  • Computer Science
    IEEE Transactions on Computers
  • 1974
TLDR
A new computerized technique is presented to aid designers of pattern classifiers when the measurement variables are discrete and the values form a simple nominal scale (no inherent metric).
A Branch and Bound Algorithm for Feature Subset Selection
TLDR
A feature subset selection algorithm based on branch and bound techniques is developed to select the best subset of m features from an n-feature set with the computational effort of evaluating only 6000 subsets.
Considerations of sample and feature size
TLDR
The design-set error rate for a two-class problem with multivariate normal distributions is derived as a function of the sample size per class (N) and dimensionality (L) and is demonstrated to be an extremely biased estimate of either the Bayes or test-set error rate.
A Recursive Partitioning Decision Rule for Nonparametric Classification
  • J. Friedman
  • Computer Science
    IEEE Transactions on Computers
  • 1977
TLDR
A new criterion for deriving a recursive partitioning decision rule for nonparametric classification is presented and the resulting decision rule is asymptotically Bayes' risk efficient.
Pattern classification and scene analysis
TLDR
The topics treated include Bayesian decision theory, supervised and unsupervised learning, nonparametric techniques, discriminant analysis, clustering, preprocessing of pictorial data, spatial filtering, shape description techniques, perspective transformations, projective invariants, linguistic procedures, and artificial intelligence techniques for scene analysis.