Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria

@article{Loog2001MulticlassLD,
  title={Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria},
  author={M. Loog and Robert P. W. Duin and Reinhold H{\"a}b-Umbach},
  journal={IEEE Trans. Pattern Anal. Mach. Intell.},
  year={2001},
  volume={23},
  pages={762--766}
}
We derive a class of computationally inexpensive linear dimension reduction criteria by introducing a weighted variant of the well-known K-class Fisher criterion associated with linear discriminant analysis (LDA). It can be seen that LDA weights contributions of individual class pairs according to the Euclidean distance of the respective class means. We generalize upon LDA by introducing a different weighting function. 
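
The abstract's recipe can be made concrete: decompose the between-class scatter into a sum over class pairs, scale each pair's contribution by a weighting function of the distance between the two class means, and solve the usual generalized eigenproblem against the pooled within-class scatter. Below is a minimal NumPy/SciPy sketch of that construction, not the authors' implementation; the function name, the ridge term, and the idea of passing weight_fn as a parameter are illustrative choices. Passing weight_fn=None keeps every pair weight at 1, which recovers classical LDA.

import numpy as np
from scipy.linalg import eigh

def weighted_pairwise_fisher(X, y, d, weight_fn=None):
    """Find d directions maximizing a weighted pairwise Fisher criterion."""
    classes = np.unique(y)
    priors = np.array([np.mean(y == c) for c in classes])
    means = np.array([X[y == c].mean(axis=0) for c in classes])

    # Pooled within-class scatter S_W (prior-weighted average of class
    # covariances), plus a small ridge for numerical stability (an
    # implementation choice, not part of the criterion).
    Sw = sum(p * np.cov(X[y == c], rowvar=False, bias=True)
             for p, c in zip(priors, classes))
    Sw += 1e-8 * np.trace(Sw) / Sw.shape[0] * np.eye(Sw.shape[0])

    # Between-class scatter as a weighted sum over class pairs.
    Sw_inv = np.linalg.inv(Sw)
    Sb = np.zeros_like(Sw)
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            diff = means[i] - means[j]
            # Within-class-whitened (Mahalanobis-like) distance between the
            # pair's means; plain Euclidean distance would be another choice.
            delta = np.sqrt(diff @ Sw_inv @ diff)
            w = 1.0 if weight_fn is None else weight_fn(delta)
            Sb += priors[i] * priors[j] * w * np.outer(diff, diff)

    # Generalized eigenproblem S_B v = lambda S_W v; keep the top-d vectors.
    evals, evecs = eigh(Sb, Sw)
    return evecs[:, np.argsort(evals)[::-1][:d]]

For instance, a weighting function that downweights already well-separated pairs, in the spirit of the approximate pairwise accuracy criterion discussed in the paper, could be passed as weight_fn=lambda t: erf(t / (2 * np.sqrt(2))) / (2 * t ** 2), with erf imported from scipy.special; the returned matrix W then defines the linear map X @ W.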


Multi-Class Classification Based on Fisher Criteria with Weighted Distance

  • Meng Ao, S. Z. Li
  • Computer Science
    2008 Chinese Conference on Pattern Recognition
  • 2008
This paper proposes a new Fisher criterion with weighted distance (FCWWD) to find an optimal projection for multi-class classification tasks, replacing the classical linear function with a nonlinear weight function to describe the distances between samples in the Fisher criterion.

Linear dimensionality reduction using relevance weighted LDA

Weighted Additive Criterion for Linear Dimension Reduction

  • Jing Peng, S. Robila
  • Computer Science
    Seventh IEEE International Conference on Data Mining (ICDM 2007)
  • 2007
This paper proposes a simple weighted criterion for linear dimension reduction that addresses two known limitations of LDA, demonstrates the efficacy of the proposal, and compares it against other competing techniques.

Generalized null space uncorrelated Fisher discriminant analysis for linear dimensionality reduction

Enhanced Direct Linear Discriminant Analysis for Feature Extraction on High Dimensional Data

The EDLDA integrates two types of class-wise weighting terms in estimating the average within-class and between-class scatter matrices in order to relate the resulting Fisher criterion more closely to the minimization of classification error.
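
As a rough sketch of what class-wise weighting of the average scatter matrices looks like (the weights below are placeholders; EDLDA derives its own weighting terms from the data, which are not reproduced here), one can form:

import numpy as np

def classwise_weighted_scatters(X, y, w_within, w_between):
    """Prior-weighted scatter matrices with extra per-class weights (illustrative)."""
    classes = np.unique(y)
    p = X.shape[1]
    m = X.mean(axis=0)                      # global mean
    Sw = np.zeros((p, p))
    Sb = np.zeros((p, p))
    for c in classes:
        Xc = X[y == c]
        prior = len(Xc) / len(X)
        # Per-class weights can let hard-to-separate classes dominate the estimates.
        Sw += w_within[c] * prior * np.cov(Xc, rowvar=False, bias=True)
        diff = Xc.mean(axis=0) - m
        Sb += w_between[c] * prior * np.outer(diff, diff)
    return Sw, Sb

The two resulting matrices would then enter a Fisher-style trace or eigenvalue criterion exactly as the unweighted ones do.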

A linear discriminant analysis using weighted local structure information

A new weighted LDA is proposed to improve the performance of discriminant analysis; it not only improves the minimization of the within-class scatter but also the maximization of the between-class scatter, extracting a better discriminant feature subset.

Nonparametric Dimension Reduction via Maximizing Pairwise Separation Probability

A novel nonparametric supervised linear dimension reduction (SLDR) algorithm that extracts the features by maximizing the pairwise separation probability, which describes the generalization accuracy when the obtained features are used to train a linear classifier.

Dimension Reduction by an Orthogonal Series Estimate of the Probabilistic Dependence Measure

A new estimate of the L probabilistic dependence measure by Fourier series for two-dimensional reduction is introduced, and its performance is compared to Fisher Linear Discriminant Analysis and the Approximate Chernoff Criterion in terms of classification probability error.
...

References

Showing 1-10 of 29 references

Discriminant Analysis by Gaussian Mixtures

This paper fits Gaussian mixtures to each class to facilitate effective classification in non-normal settings, especially when the classes are clustered.

Multi-class linear feature extraction by nonlinear PCA

A novel, equally fast method, based on nonlinear principal component analysis (PCA), which may avoid the class conjunction and is experimentally compared with Fisher's mapping and with a neural network based approach to nonlinear PCA.

Approximate Pairwise Accuracy Criteria for Multiclass Linear Dimension Reduction: Generalisations of the Fisher Criterion

Technical report (WBBM Report Series, no. 44) introducing approximate pairwise accuracy criteria for multiclass linear dimension reduction as generalisations of the Fisher criterion.

Toward Bayes-Optimal Linear Dimension Reduction

  • L. Buturovic
  • Computer Science
    IEEE Trans. Pattern Anal. Mach. Intell.
  • 1994
This work proposes an alternative criterion, based on the estimate of the Bayes error, that is hopefully closer to the optimal criterion than the criteria currently in use.

Canonical Variate Analysis—A General Model Formulation

A general model, specifying the population means as a function of the population canonical vectors, provides a natural basis for considering many aspects of canonical variate analysis.

The nonlinear PCA learning rule in independent component analysis

  • E. Oja
  • Computer Science
    Neurocomputing
  • 1997

Machine Learning, Neural and Statistical Classification

A survey of previous comparisons and theoretical work, with descriptions of methods and datasets, criteria for comparison and methodology (including validation), and empirical results for machine learning.

Machine learning

Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.