# Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria

```bibtex
@article{Loog2001MulticlassLD,
  title   = {Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria},
  author  = {M. Loog and Robert P. W. Duin and Reinhold H{\"a}b-Umbach},
  journal = {IEEE Trans. Pattern Anal. Mach. Intell.},
  year    = {2001},
  volume  = {23},
  pages   = {762-766}
}
```

We derive a class of computationally inexpensive linear dimension reduction criteria by introducing a weighted variant of the well-known K-class Fisher criterion associated with linear discriminant analysis (LDA). It can be seen that LDA weights the contributions of individual class pairs according to the Euclidean distance between the respective class means. We generalize LDA by introducing a different weighting function.
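To make the construction concrete, the sketch below implements a weighted pairwise Fisher criterion in NumPy. This is a minimal illustration under our own naming and API choices (`weighted_pairwise_fisher`, `weight_fn`, and `apac` are not from the paper's text above): the between-class scatter is assembled pair by pair, with each class pair weighted by a function of the Mahalanobis distance between its means. With `weight_fn=None` the constant weight recovers ordinary multiclass LDA; the erf-based aPAC weighting at the end is the approximate-pairwise-accuracy choice commonly associated with this paper, reproduced here from memory rather than from the abstract.

```python
import numpy as np
from scipy.special import erf

def weighted_pairwise_fisher(X, y, d, weight_fn=None):
    """Linear dimension reduction by a weighted pairwise Fisher criterion.

    Sketch only: assumes a nonsingular pooled within-class scatter and
    distinct class means. weight_fn(delta) weights each class pair by the
    Mahalanobis distance delta between its means; None means a constant
    weight, which recovers ordinary multiclass LDA.
    """
    classes, counts = np.unique(y, return_counts=True)
    priors = counts / len(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])

    # Pooled within-class scatter S_W (prior-weighted class covariances).
    Sw = sum(p * np.cov(X[y == c], rowvar=False, bias=True)
             for c, p in zip(classes, priors))
    Sw_inv = np.linalg.inv(Sw)

    # Weighted pairwise between-class scatter:
    #   S_B = sum_{i<j} p_i p_j w(delta_ij) (m_i - m_j)(m_i - m_j)^T
    Sb = np.zeros_like(Sw)
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            diff = means[i] - means[j]
            delta = np.sqrt(diff @ Sw_inv @ diff)  # Mahalanobis distance
            w = 1.0 if weight_fn is None else weight_fn(delta)
            Sb += priors[i] * priors[j] * w * np.outer(diff, diff)

    # The projection is spanned by the top-d eigenvectors of S_W^{-1} S_B.
    eigvals, eigvecs = np.linalg.eig(Sw_inv @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:d]].real

# aPAC-style weighting (recalled, not quoted from the text above):
# down-weights class pairs that are already well separated.
def apac(delta):
    return erf(delta / (2.0 * np.sqrt(2.0))) / (2.0 * delta ** 2)
```

Usage would look like `W = weighted_pairwise_fisher(X, y, d=2, weight_fn=apac)` followed by `Z = X @ W`. The design point is that only the pair weighting changes between criteria, so an alternative weighting function can be dropped in without altering the eigenvalue machinery.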

## 479 Citations

### Multi-Class Classification Based on Fisher Criteria with Weighted Distance

- Computer Science2008 Chinese Conference on Pattern Recognition
- 2008

This paper proposes a new Fisher criterion with weighted distance (FCWWD) to find an optimal projection for multi-class classification tasks, replacing the classical linear function with a nonlinear weight function to describe the distances between samples in the Fisher criterion.

### Linear dimensionality reduction using relevance weighted LDA

- Computer SciencePattern Recognit.
- 2005

### Weighted Additive Criterion for Linear Dimension Reduction

- Computer ScienceSeventh IEEE International Conference on Data Mining (ICDM 2007)
- 2007

This paper proposes a simple weighted criterion for linear dimension reduction that addresses two known problems associated with LDA, demonstrates the efficacy of the proposal, and compares it against other competing techniques.

### Generalized null space uncorrelated Fisher discriminant analysis for linear dimensionality reduction

- MathematicsPattern Recognit.
- 2006

### Enhanced Direct Linear Discriminant Analysis for Feature Extraction on High Dimensional Data

- Computer ScienceAAAI
- 2005

The EDLDA integrates two types of class-wise weighting terms in estimating the average within-class and between-class scatter matrices in order to relate the resulting Fisher criterion more closely to the minimization of classification error.

### A linear discriminant analysis using weighted local structure information

- Computer Science2017 14th International Joint Conference on Computer Science and Software Engineering (JCSSE)
- 2017

A new weighted LDA is proposed to improve the performance of discriminant analysis; it not only improves the minimization of the within-class scatter but also the maximization of the between-class scatter, extracting a better discriminant feature subset.

### Nonparametric Dimension Reduction via Maximizing Pairwise Separation Probability

- Computer ScienceIEEE Transactions on Neural Networks and Learning Systems
- 2019

A novel nonparametric supervised linear dimension reduction (SLDR) algorithm that extracts the features by maximizing the pairwise separation probability, which describes the generalization accuracy when the obtained features are used to train a linear classifier.

### On the linear discriminant analysis for large number of classes

- Computer ScienceEng. Appl. Artif. Intell.
- 2015

### Dimension Reduction by an Orthogonal Series Estimate of the Probabilistic Dependence Measure

- MathematicsICPRAM
- 2012

A new estimate of the L probabilistic dependence measure by Fourier series for two-dimensional reduction is introduced, and its performance is compared to Fisher Linear Discriminant Analysis and the Approximate Chernoff Criterion in terms of mean classification error probability.

## References


### Discriminant Analysis by Gaussian Mixtures

- Computer Science
- 1996

This paper fits Gaussian mixtures to each class to facilitate effective classification in non-normal settings, especially when the classes are clustered.

### Multi-class linear feature extraction by nonlinear PCA

- Computer ScienceProceedings 15th International Conference on Pattern Recognition. ICPR-2000
- 2000

A novel, equally fast method based on nonlinear principal component analysis (PCA) that may avoid class conjunction; it is experimentally compared with Fisher's mapping and with a neural-network-based approach to nonlinear PCA.

### Approximate Pairwise Accuracy Criteria for Multiclass Linear Dimension Reduction: Generalisations of the Fisher Criterion

- Computer Science
- 1999


### Toward Bayes-Optimal Linear Dimension Reduction

- Computer ScienceIEEE Trans. Pattern Anal. Mach. Intell.
- 1994

This work proposes an alternative criterion, based on the estimate of the Bayes error, that is hopefully closer to the optimal criterion than the criteria currently in use.

### Heteroscedastic discriminant analysis and reduced rank HMMs for improved speech recognition

- Computer ScienceSpeech Commun.
- 1998

### CANONICAL VARIATE ANALYSIS—A GENERAL MODEL FORMULATION

- Mathematics
- 1984

A general model, specifying the population means as a function of the population canonical vectors, provides a natural basis for considering many aspects of canonical variate analysis…

### The nonlinear PCA learning rule in independent component analysis

- Computer ScienceNeurocomputing
- 1997

### Machine Learning, Neural and Statistical Classification

- Computer Science
- 1994

A survey of previous comparisons and theoretical work; descriptions of methods; dataset descriptions; criteria for comparison and methodology (including validation); empirical results; machine learning on…

### Machine learning

- Computer ScienceCSUR
- 1996

Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.