Quadratic Discriminant Analysis for High-Dimensional Data

@article{Wu2019QuadraticDA,
  title={Quadratic Discriminant Analysis for High-Dimensional Data},
  author={Yilei Wu and Yingli Qin and Mu Zhu},
  journal={Statistica Sinica},
  year={2019}
}
High-dimensional classification is an important and challenging statistical problem. We develop a set of quadratic discriminant rules by simplifying the structure of the covariance matrices, rather than imposing sparsity assumptions on the covariance matrices themselves (or their inverses) or on the standardized between-class distance. Under moderate conditions on the population covariance matrices, our specialized quadratic discriminant rules enjoy good asymptotic properties…
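
For context, the baseline QDA rule assigns x to the class k maximizing the Gaussian discriminant score log pi_k - (1/2) log|Sigma_k| - (1/2)(x - mu_k)' Sigma_k^{-1} (x - mu_k). The minimal sketch below is plain QDA via scikit-learn, not the authors' specialized structured rules; the synthetic data and the reg_param value are illustrative assumptions. It shows the baseline whose per-class sample covariance estimates become ill-conditioned as p grows relative to n.

  # Plain QDA baseline (not the structured rules of this paper): with p
  # close to n, the sample covariance matrices degrade and a small ridge
  # (reg_param) is needed just to keep the rule numerically stable.
  import numpy as np
  from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

  rng = np.random.default_rng(0)
  n, p = 100, 50                       # dimension moderate relative to sample size
  X0 = rng.normal(size=(n, p))         # class 0: identity covariance
  X1 = 1.5 * rng.normal(size=(n, p))   # class 1: inflated variances (synthetic)
  X, y = np.vstack([X0, X1]), np.repeat([0, 1], n)

  qda = QuadraticDiscriminantAnalysis(reg_param=0.1).fit(X, y)
  print("training accuracy:", qda.score(X, y))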

Citations

Sparse quadratic classification rules via linear dimension reduction

A review of quadratic discriminant analysis for high‐dimensional data

This review discusses the challenges, some existing work, and several possible future directions for high-dimensional QDA.

Quadratic Discriminant Analysis under Moderate Dimension

Quadratic discriminant analysis (QDA) is a simple method for classifying a subject into one of two populations, and was proven to perform as well as the Bayes rule when the data dimension p is fixed. …

Classification in High Dimension Using the Ledoit–Wolf Shrinkage Method

The Stein-type shrinkage estimation of Ledoit and Wolf is employed for high-dimensional data classification, and its efficiency is numerically compared to existing methods, including LDA, cross-validation, gLasso, and SVM.
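
As a rough illustration of the idea, a Ledoit–Wolf shrunken covariance can be plugged into a discriminant rule; scikit-learn's shrinkage="auto" uses the Ledoit–Wolf formula for the pooled covariance, though this is not necessarily the exact estimator variant of that paper, and the data below are synthetic assumptions.

  # Ledoit-Wolf shrinkage inside a discriminant rule: shrinkage="auto"
  # applies the Ledoit-Wolf intensity to the pooled covariance, keeping
  # the rule well-defined even when p > n and the sample covariance is singular.
  import numpy as np
  from sklearn.covariance import LedoitWolf
  from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

  rng = np.random.default_rng(1)
  n, p = 40, 100                      # p > n: sample covariance is singular
  X = rng.normal(size=(2 * n, p))
  X[n:] += 0.5                        # shift class-1 means (synthetic)
  y = np.repeat([0, 1], n)

  lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(X, y)
  print("Ledoit-Wolf shrinkage intensity:", LedoitWolf().fit(X).shrinkage_)
  print("training accuracy:", lda.score(X, y))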

Quadratic discriminant analysis by projection

High‐dimensional covariance matrix estimation using a low‐rank and diagonal decomposition

A block-wise coordinate descent algorithm that iteratively updates L and D is proposed to compute the estimator in practice, and the method is shown to yield enhanced solutions to the Markowitz portfolio selection problem.

An Auto-Contouring Method for Kidney based on SVM

Experiments showed that the automatic contouring method based on support vector machines achieves better classification performance than most classification algorithms.

Phase Transitions for High-Dimensional Quadratic Discriminant Analysis with Rare and Weak Signals

The results suggest that the quadratic term has a major influence over LDA on both the classification decision and classification accuracy, especially in the case where μk and Ωk are both known, and characterize when all classifiers fail.

References

High-Dimensional Quadratic Classifiers in Non-sparse Settings

  • M. Aoshima, K. Yata
  • Computer Science, Mathematics
    Methodology and Computing in Applied Probability
  • 2018
The quadratic classifiers proposed in this paper are shown to enjoy a consistency property in which misclassification rates tend to zero as the dimension goes to infinity, under non-sparse settings.

Sparse quadratic discriminant analysis for high dimensional data

Many contemporary studies involve the classification of a subject into two classes based on n observations of the p variables associated with the subject. Under the assumption that the variables are …

Regularized Discriminant Analysis

Alternatives to the usual maximum likelihood estimates for the covariance matrices are proposed, characterized by two parameters, the values of which are customized to individual situations by jointly minimizing a sample-based estimate of future misclassification risk.
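
Concretely, Friedman's RDA pulls each class covariance toward the pooled covariance (parameter lambda) and then toward a scaled identity (parameter gamma). The sketch below is a simplified, unweighted convex combination; Friedman's original weights the lambda blend by class sample sizes, and the (lambda, gamma) values would be chosen by cross-validated misclassification risk, which is omitted here.

  import numpy as np

  def rda_covariance(S_k, S_pooled, lam, gamma):
      # Step 1: blend the class covariance toward the pooled estimate
      # (lam = 0 gives QDA-style, lam = 1 gives LDA-style covariances)
      p = S_k.shape[0]
      S_lam = (1.0 - lam) * S_k + lam * S_pooled
      # Step 2: shrink toward a scaled identity to stabilize small eigenvalues
      return (1.0 - gamma) * S_lam + gamma * (np.trace(S_lam) / p) * np.eye(p)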

A direct approach to sparse discriminant analysis in ultra-high dimensions

The theory shows that the proposed method can consistently identify the subset of discriminative features contributing to the Bayes rule and, at the same time, consistently estimate the Bayes classification direction, even when the dimension grows faster than any polynomial order of the sample size.
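
This direct approach can be recast as a penalized least-squares problem. The sketch below is a rough DSDA-style version using a lasso solver, where the label recoding constants are an assumption rather than the paper's exact construction.

  import numpy as np
  from sklearn.linear_model import Lasso

  def sparse_lda_direction(X, y, alpha=0.05):
      # Recode the binary labels so that least squares recovers the LDA
      # direction up to scale (assumed recoding; see the paper for details)
      n, n1, n2 = len(y), np.sum(y == 1), np.sum(y == 0)
      z = np.where(y == 1, n / n1, -n / n2)
      # The l1 penalty zeroes out non-discriminative features
      return Lasso(alpha=alpha).fit(X, z).coef_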

Sparsifying the Fisher linear discriminant by rotation

It is shown that a family of rotations can create the sparsity needed for high-dimensional classification, and a theoretical understanding of why such rotations work empirically is provided.

Sparse semiparametric discriminant analysis

A Direct Estimation Approach to Sparse Linear Discriminant Analysis

A simple and effective classifier is introduced by estimating the product Ωδ directly through constrained ℓ1 minimization; it has superior finite-sample performance and significant computational advantages over existing methods that require separate estimation of Ω and δ.
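
The constrained ℓ1 program described here can be written as: minimize ||beta||_1 subject to ||Sigma_hat @ beta - delta_hat||_inf <= lam. A minimal sketch with cvxpy follows; the default solver and the tuning parameter lam are illustrative assumptions (in practice lam would be chosen by cross-validation).

  import cvxpy as cp
  import numpy as np

  def direct_lda_direction(Sigma_hat, delta_hat, lam=0.1):
      # Estimate beta = Omega @ delta without ever inverting Sigma_hat:
      #   minimize ||beta||_1  s.t.  ||Sigma_hat @ beta - delta_hat||_inf <= lam
      p = Sigma_hat.shape[0]
      beta = cp.Variable(p)
      problem = cp.Problem(cp.Minimize(cp.norm1(beta)),
                           [cp.norm_inf(Sigma_hat @ beta - delta_hat) <= lam])
      problem.solve()
      return beta.value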

Optimal classification in sparse Gaussian graphic model

This work proposes a two-stage classification method in which features are first selected by Innovated Thresholding (IT), and the retained features are then used with Fisher's LDA for classification, adapting the recent innovation of Higher Criticism thresholding.

Penalized classification using Fisher's linear discriminant

  • D. Witten, R. Tibshirani
  • Computer Science
    Journal of the Royal Statistical Society. Series B, Statistical methodology
  • 2011
This work proposes penalized LDA, a general approach for penalizing the discriminant vectors in Fisher's discriminant problem in a way that leads to greater interpretability, and uses a minorization–maximization approach to optimize the problem efficiently when convex penalties are applied to the discriminant vectors.

A review of discriminant analysis in high dimensions

A brief description of the difficulties in extending LDA is provided, some successful proposals are presented, and various theoretical results, algorithms, and empirical findings supporting the application of these methods are discussed.