# Quadratic Discriminant Analysis for High-Dimensional Data

```bibtex
@article{Wu2019QuadraticDA,
  title   = {Quadratic Discriminant Analysis for High-Dimensional Data},
  author  = {Yilei Wu and Yingli Qin and Mu Zhu},
  journal = {Statistica Sinica},
  year    = {2019}
}
```

High-dimensional classification is an important and challenging statistical problem. We develop a set of quadratic discriminant rules by simplifying the structure of the covariance matrices instead of imposing sparsity assumptions — either on the covariance matrices themselves (or their inverses), or on the standardized between-class distance. Under moderate conditions on the population covariance matrices, our specialized quadratic discriminant rules enjoy good asymptotic properties…
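To make the setting concrete, here is a minimal sketch of the classical Gaussian QDA rule that the paper's specialized rules build on: a point is assigned to the class with the larger Gaussian log-density (plus log prior). This is the generic Bayes QDA rule, not the authors' simplified-covariance estimators; all names below are illustrative.

```python
import numpy as np

def qda_rule(x, mu0, mu1, Sigma0, Sigma1, pi0=0.5, pi1=0.5):
    """Classic Gaussian QDA: assign x to class 1 iff its Gaussian
    log-density (plus log prior) exceeds that of class 0."""
    def log_gauss(x, mu, Sigma):
        d = x - mu
        _, logdet = np.linalg.slogdet(Sigma)
        # log N(x; mu, Sigma) up to the shared -(p/2) log(2*pi) constant
        return -0.5 * (logdet + d @ np.linalg.solve(Sigma, d))

    score = (log_gauss(x, mu1, Sigma1) - log_gauss(x, mu0, Sigma0)
             + np.log(pi1 / pi0))
    return int(score > 0)
```

When `Sigma0 == Sigma1` the quadratic terms cancel and the rule reduces to LDA; the high-dimensional difficulty the paper addresses is that the sample covariance matrices plugged into this rule are singular or poorly conditioned when p is comparable to or larger than n.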

## 9 Citations

### Sparse quadratic classification rules via linear dimension reduction

- Computer Science · J. Multivar. Anal.
- 2019

### A review of quadratic discriminant analysis for high‐dimensional data

- Computer Science
- 2018

The review discusses the challenges, some existing works, and several possible future directions for high-dimensional QDA.

### Quadratic Discriminant Analysis under Moderate Dimension

- Mathematics
- 2018

Quadratic discriminant analysis (QDA) is a simple method to classify a subject into two populations, and was proven to perform as well as the Bayes rule when the data dimension p is fixed. The main…

### Phase Transitions for High-Dimensional Quadratic Discriminant Analysis with Rare and Weak Signals

- Mathematics, Computer Science
- 2021

The results suggest that the quadratic term has a major influence, relative to LDA, on both the classification decision and the classification accuracy, especially when μk and Ωk are both known; the phase transitions characterize when all classifiers fail.

### High‐dimensional covariance matrix estimation using a low‐rank and diagonal decomposition

- Mathematics, Computer Science · Canadian Journal of Statistics
- 2019

A block‐wise coordinate descent algorithm, which iteratively updates L and D, is proposed to compute the estimator in practice, and it is shown that the method can be applied to obtain enhanced solutions to the Markowitz portfolio selection problem.

### An Auto-Contouring Method for Kidney based on SVM

- Computer Science
- 2021

The experiment showed that the automatic contouring method based on support vector machines has better classification performance than most classification algorithms.

### Classification in High Dimension Using the Ledoit–Wolf Shrinkage Method

- Computer Science · Mathematics
- 2022

The Stein-type shrinkage estimation of Ledoit and Wolf is employed for high-dimensional data classification and its efficiency is numerically compared to existing methods, including LDA, cross-validation, gLasso, and SVM.
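The Stein-type shrinkage idea in that citation can be illustrated with a short sketch: shrink the (singular, when p > n) sample covariance toward a scaled identity target, which yields an invertible estimator usable in discriminant rules. This uses a fixed, hand-picked shrinkage weight for illustration; the actual Ledoit–Wolf method chooses the weight from the data, and the function name here is made up.

```python
import numpy as np

def shrink_cov(X, rho=0.2):
    """Stein-type shrinkage of the sample covariance toward a scaled
    identity: Sigma_hat = (1 - rho) * S + rho * (tr(S) / p) * I.

    rho is a fixed weight here; Ledoit-Wolf estimate it from the data.
    """
    n, p = X.shape
    S = np.cov(X, rowvar=False)       # p x p sample covariance, rank < p if n <= p
    mu = np.trace(S) / p              # scale of the identity target
    return (1.0 - rho) * S + rho * mu * np.eye(p)
```

Because the identity term adds a strictly positive amount to every eigenvalue, the shrunk estimator is positive definite even when n < p, which is exactly what a plug-in discriminant rule needs.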

## References

Showing 1–10 of 28 references.

### High-Dimensional Quadratic Classifiers in Non-sparse Settings

- Computer Science, Mathematics · Methodology and Computing in Applied Probability
- 2018

The quadratic classifiers proposed in this paper are shown to hold a consistency property: their misclassification rates tend to zero as the dimension goes to infinity under non-sparse settings.

### Sparse Quadratic Discriminant Analysis for High Dimensional Data

- Mathematics
- 2015

Many contemporary studies involve the classification of a subject into two classes based on n observations of the p variables associated with the subject. Under the assumption that the variables are…

### Regularized Discriminant Analysis

- Computer Science
- 1989

Alternatives to the usual maximum likelihood estimates for the covariance matrices are proposed, characterized by two parameters, the values of which are customized to individual situations by jointly minimizing a sample-based estimate of future misclassification risk.

### Sparsifying the Fisher linear discriminant by rotation

- Computer Science · Journal of the Royal Statistical Society. Series B, Statistical Methodology
- 2015

It is shown that a family of rotations does create the sparsity needed for high-dimensional classification, and a theoretical understanding of why such a rotation works empirically is provided.

### Sparse semiparametric discriminant analysis

- Computer Science · J. Multivar. Anal.
- 2015

### A Direct Estimation Approach to Sparse Linear Discriminant Analysis

- Computer Science
- 2011

A simple and effective classifier is introduced by estimating the product Ωδ directly through constrained ℓ1 minimization; it has superior finite-sample performance and significant computational advantages over existing methods that require separate estimation of Ω and δ.

### Optimal Classification in Sparse Gaussian Graphic Model

- Computer Science
- 2013

This work proposes a two-stage classification method in which features are first selected by Innovated Thresholding (IT), and the retained features are then used with Fisher's LDA for classification, adapting the recent innovation of Higher Criticism thresholding.

### Penalized classification using Fisher's linear discriminant

- Computer Science · Journal of the Royal Statistical Society. Series B, Statistical Methodology
- 2011

This work proposes penalized LDA, a general approach for penalizing the discriminant vectors in Fisher's discriminant problem in a way that leads to greater interpretability, and uses a minorization–maximization approach to optimize the problem efficiently when convex penalties are applied to the discriminant vectors.

### A review of discriminant analysis in high dimensions

- Computer Science
- 2013

A brief description of the difficulties in extending LDA is provided, some successful proposals are presented, and various theoretical results, algorithms, and empirical results that support the application of these methods are discussed.

### High Dimensional Classification Using Features Annealed Independence Rules.

- Computer Science · Annals of Statistics
- 2008

The conditions under which all the important features can be selected by the two-sample t-statistic are established, and the choice of the optimal number of features, or equivalently the threshold value of the test statistic, is proposed based on an upper bound of the classification error.