Corpus ID: 88514571

High-Dimensional Regularized Discriminant Analysis

John A. Ramey, Caleb K. Stein, Phil D. Young, and Dean M. Young. "High-Dimensional Regularized Discriminant Analysis." arXiv: Machine Learning.
Regularized discriminant analysis (RDA), proposed by Friedman (1989), is a widely used classifier, but it lacks interpretability and is impractical for high-dimensional data sets. Here, we present an interpretable and computationally efficient classifier called high-dimensional RDA (HDRDA), designed for the small-sample, high-dimensional setting. For HDRDA, we show that each training observation, regardless of class, contributes to the class covariance matrix, resulting in an interpretable…
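The abstract builds on Friedman's two-parameter regularization of the class covariance matrices. A minimal NumPy sketch of that classical scheme is below (this illustrates Friedman's RDA only, not the HDRDA estimator of the paper; the names `lam` and `gamma` and the function itself are illustrative):

```python
import numpy as np

def rda_covariance(X, y, lam, gamma):
    """Friedman-style regularized class covariance estimates.

    lam blends each class covariance toward the pooled covariance
    (so every training observation contributes to every class estimate);
    gamma then shrinks the result toward a scaled identity matrix.
    """
    classes = np.unique(y)
    n, p = X.shape
    # Pooled within-class covariance (biased, weighted by class size).
    pooled = sum(np.cov(X[y == k].T, bias=True) * (y == k).sum()
                 for k in classes) / n
    covs = {}
    for k in classes:
        S_k = np.cov(X[y == k].T, bias=True)
        S_lam = (1 - lam) * S_k + lam * pooled            # class -> pooled
        covs[k] = (1 - gamma) * S_lam \
            + gamma * (np.trace(S_lam) / p) * np.eye(p)   # -> scaled identity
    return covs
```

With `lam = 1` all classes share the pooled covariance (LDA-like); with `gamma = 1` each estimate collapses to a scaled identity, which is what makes the scheme usable when p exceeds the class sample sizes.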

Sparse quadratic classification rules via linear dimension reduction
This work proposes to perform simultaneous variable selection and linear dimension reduction on the original data, with the subsequent application of quadratic discriminant analysis on the reduced space, and confirms the crucial importance of the ESR1 gene in differentiating estrogen receptor status.
Better-than-chance classification for signal detection.
It is found that the probability of detecting differences between two distributions is lower for accuracy-based statistics, and it is suggested to replace V-fold cross-validation with the leave-one-out bootstrap.


Regularized discriminant analysis for high dimensional, low sample size data
A novel RDA algorithm for high-dimensional data is presented that efficiently estimates the optimal regularization parameters from a large set of candidates and performs favorably in classification compared with other existing methods.
A direct approach to sparse discriminant analysis in ultra-high dimensions
Sparse discriminant methods based on independence rules, such as the nearest shrunken centroids classifier (Tibshirani et al., 2002) and features annealed independence rules (Fan & Fan, 2008), have…
Penalized classification using Fisher's linear discriminant.
  • D. Witten, R. Tibshirani
  • Mathematics, Medicine
    Journal of the Royal Statistical Society. Series B, Statistical methodology
  • 2011
This work proposes penalized LDA, a general approach for penalizing the discriminant vectors in Fisher's discriminant problem in a way that leads to greater interpretability, and uses a minorization-maximization approach to optimize it efficiently when convex penalties are applied to the discriminant vectors.
A ROAD to Classification in High Dimensional Space.
  • Jianqing Fan, Yang Feng, X. Tong
  • Mathematics, Computer Science
    Journal of the Royal Statistical Society. Series B, Statistical methodology
  • 2012
Simulation studies and real data analysis support the theoretical results and demonstrate the advantages of the new classification procedure under a variety of correlation structures.
Computational and Theoretical Analysis of Null Space and Orthogonal Linear Discriminant Analysis
The main result shows that under a mild condition which holds in many applications involving high-dimensional data, NLDA is equivalent to OLDA, which confirms the effectiveness of the regularization in ROLDA.
High-Dimensional Discriminant Analysis
We propose a new discriminant analysis method for high-dimensional data, called High-Dimensional Discriminant Analysis (HDDA). Our approach is based on the assumption that high-dimensional data live…
Regularized mixture discriminant analysis
The experimental results show that the proposed Gaussian mixture model of the class-conditional densities for plug-in Bayes classification has the potential to produce parameterizations of the GMM covariance matrices that are better than those used in other methods.
A comparison of regularization methods applied to the linear discriminant function with high-dimensional microarray data
Classification of gene expression microarray data is important in the diagnosis of diseases such as cancer, but often the analysis of microarray data presents difficult challenges because the gene…
Improved mean estimation and its application to diagonal discriminant analysis
This article investigates the family of shrinkage estimators for the mean value under the quadratic loss function and proposes a shrinkage-based diagonal discriminant rule, which outperforms its original competitor in a wide range of settings.
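The entry above combines mean shrinkage with a diagonal discriminant rule. A generic James–Stein-type shrinkage of a class mean can be sketched as follows (illustrative only: the article's estimator family and its plug-in rule differ in detail; shrinkage toward zero is an assumption made here for simplicity):

```python
import numpy as np

def js_shrunk_mean(X):
    """James-Stein-type shrinkage of a sample mean toward the origin.

    Shrinks the p-dimensional sample mean by a data-driven factor in
    [0, 1]; the shrunken mean can then replace the raw mean inside a
    diagonal discriminant rule.
    """
    n, p = X.shape
    xbar = X.mean(axis=0)
    # Average estimated variance of the sample mean across coordinates.
    s2 = X.var(axis=0, ddof=1).mean() / n
    norm2 = np.dot(xbar, xbar)
    # Positive-part James-Stein shrinkage factor.
    shrink = max(0.0, 1.0 - (p - 2) * s2 / norm2) if norm2 > 0 else 0.0
    return shrink * xbar
```

When the signal is strong relative to the noise, the factor is close to 1 and the mean is barely altered; in low-signal, high-dimensional settings the factor pulls the estimate substantially toward zero, which is the regime where such rules outperform the unshrunken competitor.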
Regularized Gaussian Discriminant Analysis through Eigenvalue Decomposition
Friedman proposed a regularization technique (RDA) of discriminant analysis in the Gaussian framework. RDA uses two regularization parameters to design an intermediate classifier between the…