A Compressive Classification Framework for High-Dimensional Data

@article{Tabassum2020ACC,
  title={A Compressive Classification Framework for High-Dimensional Data},
  author={Muhammad Naveed Tabassum and Esa Ollila},
  journal={IEEE Open Journal of Signal Processing},
  year={2020},
  volume={1},
  pages={177-186}
}
We propose a compressive classification framework for settings where the data dimensionality is significantly larger than the sample size. The proposed method, referred to as compressive regularized discriminant analysis (CRDA), is based on linear discriminant analysis and selects significant features via joint-sparsity-promoting hard thresholding in the discriminant rule. Since the number of features is larger than the sample size, the method also uses state-of-the-art…
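To make the recipe concrete, here is a minimal Python sketch of a CRDA-style classifier under stated assumptions: scikit-learn's Ledoit-Wolf estimator stands in for the paper's regularized covariance estimators, the L2 norm ranks the rows of the coefficient matrix (the paper considers more general row norms), and class priors are taken as uniform. The names crda_fit and crda_predict are illustrative, not from the authors' code.

import numpy as np
from sklearn.covariance import LedoitWolf

def crda_fit(X, y, n_features_kept):
    """Sketch of a CRDA-style fit: regularized LDA whose coefficient
    matrix is made row-sparse by joint hard thresholding (illustrative)."""
    classes = np.unique(y)
    # Class means matrix M (p x G) and within-class centered samples.
    M = np.column_stack([X[y == g].mean(axis=0) for g in classes])
    Xc = np.vstack([X[y == g] - M[:, i] for i, g in enumerate(classes)])
    cov = LedoitWolf().fit(Xc).covariance_      # shrinkage covariance estimate
    B = np.linalg.solve(cov, M)                 # discriminant coefficients (p x G)
    # Joint-sparsity hard thresholding: keep the rows (features) with the
    # largest L2 norms across all classes; zero out every other row.
    row_norms = np.linalg.norm(B, axis=1)
    keep = np.argsort(row_norms)[-n_features_kept:]
    B_sparse = np.zeros_like(B)
    B_sparse[keep] = B[keep]
    return classes, M, B_sparse

def crda_predict(X, classes, M, B):
    # LDA rule with uniform priors: d_g(x) = x^T b_g - 0.5 * mu_g^T b_g.
    scores = X @ B - 0.5 * np.sum(M * B, axis=0)
    return classes[scores.argmax(axis=1)]

Thresholding whole rows, rather than individual entries, is what makes the selection joint: a feature is either used by all class discriminants or discarded entirely.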


References

Showing 1-10 of 37 references.
Compressive Regularized Discriminant Analysis of High-Dimensional Data with Applications to Microarray Studies
  Overall, the proposed method gives fewer misclassification errors than its competitors, while at the same time achieving accurate feature selection.
Penalized classification using Fisher's linear discriminant
  • D. Witten and R. Tibshirani • Journal of the Royal Statistical Society: Series B (Statistical Methodology) • 2011
  This work proposes penalized LDA, a general approach for penalizing the discriminant vectors in Fisher's discriminant problem in a way that leads to greater interpretability, and uses a minorization-maximization approach to optimize it efficiently when convex penalties are applied to the discriminant vectors.
Regularized linear discriminant analysis and its application in microarrays.
  Through both simulated and real-life data, it is shown that this method performs very well in multivariate classification problems, often outperforming the PAM method, and can be as competitive as support vector machine classifiers.
A New Reduced-Rank Linear Discriminant Analysis Method and Its Applications
  A new dimension-reduction tool with a flavor of supervised principal component analysis (PCA) is introduced; it is computationally efficient and can incorporate the correlation structure among the features.
High dimensional classification with combined adaptive sparse PLS and logistic regression
  A computationally stable and convergent approach for classification in high dimensions based on sparse partial least squares (sparse PLS) is proposed, combining iterative optimization of logistic regression and sparse PLS to ensure computational convergence and stability (a simplified compress-then-classify sketch follows this entry).
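As a rough illustration only of the compress-then-classify idea (the adaptive sparsity and the iterative coupling between the two stages in the cited paper are omitted), one can project onto a few supervised PLS components and fit a logistic model on the scores:

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression

# Toy high-dimensional binary problem: n = 60 samples, p = 500 features.
rng = np.random.default_rng(1)
X = rng.standard_normal((60, 500))
y = (X[:, :5].sum(axis=1) > 0).astype(int)      # labels driven by 5 features

pls = PLSRegression(n_components=3).fit(X, y)   # supervised compression
Z = pls.transform(X)                            # latent scores (n x 3)
clf = LogisticRegression().fit(Z, y)            # classify in the low-dim space
print("training accuracy:", clf.score(Z, y))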
Shrinkage Algorithms for MMSE Covariance Estimation
  This work improves on the Ledoit-Wolf method by conditioning on a sufficient statistic and proposes an iterative approach that approximates the clairvoyant shrinkage estimator, referred to as the oracle approximating shrinkage (OAS) estimator (see the short example below).
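scikit-learn implements this OAS estimator (sklearn.covariance.OAS, which follows the Chen et al. paper) alongside the Ledoit-Wolf one, so the two shrinkage intensities are easy to compare on a p > n sample; a minimal check on random Gaussian data:

import numpy as np
from sklearn.covariance import OAS, LedoitWolf

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 100))   # n = 30 samples, p = 100 features

oas = OAS().fit(X)                   # oracle approximating shrinkage
lw = LedoitWolf().fit(X)
print("OAS shrinkage intensity:        ", oas.shrinkage_)
print("Ledoit-Wolf shrinkage intensity:", lw.shrinkage_)

Both return a covariance estimate of the form (1 - rho) * S + rho * (tr(S)/p) * I; only the data-driven choice of rho differs.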
Optimal Shrinkage Covariance Matrix Estimation Under Random Sampling From Elliptical Distributions
  A regularized sample covariance matrix (RSCM) estimator is proposed that can be applied in commonly occurring high-dimensional data problems and often shows a significant improvement over the conventional RSCM estimator of Ledoit and Wolf (2004).
Generalized Robust Shrinkage Estimator and Its Application to STAP Detection Problem
  By analyzing this solution, called the generalized robust shrinkage estimator, it is proved that this solution converges to a unique solution as the shrinkage parameter (loading factor) tends to 0 (a fixed-point sketch follows this entry).
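The estimator is defined through a fixed-point equation of the form Sigma = (1 - rho) * (p/n) * sum_i x_i x_i^T / (x_i^T Sigma^{-1} x_i) + rho * I; a minimal NumPy sketch of that iteration follows (assuming zero-mean data; the tolerance and iteration cap are illustrative choices, not from the paper):

import numpy as np

def shrinkage_tyler(X, rho, n_iter=100, tol=1e-8):
    """Fixed-point iteration for a Tyler-type scatter estimator with
    shrinkage toward the identity (illustrative sketch)."""
    n, p = X.shape
    Sigma = np.eye(p)
    for _ in range(n_iter):
        inv = np.linalg.inv(Sigma)
        # Quadratic forms x_i^T Sigma^{-1} x_i for all samples at once.
        d = np.einsum('ij,jk,ik->i', X, inv, X)
        S = (p / n) * (X.T / d) @ X          # weighted sample scatter
        Sigma_new = (1 - rho) * S + rho * np.eye(p)
        if np.linalg.norm(Sigma_new - Sigma, 'fro') <= tol * np.linalg.norm(Sigma, 'fro'):
            return Sigma_new
        Sigma = Sigma_new
    return Sigma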
Optimal high-dimensional shrinkage covariance estimation for elliptical distributions
  • E. Ollila • 25th European Signal Processing Conference (EUSIPCO) • 2017
  The proposed shrinkage SCM estimator often provides significantly better performance than the Ledoit-Wolf estimator and has the advantage that consistency is guaranteed over the whole class of elliptical distributions with finite fourth-order moments.