# Compressive Classification of a Mixture of Gaussians: Analysis, Designs and Geometrical Interpretation

```bibtex
@article{Reboredo2014CompressiveCO,
  title={Compressive Classification of a Mixture of Gaussians: Analysis, Designs and Geometrical Interpretation},
  author={Hugo Reboredo and Francesco Renna and A. Robert Calderbank and Miguel R. D. Rodrigues},
  journal={ArXiv},
  year={2014},
  volume={abs/1401.6962}
}
```

This paper derives fundamental limits on the performance of compressive classification when the source is a mixture of Gaussians. It provides an asymptotic analysis of a Bhattacharyya-based upper bound on the misclassification probability for the optimal Maximum-A-Posteriori (MAP) classifier that depends on quantities dual to the concepts of diversity-order and coding gain in multi-antenna communications. The diversity-order of the measurement system determines the rate at which the…
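The Bhattacharyya-based bound referred to in the abstract has a simple closed form for two Gaussian classes observed through a noisy linear projection y = Ax + w. The sketch below evaluates that bound for a random Gaussian measurement matrix; the dimensions and class parameters are illustrative choices, not the paper's designs:

```python
import numpy as np

def bhattacharyya_bound(A, mu0, mu1, S0, S1, noise_var=0.01, p0=0.5):
    """Bhattacharyya upper bound on two-class MAP error for y = A x + w."""
    m0, m1 = A @ mu0, A @ mu1
    C0 = A @ S0 @ A.T + noise_var * np.eye(A.shape[0])
    C1 = A @ S1 @ A.T + noise_var * np.eye(A.shape[0])
    Cbar = 0.5 * (C0 + C1)
    d = m1 - m0
    # Bhattacharyya distance: mean term plus log-determinant term.
    K = 0.125 * d @ np.linalg.solve(Cbar, d)
    K += 0.5 * (np.linalg.slogdet(Cbar)[1]
                - 0.5 * (np.linalg.slogdet(C0)[1] + np.linalg.slogdet(C1)[1]))
    # P_err <= sqrt(p0 * p1) * exp(-K), so never larger than 1/2.
    return np.sqrt(p0 * (1 - p0)) * np.exp(-K)

rng = np.random.default_rng(0)
n, m = 20, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian measurement matrix
mu0, mu1 = np.zeros(n), np.ones(n)
S0 = S1 = np.eye(n)
print(bhattacharyya_bound(A, mu0, mu1, S0, S1))
```

Since the Bhattacharyya distance K is nonnegative, the bound always lies in (0, 1/2]; how fast it decays with the noise level is what the paper's diversity-order analysis characterizes.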

## 19 Citations

### Projections designs for compressive classification

- Computer Science, Mathematics
- 2013 IEEE Global Conference on Signal and Information Processing
- 2013

This paper capitalizes on the asymptotic characterization of the behavior of an (upper bound to the) misclassification probability associated with the optimal Maximum-A-Posteriori classifier to construct measurement designs that maximize the diversity-order of the measurement model.

### Mismatch in the Classification of Linear Subspaces: Sufficient Conditions for Reliable Classification

- Computer Science, Mathematics
- IEEE Transactions on Signal Processing
- 2016

Numerical results demonstrate that the conditions for reliable classification can sharply predict the behavior of a mismatched classifier, both with synthetic data and in motion segmentation and hand-written digit classification applications.

### Compressive Classification: Where Wireless Communications Meets Machine Learning

- Computer Science
- 2015

Shannon-inspired performance limits associated with the classification of low-dimensional subspaces embedded in a high-dimensional ambient space from compressive and noisy measurements are introduced, and theory is shown to align with practice in a concrete application: face recognition from a set of noisy compressive measurements.

### Classification and reconstruction of compressed GMM signals with side information

- Computer Science
- 2015 IEEE International Symposium on Information Theory (ISIT)
- 2015

This paper offers a characterization of performance limits for classification and reconstruction of high-dimensional signals from noisy compressive measurements, in the presence of side information.…

### Discrimination on the grassmann manifold: Fundamental limits of subspace classifiers

- Computer Science
- 2014 IEEE International Symposium on Information Theory
- 2014

Repurposing tools and intuitions from Shannon theory, we derive fundamental limits on the reliable classification of high-dimensional signals from low-dimensional features. We focus on the…

### Classification and Reconstruction of High-Dimensional Signals From Low-Dimensional Features in the Presence of Side Information

- Computer Science
- IEEE Transactions on Information Theory
- 2016

The framework, which offers a principled mechanism to integrate side information in high-dimensional data problems, is tested in the context of imaging applications, reporting state-of-the-art results in compressive hyperspectral imaging.

### The Role of Principal Angles in Subspace Classification

- Computer Science
- IEEE Transactions on Signal Processing
- 2016

The transform presented here (TRAIT) preserves specific characteristics of each individual class, and this approach is shown to be complementary to a previously developed transform (LRT) that enlarges inter-class distance while suppressing intra-class dispersion.

### A general framework for reconstruction and classification from compressive measurements with side information

- Computer Science
- 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2016

A general framework for compressive linear-projection measurements with side information is developed, and the presence of side information is shown to yield improved performance, both theoretically and experimentally.

### Mismatch in the Classification of Linear Subspaces: Sufficient Conditions for Reliable Classification

- Computer Science, Mathematics
- 2022

Numerical results demonstrate that the conditions for reliable classification in the low-noise regime can sharply predict the behavior of a mismatched classifier, both with synthetic data and in motion segmentation and hand-written digit classification applications.

### Source Separation With Side Information Based on Gaussian Mixture Models With Application in Art Investigation

- Computer Science
- IEEE Transactions on Signal Processing
- 2020

This paper proposes an algorithm for source separation with side information, where one observes the linear superposition of two source signals plus two additional signals correlated with the mixed ones, and describes necessary and sufficient conditions for reliable source separation in the low-noise asymptotic regime.

## References

Showing 1–10 of 85 references

### Information-theoretic limits on the classification of Gaussian mixtures: Classification on the Grassmann manifold

- Computer Science
- 2013 IEEE Information Theory Workshop (ITW)
- 2013

This work defines the classification capacity, which quantifies the maximum number of classes that can be discriminated with low probability of error, and the diversity-discrimination tradeoff, and identifies a duality between classification and communications over non-coherent multiple-antenna channels.

### Reconstruction of Signals Drawn From a Gaussian Mixture Via Noisy Compressive Measurements

- Computer Science
- IEEE Transactions on Signal Processing
- 2014

The bounds are tighter and sharper than standard bounds on the minimum number of measurements needed to recover sparse signals associated with a union-of-subspaces model, as they are not asymptotic in the signal dimension or signal sparsity.

### Reconstruction of Signals Drawn from a Gaussian Mixture from Noisy Compressive Measurements: MMSE Phase Transitions and Beyond

- Computer Science
- ArXiv
- 2013

The method not only reveals the existence or absence of a minimum mean-squared error (MMSE) error floor (phase transition) but also provides insight into the MMSE decay via multivariate generalizations of the MMSE dimension and the MMSE power offset that are a function of the interaction between the geometrical properties of the kernel and the Gaussian mixture.

### Performance Limits of Compressive Sensing-Based Signal Classification

- Computer Science
- IEEE Transactions on Signal Processing
- 2012

Performance limits of classification of sparse as well as not necessarily sparse signals based on compressive measurements are provided, and it is shown that Kullback-Leibler and Chernoff distances between the probability density functions under any two hypotheses are preserved up to a factor of M/N.
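The M/N preservation claim is easy to check numerically in a toy case. For two Gaussian hypotheses with identity covariance in N dimensions, the KL divergence is ‖δ‖²/2 for mean difference δ; after projecting with a random matrix of M orthonormal rows it becomes ‖Aδ‖²/2, whose average over random projections is (M/N)‖δ‖²/2. A seeded sketch with illustrative dimensions (not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, trials = 100, 25, 2000
delta = rng.standard_normal(N)           # mean difference between the two hypotheses
kl_full = delta @ delta / 2.0            # KL of N(mu0, I) vs N(mu1, I) in ambient space

ratios = []
for _ in range(trials):
    G = rng.standard_normal((N, M))
    Q, _ = np.linalg.qr(G)               # orthonormal columns; A = Q.T has orthonormal rows
    A = Q.T
    kl_proj = (A @ delta) @ (A @ delta) / 2.0
    ratios.append(kl_proj / kl_full)

print(np.mean(ratios))                   # concentrates near M / N = 0.25
```

The average ratio concentrates around M/N, matching the factor quoted in the abstract.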

### Statistical Compressed Sensing of Gaussian Mixture Models

- Computer Science
- IEEE Transactions on Signal Processing
- 2011

In real image sensing applications, GMM-based SCS is shown to lead to improved results compared to conventional CS, at a considerably lower computational cost.
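A key convenience of Gaussian priors in statistical compressed sensing is that the MMSE estimator is linear and closed form: for x ~ N(μ, Σ) and y = Ax + w, the posterior mean is μ + ΣAᵀ(AΣAᵀ + σ²I)⁻¹(y − Aμ). The sketch below shows this single-component case with an illustrative near-low-rank prior; a full GMM decoder would weight per-component estimates by posterior component probabilities:

```python
import numpy as np

def gaussian_mmse(y, A, mu, Sigma, noise_var):
    """Closed-form MMSE estimate of x from y = A x + w, for x ~ N(mu, Sigma)."""
    C = A @ Sigma @ A.T + noise_var * np.eye(A.shape[0])
    return mu + Sigma @ A.T @ np.linalg.solve(C, y - A @ mu)

rng = np.random.default_rng(2)
n, m = 30, 12
U = rng.standard_normal((n, 4))
Sigma = U @ U.T + 0.01 * np.eye(n)       # signal energy concentrated in a 4-dim subspace
mu = np.zeros(n)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = rng.multivariate_normal(mu, Sigma)
y = A @ x + 0.05 * rng.standard_normal(m)
x_hat = gaussian_mmse(y, A, mu, Sigma, 0.05**2)
print(np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```

Because the prior concentrates the signal in a low-dimensional subspace, m = 12 measurements of an n = 30 dimensional signal still allow a good linear reconstruction.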

### Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting

- Computer Science
- IEEE Transactions on Information Theory
- 2009

For a noisy linear observation model based on random measurement matrices drawn from a general Gaussian ensemble, this paper derives both a set of sufficient conditions for exact support recovery using an exhaustive search decoder, as well as a set of necessary conditions that any decoder must satisfy for exact support set recovery.

### The Sampling Rate-Distortion Tradeoff for Sparsity Pattern Recovery in Compressed Sensing

- Computer Science
- IEEE Transactions on Information Theory
- 2012

It is shown that recovery with an arbitrarily small but constant fraction of errors is, however, possible, and that in some cases computationally simple estimators are near-optimal.

### Shannon-Theoretic Limits on Noisy Compressive Sampling

- Computer Science, Mathematics
- IEEE Transactions on Information Theory
- 2010

It is proved that O(L) (an asymptotically linear multiple of L) measurements are necessary and sufficient for signal recovery, whenever L grows linearly as a function of M.

### The smashed filter for compressive classification and target recognition

- Computer Science
- Electronic Imaging
- 2007

A framework for compressive classification that operates directly on the compressive measurements without first reconstructing the image is proposed, and the effectiveness of the smashed filter for target classification using very few measurements is demonstrated.
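The idea of classifying directly in the measurement domain can be sketched in a few lines. This is a stripped-down, hypothetical variant that picks the class whose projected template best matches the measurements; the actual smashed filter additionally searches over target transformations such as shifts and rotations:

```python
import numpy as np

def smashed_filter(y, A, templates):
    """Return the index of the template whose projection best matches y."""
    scores = [np.linalg.norm(y - A @ t) for t in templates]
    return int(np.argmin(scores))

rng = np.random.default_rng(3)
n, m = 64, 8                              # only 8 measurements of a 64-dim signal
templates = [rng.standard_normal(n) for _ in range(5)]
A = rng.standard_normal((m, n)) / np.sqrt(m)
true_class = 3
y = A @ templates[true_class] + 0.01 * rng.standard_normal(m)
print(smashed_filter(y, A, templates))
```

Note that no reconstruction step appears anywhere: the templates are compared to the data entirely in the m-dimensional measurement space.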

### Necessary and Sufficient Conditions for Sparsity Pattern Recovery

- Computer Science
- IEEE Transactions on Information Theory
- 2009

A new necessary condition on the number of measurements for asymptotically reliable detection with maximum-likelihood (ML) estimation and Gaussian measurement matrices is derived, and it is shown that the gap between thresholding and ML can be described by a simple expression in terms of the total signal-to-noise ratio (SNR).