Corpus ID: 18367633

Compressive Classification of a Mixture of Gaussians: Analysis, Designs and Geometrical Interpretation

@article{Reboredo2014CompressiveCO,
  title={Compressive Classification of a Mixture of Gaussians: Analysis, Designs and Geometrical Interpretation},
  author={Hugo Reboredo and Francesco Renna and A. Robert Calderbank and Miguel R. D. Rodrigues},
  journal={ArXiv},
  year={2014},
  volume={abs/1401.6962}
}
This paper derives fundamental limits on the performance of compressive classification when the source is a mixture of Gaussians. It provides an asymptotic analysis of a Bhattacharyya-based upper bound on the misclassification probability for the optimal Maximum-A-Posteriori (MAP) classifier that depends on quantities that are dual to the concepts of diversity-order and coding gain in multi-antenna communications. The diversity-order of the measurement system determines the rate at which the…
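
The sentence is cut off, but in this line of work the diversity order characterizes how quickly the misclassification probability (or its upper bound) decays as the measurement noise vanishes. Below is a hedged sketch of that low-noise characterization, with notation assumed here rather than taken verbatim from the paper: $\bar{P}_{\mathrm{err}}$ is the error bound, $\sigma^2$ the noise variance, $d$ the diversity order, and $g$ the gain dual to a coding gain.

```latex
% Hedged sketch in assumed notation (not the paper's exact statement).
\bar{P}_{\mathrm{err}}(\sigma^{2}) \;\approx\; g \cdot \left(\sigma^{2}\right)^{d}
\quad \text{as } \sigma^{2} \to 0,
\qquad
d \;=\; \lim_{\sigma^{2}\to 0} \frac{\log \bar{P}_{\mathrm{err}}(\sigma^{2})}{\log \sigma^{2}} .
```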

Projections designs for compressive classification

This paper capitalizes on the asymptotic characterization of the behavior of an (upper bound on the) misclassification probability associated with the optimal Maximum-A-Posteriori classifier to construct measurement designs that maximize the diversity-order of the measurement model.
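
As a concrete, hedged illustration of the kind of score such designs trade off, the sketch below evaluates a Bhattacharyya-based upper bound on the two-class MAP misclassification probability for zero-mean Gaussian classes observed through a candidate projection with additive Gaussian noise; candidate designs could be compared by this number. The function names, random designs, and toy dimensions are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def bhattacharyya_gaussians(mu1, S1, mu2, S2):
    """Bhattacharyya distance between N(mu1, S1) and N(mu2, S2)."""
    S = 0.5 * (S1 + S2)
    dmu = mu1 - mu2
    mean_term = 0.125 * dmu @ np.linalg.solve(S, dmu)
    # log-determinants via slogdet for numerical stability
    _, logdet_S = np.linalg.slogdet(S)
    _, logdet_S1 = np.linalg.slogdet(S1)
    _, logdet_S2 = np.linalg.slogdet(S2)
    cov_term = 0.5 * (logdet_S - 0.5 * (logdet_S1 + logdet_S2))
    return mean_term + cov_term

def map_error_bound(Phi, S1, S2, sigma2):
    """Bhattacharyya upper bound on the misclassification probability of the
    MAP classifier for two equiprobable zero-mean Gaussian classes observed
    as y = Phi @ x + noise, with noise ~ N(0, sigma2 * I)."""
    m = Phi.shape[0]
    S1y = Phi @ S1 @ Phi.T + sigma2 * np.eye(m)
    S2y = Phi @ S2 @ Phi.T + sigma2 * np.eye(m)
    B = bhattacharyya_gaussians(np.zeros(m), S1y, np.zeros(m), S2y)
    return 0.5 * np.exp(-B)  # P_err <= sqrt(p1 * p2) * exp(-B) = 0.5 * exp(-B)

# Toy comparison of two random measurement designs (illustrative only).
rng = np.random.default_rng(0)
n, m, r = 20, 6, 3
U = rng.standard_normal((n, r))
V = rng.standard_normal((n, r))
S1, S2 = U @ U.T, V @ V.T  # rank-r class covariances
for trial in range(2):
    Phi = rng.standard_normal((m, n)) / np.sqrt(m)
    print(f"design {trial}: bound = {map_error_bound(Phi, S1, S2, sigma2=1e-3):.3e}")
```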

Mismatch in the Classification of Linear Subspaces: Sufficient Conditions for Reliable Classification

Numerical results demonstrate that the conditions for reliable classification in the low-noise regime can sharply predict the behavior of a mismatched classifier, both with synthetic data and in motion segmentation and handwritten digit classification applications.

Compressive Classification: Where Wireless Communications Meets Machine Learning

Shannon-inspired performance limits associated with the classification of low-dimensional subspaces embedded in a high-dimensional ambient space from compressive and noisy measurements are introduced, and the theory is shown to align with practice in a concrete application: face recognition from a set of noisy compressive measurements.

Classification and reconstruction of compressed GMM signals with side information

This paper offers a characterization of performance limits for classification and reconstruction of high-dimensional signals from noisy compressive measurements, in the presence of side information.

Discrimination on the grassmann manifold: Fundamental limits of subspace classifiers

Repurposing tools and intuitions from Shannon theory, we derive fundamental limits on the reliable classification of high-dimensional signals from low-dimensional features. We focus on the…

Classification and Reconstruction of High-Dimensional Signals From Low-Dimensional Features in the Presence of Side Information

The framework, which offers a principled mechanism to integrate side information in high-dimensional data problems, is tested in the context of imaging applications and reports state-of-the-art results in compressive hyperspectral imaging.

The Role of Principal Angles in Subspace Classification

The transform presented here (TRAIT) preserves specific characteristics of each individual class, and this approach is shown to be complementary to a previously developed transform (LRT) that enlarges inter-class distance while suppressing intra-class dispersion.

A general framework for reconstruction and classification from compressive measurements with side information

A general framework for compressive linear-projection measurements with side information is developed, and the presence of side information is shown to yield improved performance, both theoretically and experimentally.

Source Separation With Side Information Based on Gaussian Mixture Models With Application in Art Investigation

This paper proposes an algorithm for source separation with side information, where one observes the linear superposition of two source signals plus two additional signals that are correlated with the mixed ones, and describes necessary and sufficient conditions for reliable source separation in the low-noise asymptotic regime.

References

Showing 1-10 of 85 references

Information-theoretic limits on the classification of Gaussian mixtures: Classification on the Grassmann manifold

This work defines the classification capacity, which quantifies the maximum number of classes that can be discriminated with low probability of error, and the diversity-discrimination tradeoff, and identifies a duality between classification and communications over non-coherent multiple-antenna channels.

Reconstruction of Signals Drawn From a Gaussian Mixture Via Noisy Compressive Measurements

The derived bounds are tighter and sharper than standard bounds on the minimum number of measurements needed to recover sparse signals associated with a union-of-subspaces model, as they are not asymptotic in the signal dimension or signal sparsity.

Reconstruction of Signals Drawn from a Gaussian Mixture from Noisy Compressive Measurements: MMSE Phase Transitions and Beyond

The method not only reveals the existence or absence of a minimum mean-squared error (MMSE) error floor (phase transition) but also provides insight into the MMSE decay via multivariate generalizations of the MMSE dimension and the MMSE power offset that are a function of the interaction between the geometrical properties of the kernel and the Gaussian mixture.
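
As context for the MMSE quantities mentioned above, the per-component building block is the standard linear/Gaussian (Wiener) estimator; the expressions below are a hedged sketch in assumed notation, not the paper's bounds. For a single component $x \sim \mathcal{N}(\mu, \Sigma)$ observed through $y = \Phi x + w$ with $w \sim \mathcal{N}(0, \sigma^{2} I)$:

```latex
\hat{x}_{\mathrm{MMSE}}(y) \;=\; \mu + \Sigma \Phi^{\top}\!\left(\Phi \Sigma \Phi^{\top} + \sigma^{2} I\right)^{-1}\!\left(y - \Phi \mu\right),
\qquad
\mathrm{MMSE}(\sigma^{2}) \;=\; \operatorname{tr}\!\left(\Sigma - \Sigma \Phi^{\top}\!\left(\Phi \Sigma \Phi^{\top} + \sigma^{2} I\right)^{-1}\!\Phi \Sigma\right).
```

For a Gaussian mixture, the conditional mean is a posterior-weighted combination of such per-component estimates, which is where the interaction between the kernel $\Phi$ and the mixture geometry enters.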

Performance Limits of Compressive Sensing-Based Signal Classification

Performance limits of classification of sparse as well as not necessarily sparse signals based on compressive measurements are provided, and it is shown that the Kullback-Leibler and Chernoff distances between the probability density functions under any two hypotheses are preserved up to a factor of M/N.
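
In symbols, with $p_1, p_2$ the densities in the ambient dimension $N$, $\tilde{p}_1, \tilde{p}_2$ the corresponding densities of the $M$ compressive measurements, and $D$ either the Kullback-Leibler or the Chernoff distance, the reported scaling reads (a restatement of the sentence above, in assumed notation):

```latex
D\!\left(\tilde{p}_{1} \,\|\, \tilde{p}_{2}\right) \;\approx\; \frac{M}{N}\, D\!\left(p_{1} \,\|\, p_{2}\right).
```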

Statistical Compressed Sensing of Gaussian Mixture Models

In real image sensing applications, GMM-based SCS is shown to lead to improved results compared to conventional CS, at a considerably lower computational cost.
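
A minimal sketch of the GMM-based decoding idea referenced here, under assumed details: pick the mixture component with the largest posterior given the compressive measurements, then apply that component's linear conditional-mean (Wiener) estimator. The function name, toy dimensions, and the hard component selection (rather than, say, an EM-style refinement) are simplifying assumptions for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

def gmm_scs_decode(y, Phi, weights, means, covs, sigma2):
    """Select the MAP Gaussian component given y = Phi @ x + noise, then
    reconstruct x with that component's conditional-mean (Wiener) estimator."""
    m = Phi.shape[0]
    log_post = []
    for w, mu, S in zip(weights, means, covs):
        Sy = Phi @ S @ Phi.T + sigma2 * np.eye(m)  # measurement-domain covariance
        log_post.append(np.log(w) + multivariate_normal.logpdf(y, mean=Phi @ mu, cov=Sy))
    k = int(np.argmax(log_post))  # MAP component index
    mu, S = means[k], covs[k]
    Sy = Phi @ S @ Phi.T + sigma2 * np.eye(m)
    x_hat = mu + S @ Phi.T @ np.linalg.solve(Sy, y - Phi @ mu)
    return x_hat, k

# Toy usage: the signal is drawn from one of two random low-rank components.
rng = np.random.default_rng(1)
n, m, r = 32, 10, 4
covs = [A @ A.T + 1e-6 * np.eye(n)
        for A in (rng.standard_normal((n, r)), rng.standard_normal((n, r)))]
means = [np.zeros(n), np.zeros(n)]
x = rng.multivariate_normal(means[0], covs[0])
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ x + np.sqrt(1e-3) * rng.standard_normal(m)
x_hat, k = gmm_scs_decode(y, Phi, [0.5, 0.5], means, covs, sigma2=1e-3)
print(k, np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```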

Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting

M. Wainwright, IEEE Transactions on Information Theory, 2009
For a noisy linear observation model based on random measurement matrices drawn from general Gaussian ensembles, this paper derives a set of sufficient conditions for exact support recovery using an exhaustive-search decoder, as well as a set of necessary conditions that any decoder must satisfy for exact support recovery.

The Sampling Rate-Distortion Tradeoff for Sparsity Pattern Recovery in Compressed Sensing

It is shown that recovery with an arbitrarily small but constant fraction of errors is possible, and that in some cases computationally simple estimators are near-optimal.

Shannon-Theoretic Limits on Noisy Compressive Sampling

It is proved that O(L) (an asymptotically linear multiple of L) measurements are necessary and sufficient for signal recovery, whenever L grows linearly as a function of M.

The smashed filter for compressive classification and target recognition

A framework for compressive classification that operates directly on the compressive measurements without first reconstructing the image is proposed, and the effectiveness of the smashed filter for target classification using very few measurements is demonstrated.
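
A hedged sketch of the idea, reduced to its simplest form: compare the compressive measurements against compressively projected templates and pick the closest one, with no reconstruction step. The actual smashed filter additionally searches over the target's articulation parameters along an appearance manifold; the names and toy dimensions below are illustrative assumptions.

```python
import numpy as np

def smashed_filter_classify(y, Phi, templates):
    """Nearest projected template: classify y = Phi @ x by comparing it to
    Phi @ t for each candidate template t, without reconstructing x."""
    dists = [np.linalg.norm(y - Phi @ t) for t in templates]
    return int(np.argmin(dists))

# Toy usage: measure a noisy copy of one of three templates with m << n.
rng = np.random.default_rng(2)
n, m = 256, 24
templates = [rng.standard_normal(n) for _ in range(3)]
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x = templates[1] + 0.05 * rng.standard_normal(n)
y = Phi @ x
print(smashed_filter_classify(y, Phi, templates))  # prints 1 with high probability
```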

Necessary and Sufficient Conditions for Sparsity Pattern Recovery

A new necessary condition on the number of measurements for asymptotically reliable detection with maximum-likelihood (ML) estimation and Gaussian measurement matrices is derived, and it is shown that the gap between thresholding and ML can be described by a simple expression in terms of the total signal-to-noise ratio (SNR).
...