• Corpus ID: 8650707

Subspace-Sparse Representation

  • Chong You, René Vidal
Given an overcomplete dictionary $A$ and a signal $b$ that is a linear combination of a few linearly independent columns of $A$, classical sparse recovery theory deals with the problem of recovering the unique sparse representation $x$ such that $b = A x$. It is known that under certain conditions on $A$, $x$ can be recovered by the Basis Pursuit (BP) and the Orthogonal Matching Pursuit (OMP) algorithms. In this work, we consider the more general case where $b$ lies in a low-dimensional… 
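To make the setup concrete, here is a minimal NumPy sketch of OMP recovering a 2-sparse signal $x$ from $b = Ax$. The identity-plus-Hadamard dictionary is chosen only because its mutual coherence ($1/4$) provably guarantees exact 2-sparse recovery under the classical coherence condition; the `omp` helper is an illustrative implementation, not the authors' code:

```python
import numpy as np

def omp(A, b, k, tol=1e-10):
    """Orthogonal Matching Pursuit: greedily select up to k columns of A to represent b."""
    residual, support = b.copy(), []
    x = np.zeros(A.shape[1])
    for _ in range(k):  # assumes k >= 1
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # Least-squares fit on the selected support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coef
    return x

# Dictionary [I | H/4]: identity plus a normalized 16x16 Sylvester-Hadamard basis.
H = np.array([[1.0]])
for _ in range(4):
    H = np.block([[H, H], [H, -H]])
A = np.hstack([np.eye(16), H / 4.0])  # mutual coherence = 1/4

# A 2-sparse signal mixing one atom from each basis.
x_true = np.zeros(32)
x_true[3], x_true[21] = 2.0, -1.0
b = A @ x_true

x_hat = omp(A, b, k=2)
print(np.allclose(x_hat, x_true))  # exact recovery
```

Since $k = 2 < \tfrac{1}{2}(1 + 1/\mu)$ with $\mu = 1/4$, the coherence-based guarantees of Donoho-Elad and Tropp ensure both BP and OMP recover this $x$ exactly.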


Geometric Conditions for Subspace-Sparse Recovery
Conditions under which existing pursuit methods recover a subspace-sparse representation are studied, which reveal important geometric insights and have implications for the theory of classical sparse recovery as well as subspace clustering.
A Critique of Self-Expressive Deep Subspace Clustering
It is shown that a significant portion of the previously claimed performance benefits can be attributed to an ad-hoc post-processing step rather than to the clustering model itself, which can in fact produce a degenerate embedding of the data that need not correspond to a union of subspaces at all.
Sparse Subspace Clustering-Based Feature Extraction for PolSAR Imagery Classification
This paper combines sparse representation, low-rank representation, and manifold graphs to investigate the intrinsic properties of PolSAR data, and demonstrates that the proposed algorithms outperform other state-of-the-art linear and nonlinear approaches, with better quantitative and visualization performance on PolSAR data from spaceborne and airborne platforms.
Reverse Engineering ℓp Attacks: A Block-Sparse Optimization Approach with Recovery Guarantees
This paper derives geometric conditions on the subspaces under which any attacked signal can be decomposed as the sum of a clean signal plus an attack and recovers the clean signal.
Oracle Based Active Set Algorithm for Scalable Elastic Net Subspace Clustering
This paper studies the geometry of the elastic net regularizer and uses it to derive a provably correct and scalable active set method for finding the optimal coefficients and provides a theoretical justification and a geometric interpretation for the balance between the connectedness and subspace-preserving properties for elastic net subspace clustering.
Atomic Representation-Based Classification: Theory, Algorithm, and Applications
The theoretical guarantees for a general unified framework termed as atomic representation-based classification (ARC), which includes most RC methods as special cases, are established and a new condition called atomic classification condition (ACC) is introduced, which reveals important geometric insights for the theory of ARC.


Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
  • D. Donoho, Michael Elad
  • Computer Science
    Proceedings of the National Academy of Sciences of the United States of America
  • 2003
This article obtains parallel results in a more general setting, where the dictionary D can arise from two or several bases, frames, or even less structured systems, and sketches three applications: separating linear features from planar ones in 3D data, noncooperative multiuser encoding, and identification of over-complete independent component models.
Stable recovery of sparse overcomplete representations in the presence of noise
This paper establishes the possibility of stable recovery under a combination of sufficient sparsity and favorable structure of the overcomplete system and shows that similar stability is also available using the basis and the matching pursuit algorithms.
From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
The aim of this paper is to introduce a few key notions and applications connected to sparsity, targeting newcomers interested in either the mathematical aspects of this area or its applications.
Approximate Subspace-Sparse Recovery in the Presence of Corruptions via $\ell_1$-Minimization
This paper considers a constrained $\ell_1$-minimization program and studies conditions under which its solution recovers a representation of a noisy point as a linear combination of a few noisy points from the same subspace, showing that a noisy data point close to one of the subspaces is reconstructed with a small error.
Greed is good: algorithmic results for sparse approximation
  • J. Tropp
  • Computer Science
    IEEE Transactions on Information Theory
  • 2004
This article presents new results on using a greedy algorithm, orthogonal matching pursuit (OMP), to solve the sparse approximation problem over redundant dictionaries and develops a sufficient condition under which OMP can identify atoms from an optimal approximation of a nonsparse signal.
Sparse Subspace Clustering by Orthogonal Matching Pursuit
This paper proposes a new subspace clustering method based on orthogonal matching pursuit that is computationally efficient and guaranteed to provide the correct clustering for arbitrary subspaces.
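The self-expressive idea behind SSC-OMP can be sketched in a few lines: each data point is represented via OMP using the other points as the dictionary, and the resulting coefficient matrix should be subspace-preserving (nonzero entries connect only points from the same subspace). This is an illustrative toy, not the paper's implementation; orthogonal coordinate subspaces are used so that subspace-preservation is easy to verify:

```python
import numpy as np

def omp(A, b, k, tol=1e-10):
    """Greedy OMP: select up to k columns of A to represent b."""
    residual, support = b.copy(), []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coef
    return x

rng = np.random.default_rng(1)
# Five unit-norm points in each of two orthogonal 2-dim subspaces of R^4:
# span{e1, e2} and span{e3, e4}.
X = np.hstack([np.vstack([rng.standard_normal((2, 5)), np.zeros((2, 5))]),
               np.vstack([np.zeros((2, 5)), rng.standard_normal((2, 5))])])
X /= np.linalg.norm(X, axis=0)
labels = np.array([0] * 5 + [1] * 5)

# Self-expression: represent each point with OMP over the remaining points.
N = X.shape[1]
C = np.zeros((N, N))
for i in range(N):
    others = np.delete(np.arange(N), i)
    C[others, i] = omp(X[:, others], X[:, i], k=2)

# Subspace-preserving check: column i draws only from i's own subspace.
for i in range(N):
    assert set(labels[np.abs(C[:, i]) > 1e-8]) <= {labels[i]}
```

In the full algorithm, the symmetrized affinity $|C| + |C|^\top$ would then be fed to spectral clustering to obtain the segmentation.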
Sparse Subspace Clustering: Algorithm, Theory, and Applications.
An algorithm to cluster high-dimensional data points that lie in a union of low-dimensional subspaces is proposed and studied, which does not require initialization, can be solved efficiently, and can handle data points near the intersections of subspaces.
Robust classification using structured sparse representation
It is shown that casting the face recognition problem as a structured sparse recovery problem can improve the results of state-of-the-art face recognition algorithms, especially when only a relatively small number of training samples per class is available.
Robust Recovery of Subspace Structures by Low-Rank Representation
It is shown that the convex program associated with LRR solves the subspace clustering problem in the following sense: when the data is clean, LRR exactly recovers the true subspace structures; when the data is contaminated by outliers, it is proved that under certain conditions LRR can exactly recover the row space of the original data.
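For clean data the LRR program $\min_Z \|Z\|_*$ s.t. $X = XZ$ is known to admit the closed-form solution $Z = VV^\top$ (the shape-interaction matrix), where $X = U\Sigma V^\top$ is the skinny SVD. A small NumPy sketch, using orthogonal coordinate subspaces for simplicity so that the block-diagonal structure of $Z$ is exact:

```python
import numpy as np

rng = np.random.default_rng(2)
# Four points in each of two orthogonal 2-dim subspaces of R^4:
# span{e1, e2} and span{e3, e4}.
X = np.hstack([np.vstack([rng.standard_normal((2, 4)), np.zeros((2, 4))]),
               np.vstack([np.zeros((2, 4)), rng.standard_normal((2, 4))])])

# Skinny SVD; keep only the nonzero singular directions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > 1e-10))
V = Vt[:r].T
Z = V @ V.T  # shape-interaction matrix: minimizer of ||Z||_* s.t. X = XZ

# Z is a valid self-representation and, for independent subspaces,
# has no cross-subspace connections.
assert np.allclose(X @ Z, X)
assert np.allclose(Z[:4, 4:], 0)
```

Here $Z = VV^\top$ is the orthogonal projector onto the row space of $X$, which for independent subspaces decomposes block-diagonally, so spectral clustering on $|Z|$ recovers the segmentation.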
Low rank subspace clustering (LRSC)