Block-Sparse Signals: Uncertainty Relations and Efficient Recovery

@article{Eldar2010BlockSparseSU,
  title={Block-Sparse Signals: Uncertainty Relations and Efficient Recovery},
  author={Yonina C. Eldar and Patrick Kuppinger and Helmut B{\"o}lcskei},
  journal={IEEE Transactions on Signal Processing},
  year={2010},
  volume={58},
  pages={3042--3054}
}
We consider efficient methods for the recovery of block-sparse signals, i.e., sparse signals whose nonzero entries occur in clusters, from an underdetermined system of linear equations. An uncertainty relation for block-sparse signals is derived, based on a block-coherence measure, which we introduce. We then show that a block-version of the orthogonal matching pursuit algorithm recovers block-sparse signals in a bounded number of steps if the block-coherence is sufficiently small. The same…
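The abstract's two central objects, block-coherence and block-OMP, can be sketched in a few lines of NumPy. This is a minimal illustrative sketch assuming consecutive, equal-size blocks of size `d` and a known number of iterations `k`; the function names and interface are my own, not the paper's, and the block-coherence definition here (largest spectral norm of an off-diagonal cross-Gram block, scaled by 1/d) follows the standard formulation.

```python
import numpy as np

def block_coherence(D, d):
    """Block-coherence of dictionary D whose columns form consecutive
    blocks of size d: max over i != j of ||D_i^T D_j||_2 / d."""
    n = D.shape[1] // d
    mu = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            M = D[:, i*d:(i+1)*d].T @ D[:, j*d:(j+1)*d]
            mu = max(mu, np.linalg.norm(M, 2) / d)  # spectral norm of cross-Gram block
    return mu

def block_omp(D, y, d, k):
    """Greedy block-OMP sketch: in each step, select the block whose columns
    correlate most with the residual, then re-fit all selected blocks by
    least squares."""
    n = D.shape[1] // d
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        scores = [np.linalg.norm(D[:, b*d:(b+1)*d].T @ residual) for b in range(n)]
        best = int(np.argmax(scores))
        if best not in support:
            support.append(best)
        cols = np.concatenate([np.arange(b*d, (b+1)*d) for b in support])
        coef, *_ = np.linalg.lstsq(D[:, cols], y, rcond=None)
        x[:] = 0.0
        x[cols] = coef
        residual = y - D @ x
    return x, support
```

With an orthonormal dictionary and a signal supported on a single block, one Block-OMP step already identifies the correct block and fits it exactly, which is the intuition behind the paper's step-count guarantee under small block-coherence.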

A sharp recovery condition for block sparse signals by block orthogonal multi-matching pursuit
TLDR
Making explicit use of the block sparsity of block-sparse signals achieves better recovery performance than ignoring the additional structure and treating the signal as conventionally sparse.
On the benefits of the block-sparsity structure in sparse signal recovery
  • Hwanjoon Kwon, B. Rao
  • Computer Science
    2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2012
TLDR
It is shown that block-sparse signals can reduce the number of measurements required for exact support recovery, by at least a factor of 1/(block size), compared to conventional (scalar-sparse) signals.
Guarantees for ℓ1 Recovery of Block-Sparse Signals
TLDR
A computationally cheap alternative to block-l1 minimization, the non-Euclidean Block Matching Pursuit algorithm is developed, utilizing verifiable conditions on the problem parameters (sensing matrix and the block structure) which guarantee accurate recovery.
Block Subspace Pursuit for Block-Sparse Signal Reconstruction
TLDR
A block algorithm based on Subspace Pursuit, namely Block SP (BSP), is presented, and it is demonstrated that the BSP algorithm outperforms other methods such as SP, mixed ℓ2/ℓ1-norm minimization, and BOMP.
Block sparsity and sampling over a union of subspaces
TLDR
This paper studies a block-sparse model, in which the nonzero coefficients are arranged in blocks, based on which a convex relaxation in the form of a mixed ℓ2/ℓ1 program is proposed, and proves isometry-based equivalence properties for this setting.
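The mixed ℓ2/ℓ1 objective that recurs throughout this citation list sums the Euclidean norms of the individual blocks. A minimal sketch, assuming consecutive, equal-size blocks (the function name is my own):

```python
import numpy as np

def mixed_l2_l1_norm(x, d):
    """Mixed ℓ2/ℓ1 norm of x under consecutive blocks of size d:
    the sum of the Euclidean (ℓ2) norms of the blocks."""
    blocks = np.asarray(x, dtype=float).reshape(-1, d)
    return float(np.sum(np.linalg.norm(blocks, axis=1)))
```

Minimizing this norm promotes solutions with few nonzero blocks, just as the plain ℓ1 norm promotes few nonzero entries.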
Analysis of block coherence based on deterministic matrices
TLDR
The outcome presented lies in the fact that exploitation of block sparsity with certain conditions results in successful recovery for a higher sparsity level than treating the signal as conventionally sparse.
Performance guarantees of signal recovery via block-OMP with thresholding
TLDR
A block version of the orthogonal matching pursuit with thresholding (block-OMPT) algorithm is proposed, which works in a less greedy fashion in order to improve the efficiency of the support estimation in iterations.
Recovery of Block-Sparse Representations from Noisy Observations via Orthogonal Matching Pursuit
TLDR
Analysis of a simple, efficient method for recovering the sparsity pattern of block-sparse signals from noise-corrupted measurements reveals that exploiting block sparsity can improve recovery ability and guarantee recovery at a higher sparsity level.
The high order block RIP condition for signal recovery
TLDR
A high-order sufficient condition based on the block RIP is obtained that guarantees stable recovery of all block-sparse signals in the presence of noise, and robust recovery when signals are not exactly block sparse, via mixed ℓ2/ℓ1 minimization.

References

Showing 1-10 of 56 references
Block sparsity and sampling over a union of subspaces
TLDR
This paper studies a block-sparse model, in which the nonzero coefficients are arranged in blocks, based on which a convex relaxation in the form of a mixed ℓ2/ℓ1 program is proposed, and proves isometry-based equivalence properties for this setting.
Robust Recovery of Signals From a Structured Union of Subspaces
TLDR
This paper develops a general framework for robust and efficient recovery of nonlinear but structured signal models, in which x lies in a union of subspaces, and presents an equivalence condition under which the proposed convex algorithm is guaranteed to recover the original signal.
Stable recovery of sparse overcomplete representations in the presence of noise
TLDR
This paper establishes the possibility of stable recovery under a combination of sufficient sparsity and favorable structure of the overcomplete system and shows that similar stability is also available using the basis and the matching pursuit algorithms.
Average Case Analysis of Multichannel Sparse Recovery Using Convex Relaxation
TLDR
Under a very mild condition on the sparsity and on the dictionary characteristics, it is shown that the probability of recovery failure decays exponentially in the number of channels, demonstrating that most of the time, multichannel sparse recovery is indeed superior to single channel methods.
Reduce and Boost: Recovering Arbitrary Sets of Jointly Sparse Vectors
TLDR
To efficiently find the single sparse vector produced by the last reduction step, this paper suggests an empirical boosting strategy that improves the recovery ability of any given suboptimal method for recovering a sparse vector.
Model-Based Compressive Sensing
TLDR
A model-based CS theory is introduced that parallels the conventional theory, provides concrete guidelines on how to create model-based recovery algorithms with provable performance guarantees, and defines a new class of structured compressible signals along with a new sufficient condition for robust structured compressible signal recovery that is the natural counterpart to the restricted isometry property of conventional CS.
Theoretical Results on Sparse Representations of Multiple-Measurement Vectors
  • Jie Chen, X. Huo
  • Computer Science
    IEEE Transactions on Signal Processing
  • 2006
TLDR
Simulations show that the predictions made by the proved theorems tend to be very conservative; this is consistent with some recent advances in probabilistic analysis based on random matrix theory.
Sampling Theorems for Signals From the Union of Finite-Dimensional Linear Subspaces
TLDR
This paper considers a more general signal model and assumes signals that live on or close to the union of linear subspaces of low dimension, and presents sampling theorems for this model that are in the same spirit as the Nyquist-Shannon sampling theorem in that they connect the number of required samples to certain model parameters.
Uncertainty Relations for Shift-Invariant Analog Signals
  • Yonina C. Eldar
  • Computer Science
    IEEE Transactions on Information Theory
  • 2009
TLDR
This work considers signals that lie in a finitely generated shift-invariant (SI) space, which is rich enough to include many interesting special cases such as multiband signals and splines and develops an uncertainty principle similar in spirit to its finite counterpart.
Sparse solutions to linear inverse problems with multiple measurement vectors
TLDR
This work considers in depth the extension of two classes of algorithms-Matching Pursuit and FOCal Underdetermined System Solver-to the multiple measurement case so that they may be used in applications such as neuromagnetic imaging, where multiple measurement vectors are available, and solutions with a common sparsity structure must be computed.