# Exploiting Statistical Dependencies in Sparse Representations for Signal Recovery

@article{Peleg2012ExploitingSD,
  title   = {Exploiting Statistical Dependencies in Sparse Representations for Signal Recovery},
  author  = {Tomer Peleg and Yonina C. Eldar and Michael Elad},
  journal = {IEEE Transactions on Signal Processing},
  year    = {2012},
  volume  = {60},
  pages   = {2286-2303}
}

Signal modeling lies at the core of numerous signal and image processing applications. A recent approach that has drawn considerable attention is sparse representation modeling, in which the signal is assumed to be generated as a combination of a few atoms from a given dictionary. In this work we consider a Bayesian setting and go beyond the classic assumption of independence between the atoms. The main goal of this paper is to introduce a statistical model that takes such dependencies into…
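As a rough illustration of the synthesis model this line of work builds on, here is a minimal sketch of generating a signal from a few dictionary atoms; all names and dimensions are illustrative, and it uses the classic independent-support assumption that the paper moves beyond:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 20, 50, 3              # signal length, dictionary size, sparsity
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)   # dictionary with unit-norm atoms

# Classic i.i.d. support: k atoms chosen with no dependence between them.
# The paper's contribution is a prior that models dependencies between
# the entries of this support instead of assuming independence.
support = rng.choice(m, size=k, replace=False)
x = np.zeros(m)
x[support] = rng.standard_normal(k)

y = D @ x + 0.01 * rng.standard_normal(n)   # noisy observation
```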

## 130 Citations

Denoising of image patches via sparse representations with learned statistical dependencies

- Computer Science2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2011

This work focuses on the special case of a unitary dictionary and obtains the exact MAP estimate for the sparse representation using an efficient message passing algorithm, and uses a Boltzmann machine to model the sparsity pattern.

On MAP and MMSE estimators for the co-sparse analysis model

- Computer ScienceDigit. Signal Process.
- 2014

A greedy algorithm with learned statistics for sparse signal reconstruction

- Computer Science2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2017

This paper analyzes CAMP, which leads to a new interpretation of the update step as a maximum-a-posteriori (MAP) estimation of the non-zero coefficients at each step, and proposes to leverage this idea, by finding a MAP estimate of the sparse reconstruction problem, in a greedy OMP-like way.
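The greedy, OMP-like procedure described above follows the structure of plain orthogonal matching pursuit. A minimal sketch of plain OMP, without the learned statistics and with illustrative names, might look like this:

```python
import numpy as np

def omp(D, y, k):
    """Plain orthogonal matching pursuit: greedily add the atom most
    correlated with the current residual, then re-fit all selected
    coefficients jointly by least squares."""
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x_hat = np.zeros(D.shape[1])
    x_hat[support] = coef
    return x_hat
```

A MAP-informed variant in the spirit of the cited paper would replace the plain correlation rule with a posterior-based selection score at each step.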

Pattern Coupled Sparse Bayesian Learning for Recovery of Time Varying Sparse Signals

- Computer Science
- 2018

A pattern-coupled hierarchical Gaussian prior model is introduced to characterize the statistical dependencies among coefficients, in which a set of hyperparameters is employed to control the sparsity of signal coefficients.

Pattern-Coupled Sparse Bayesian Learning for Recovery of Block-Sparse Signals

- Computer ScienceIEEE Transactions on Signal Processing
- 2015

A new sparse Bayesian learning method for recovery of block-sparse signals with unknown cluster patterns by introducing a pattern-coupled hierarchical Gaussian prior to characterize the pattern dependencies among neighboring coefficients, where a set of hyperparameters are employed to control the sparsity of signal coefficients.

An Adaptive Markov Random Field for Structured Compressive Sensing

- Computer Science, EngineeringIEEE Transactions on Image Processing
- 2019

A novel adaptive Markov random field sparsity prior for CS is proposed, which not only is able to capture a broad range of sparsity structures, but also can adapt to each sparse signal by refining the parameters of the sparsity prior with respect to the compressed measurements.

Sparse and Redundant Representation Modeling—What Next?

- Computer ScienceIEEE Signal Processing Letters
- 2012

The story of sparse and redundant representation modeling and its impact is offered, and ten key future research directions in this field are outlined, with many unanswered questions still remaining.

Two-Dimensional Pattern-Coupled Sparse Bayesian Learning via Generalized Approximate Message Passing

- Computer ScienceIEEE Transactions on Image Processing
- 2016

A computationally efficient Bayesian inference method is developed, which integrates the generalized approximate message passing technique with the proposed pattern-coupled hierarchical Gaussian prior model, and offers competitive recovery performance for a range of 2D sparse signal recovery and image processing applications.

## References

Denoising of image patches via sparse representations with learned statistical dependencies

- Computer Science2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2011

This work focuses on the special case of a unitary dictionary and obtains the exact MAP estimate for the sparse representation using an efficient message passing algorithm, and uses a Boltzmann machine to model the sparsity pattern.

On MMSE and MAP Denoising Under Sparse Representation Modeling Over a Unitary Dictionary

- Computer ScienceIEEE Transactions on Signal Processing
- 2011

This analysis establishes a worst-case gain-factor between the MAP/MMSE estimation errors and that of the oracle, and derives explicit expressions for the estimation-error for these two estimators.

K-SVD: An Algorithm for Designing of Overcomplete Dictionaries for Sparse Representation

- Computer Science
- 2005

A novel algorithm for adapting dictionaries in order to achieve sparse signal representations, K-SVD, an iterative method that alternates between sparse coding of the examples based on the current dictionary, and a process of updating the dictionary atoms to better fit the data.

K-SVD: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation

- Computer ScienceIEEE Transactions on Signal Processing
- 2006

A novel algorithm for adapting dictionaries in order to achieve sparse signal representations, the K-SVD algorithm, an iterative method that alternates between sparse coding of the examples based on the current dictionary and a process of updating the dictionary atoms to better fit the data.
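The dictionary-update half of the alternation described above can be sketched roughly as follows; this is a simplified illustration, not the authors' implementation, and all names and dimensions are made up:

```python
import numpy as np

def ksvd_dictionary_update(D, X, Y):
    """One K-SVD sweep: for each atom, refit the atom and its nonzero
    coefficients jointly via a rank-1 SVD of the residual restricted to
    the training signals that currently use that atom."""
    for j in range(D.shape[1]):
        users = np.nonzero(X[j])[0]          # signals using atom j
        if users.size == 0:
            continue
        X[j, users] = 0.0
        E = Y[:, users] - D @ X[:, users]    # residual without atom j
        U, s, Vt = np.linalg.svd(E, full_matrices=False)
        D[:, j] = U[:, 0]                    # best rank-1 refit of atom j
        X[j, users] = s[0] * Vt[0]
    return D, X
```

Full K-SVD alternates this sweep with a sparse-coding stage (e.g. OMP) that recomputes X against the updated dictionary; because each rank-1 refit is optimal for its atom, a sweep never increases the representation error.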

A Plurality of Sparse Representations Is Better Than the Sparsest One Alone

- Computer ScienceIEEE Transactions on Information Theory
- 2009

It is shown that while the maximum a posteriori probability (MAP) estimator aims to find and use the sparsest representation, the minimum mean-squared-error (MMSE) estimator leads to a fusion of representations to form its result, which is a far more accurate estimation in terms of the expected ℓ2-norm error.

Partially Linear Estimation With Application to Sparse Signal Recovery From Measurement Pairs

- Computer ScienceIEEE Transactions on Signal Processing
- 2012

It is shown that the partially linear minimum mean-square error (PLMMSE) estimator does not require full knowledge of the joint distribution of the two signals, but rather only its second-order moments, which renders it of potential interest in various applications.

Block-Sparse Signals: Uncertainty Relations and Efficient Recovery

- Computer ScienceIEEE Transactions on Signal Processing
- 2010

The significance of the results presented in this paper lies in the fact that making explicit use of block-sparsity can provably yield better reconstruction properties than treating the signal as being sparse in the conventional sense, thereby ignoring the additional structure in the problem.

Robust Recovery of Signals From a Structured Union of Subspaces

- Computer Science, MathematicsIEEE Transactions on Information Theory
- 2009

This paper develops a general framework for robust and efficient recovery of nonlinear but structured signal models, in which x lies in a union of subspaces, and presents an equivalence condition under which the proposed convex algorithm is guaranteed to recover the original signal.

Wavelet-domain compressive signal reconstruction using a Hidden Markov Tree model

- Computer Science2008 IEEE International Conference on Acoustics, Speech and Signal Processing
- 2008

A new algorithm is proposed that enables fast recovery of piecewise smooth signals, a large and useful class of signals whose sparse wavelet expansions feature a distinct "connected tree" structure, and which outperforms the standard compressive recovery algorithms as well as previously proposed wavelet-based recovery algorithms.

Reduce and Boost: Recovering Arbitrary Sets of Jointly Sparse Vectors

- Computer ScienceIEEE Transactions on Signal Processing
- 2008

To efficiently find the single sparse vector produced by the last reduction step, this paper suggests an empirical boosting strategy that improves the recovery ability of any given suboptimal method for recovering a sparse vector.