Exploiting Statistical Dependencies in Sparse Representations for Signal Recovery

@article{Peleg2012ExploitingSD,
  title={Exploiting Statistical Dependencies in Sparse Representations for Signal Recovery},
  author={Tomer Peleg and Yonina C. Eldar and Michael Elad},
  journal={IEEE Transactions on Signal Processing},
  year={2012},
  volume={60},
  pages={2286--2303}
}
Signal modeling lies at the core of numerous signal and image processing applications. A recent approach that has drawn considerable attention is sparse representation modeling, in which the signal is assumed to be generated as a combination of a few atoms from a given dictionary. In this work we consider a Bayesian setting and go beyond the classic assumption of independence between the atoms. The main goal of this paper is to introduce a statistical model that takes such dependencies into… 
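
The synthesis model described above, a signal generated as a combination of a few atoms from a dictionary, can be illustrated with a minimal sketch (my own illustration, not code from the paper): for a unitary dictionary, applying the dictionary transpose to the signal and hard-thresholding recovers the sparse coefficients exactly in the noiseless case, the special setting several of the works below build on.

```python
# Sketch of the sparse synthesis model y = D x with a unitary dictionary.
# The 4x4 dictionary below is a normalized Hadamard matrix (orthonormal),
# so applying D^T to y returns the coefficients exactly when there is no noise.

def matvec(M, v):
    """Multiply matrix M (list of rows) by vector v."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

def hard_threshold(v, tau):
    """Keep only entries with magnitude above tau (the sparsity pattern)."""
    return [x if abs(x) > tau else 0.0 for x in v]

# Normalized 4x4 Hadamard matrix: its columns are orthonormal atoms.
D = [[0.5 * s for s in row] for row in
     [[1, 1, 1, 1], [1, -1, 1, -1], [1, 1, -1, -1], [1, -1, -1, 1]]]

x = [0.0, 3.0, 0.0, -1.5]                             # sparse: 2 active atoms
y = matvec(D, x)                                      # synthesized signal
x_hat = hard_threshold(matvec(transpose(D), y), 0.5)  # recovery

print(x_hat)  # [0.0, 3.0, 0.0, -1.5] -- exact recovery in the noiseless case
```

With an overcomplete dictionary the inverse problem is no longer this simple, which is where the greedy pursuits and Bayesian priors surveyed by the citing papers below come in.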

Citing Papers

Denoising of image patches via sparse representations with learned statistical dependencies
TLDR
This work focuses on the special case of a unitary dictionary and obtains the exact MAP estimate for the sparse representation using an efficient message passing algorithm, and uses a Boltzmann machine to model the sparsity pattern.
On MAP and MMSE estimators for the co-sparse analysis model
A greedy algorithm with learned statistics for sparse signal reconstruction
TLDR
This paper analyzes CAMP, which leads to a new interpretation of the update step as a maximum-a-posteriori (MAP) estimation of the non-zero coefficients at each step, and proposes to leverage this idea, by finding a MAP estimate of the sparse reconstruction problem, in a greedy OMP-like way.
Pattern Coupled Sparse Bayesian Learning for Recovery of Time Varying Sparse Signals
TLDR
A pattern-coupled hierarchical Gaussian prior model is introduced to characterize the statistical dependencies among coefficients, in which a set of hyperparameters are employed to control the sparsity of signal coefficients.
Pattern-Coupled Sparse Bayesian Learning for Recovery of Block-Sparse Signals
TLDR
A new sparse Bayesian learning method for recovery of block-sparse signals with unknown cluster patterns by introducing a pattern-coupled hierarchical Gaussian prior to characterize the pattern dependencies among neighboring coefficients, where a set of hyperparameters are employed to control the sparsity of signal coefficients.
An Adaptive Markov Random Field for Structured Compressive Sensing
TLDR
A novel adaptive Markov random field sparsity prior for CS is proposed, which not only is able to capture a broad range of sparsity structures, but also can adapt to each sparse signal through refining the parameters of the sparsity prior with respect to the compressed measurements.
Sparse molecular image representation
Sparse and Redundant Representation Modeling—What Next?
  • Michael Elad
  • Computer Science
    IEEE Signal Processing Letters
  • 2012
TLDR
The story of sparse and redundant representation modeling and its impact is offered, and ten key future research directions in this field are outlined, with many unanswered questions still remaining.
Two-Dimensional Pattern-Coupled Sparse Bayesian Learning via Generalized Approximate Message Passing
TLDR
A computationally efficient Bayesian inference method is developed, which integrates the generalized approximate message passing technique with the proposed pattern-coupled hierarchical Gaussian prior model, and offers competitive recovery performance for a range of 2D sparse signal recovery and image processing applications.

References

SHOWING 1-10 OF 66 REFERENCES
Denoising of image patches via sparse representations with learned statistical dependencies
TLDR
This work focuses on the special case of a unitary dictionary and obtains the exact MAP estimate for the sparse representation using an efficient message passing algorithm, and uses a Boltzmann machine to model the sparsity pattern.
On MMSE and MAP Denoising Under Sparse Representation Modeling Over a Unitary Dictionary
TLDR
This analysis establishes a worst-case gain-factor between the MAP/MMSE estimation errors and that of the oracle, and derives explicit expressions for the estimation-error for these two estimators.
K-SVD: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation
TLDR
A novel algorithm for adapting dictionaries in order to achieve sparse signal representations, the K-SVD algorithm, an iterative method that alternates between sparse coding of the examples based on the current dictionary and a process of updating the dictionary atoms to better fit the data.
A Plurality of Sparse Representations Is Better Than the Sparsest One Alone
TLDR
It is shown that while the maximum a posteriori probability (MAP) estimator aims to find and use the sparsest representation, the minimum mean-squared-error (MMSE) estimator leads to a fusion of representations to form its result, which is a far more accurate estimation in terms of the expected ℓ2-norm error.
Partially Linear Estimation With Application to Sparse Signal Recovery From Measurement Pairs
TLDR
It is shown that the partially linear minimum mean-square error (PLMMSE) estimator does not require knowing the joint distribution of and in full, but rather only its second-order moments, which renders it of potential interest in various applications.
Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
TLDR
The significance of the results presented in this paper lies in the fact that making explicit use of block-sparsity can provably yield better reconstruction properties than treating the signal as being sparse in the conventional sense, thereby ignoring the additional structure in the problem.
Robust Recovery of Signals From a Structured Union of Subspaces
TLDR
This paper develops a general framework for robust and efficient recovery of nonlinear but structured signal models, in which x lies in a union of subspaces, and presents an equivalence condition under which the proposed convex algorithm is guaranteed to recover the original signal.
Wavelet-domain compressive signal reconstruction using a Hidden Markov Tree model
TLDR
A new algorithm is proposed that enables fast recovery of piecewise smooth signals, a large and useful class of signals whose sparse wavelet expansions feature a distinct "connected tree" structure, and which outperforms the standard compressive recovery algorithms as well as previously proposed wavelet-based recovery algorithms.
Reduce and Boost: Recovering Arbitrary Sets of Jointly Sparse Vectors
TLDR
To efficiently find the single sparse vector produced by the last reduction step, this paper suggests an empirical boosting strategy that improves the recovery ability of any given suboptimal method for recovering a sparse vector.