Double Least Squares Pursuit for Sparse Decomposition

@inproceedings{Li2012DoubleLS,
  title={Double Least Squares Pursuit for Sparse Decomposition},
  author={Wanyi Li and Peng Wang and Hong Qiao},
  booktitle={Intelligent Information Processing},
  year={2012}
}
Sparse decomposition has been widely used in numerous applications, such as image processing, pattern recognition, remote sensing, and computational biology. Although many theoretical developments have been proposed, designing, implementing, and analyzing novel fast sparse approximation algorithms remains an open problem. In this paper, a new pursuit algorithm, Double Least Squares Pursuit (DLSP), is proposed for sparse decomposition. In this algorithm, the support of the solution is obtained…
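The abstract is cut off before the DLSP procedure itself is described, so the sketch below illustrates greedy sparse decomposition in general rather than the authors' DLSP. It is a minimal orthogonal matching pursuit (OMP) in pure Python: at each step the atom most correlated with the residual is selected, and the coefficients over the selected support are refit by least squares (via the normal equations). All names and the toy dictionary are illustrative assumptions, not taken from the paper.

```python
# Minimal orthogonal matching pursuit (OMP) sketch in pure Python.
# Illustrative only: this is classic OMP, NOT the paper's DLSP algorithm.

def solve_normal_eqs(cols, b):
    """Least-squares fit b ~ sum_j x_j * cols[j] via the normal equations
    G x = A_S^T b, solved with Gaussian elimination (fine for tiny examples)."""
    k = len(cols)
    G = [[sum(u[i] * v[i] for i in range(len(b))) for v in cols] for u in cols]
    rhs = [sum(u[i] * b[i] for i in range(len(b))) for u in cols]
    # Forward elimination with partial pivoting.
    for p in range(k):
        piv = max(range(p, k), key=lambda r: abs(G[r][p]))
        G[p], G[piv] = G[piv], G[p]
        rhs[p], rhs[piv] = rhs[piv], rhs[p]
        for r in range(p + 1, k):
            f = G[r][p] / G[p][p]
            for c in range(p, k):
                G[r][c] -= f * G[p][c]
            rhs[r] -= f * rhs[p]
    # Back substitution.
    x = [0.0] * k
    for p in range(k - 1, -1, -1):
        x[p] = (rhs[p] - sum(G[p][c] * x[c] for c in range(p + 1, k))) / G[p][p]
    return x

def omp(atoms, b, sparsity):
    """Greedily select `sparsity` atoms, refitting coefficients each step."""
    support, residual = [], list(b)
    for _ in range(sparsity):
        # Pick the unselected atom most correlated with the residual.
        best = max((j for j in range(len(atoms)) if j not in support),
                   key=lambda j: abs(sum(atoms[j][i] * residual[i]
                                         for i in range(len(b)))))
        support.append(best)
        coeffs = solve_normal_eqs([atoms[j] for j in support], b)
        # Update residual r = b - A_S x.
        residual = [b[i] - sum(c * atoms[j][i]
                               for c, j in zip(coeffs, support))
                    for i in range(len(b))]
    return support, coeffs

# Toy dictionary of 4 atoms in R^3; b is an exact 2-sparse combination,
# so OMP recovers the support {0, 2} and coefficients (2, 1) exactly.
atoms = [[1.0, 0.0, 0.0],
         [0.0, 1.0, 0.0],
         [0.0, 0.0, 1.0],
         [0.6, 0.8, 0.0]]
b = [2.0, 0.0, 1.0]  # b = 2*atoms[0] + 1*atoms[2]
support, coeffs = omp(atoms, b, sparsity=2)
```

Greedy pursuits of this family differ mainly in how the support is chosen and how often the least-squares refit is performed; the paper's DLSP presumably varies these steps, but the truncated abstract does not say how.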