A Fast Approach for Overcomplete Sparse Decomposition Based on Smoothed $\ell ^{0}$ Norm

@article{Mohimani2009AFA,
  title={A Fast Approach for Overcomplete Sparse Decomposition Based on Smoothed $\ell^{0}$ Norm},
  author={G. Hosein Mohimani and Massoud Babaie-Zadeh and Christian Jutten},
  journal={IEEE Transactions on Signal Processing},
  year={2009},
  volume={57},
  pages={289--301}
}
In this paper, a fast algorithm for overcomplete sparse decomposition, called SL0, is proposed. The algorithm is essentially a method for obtaining sparse solutions of underdetermined systems of linear equations, and its applications include underdetermined sparse component analysis (SCA), atomic decomposition on overcomplete dictionaries, compressed sensing, and decoding real field codes. Contrary to previous methods, which usually solve this problem by minimizing the ℓ1 norm using linear…
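The SL0 scheme summarized in the abstract replaces the discontinuous ℓ0 count with a smooth Gaussian surrogate F_σ(s) = Σ exp(−s_i²/σ²) and shrinks σ gradually while staying on the constraint set {s : As = x}. A minimal NumPy sketch of that idea (function name and parameter defaults are illustrative, not the authors' reference code):

```python
import numpy as np

def sl0(A, x, sigma_min=1e-4, sigma_decrease=0.5, mu=2.0, L=3):
    """Minimal SL0-style sketch: ascend the smooth sparsity surrogate
    F_sigma(s) = sum_i exp(-s_i^2 / sigma^2) over {s : A s = x},
    shrinking sigma gradually (graduated non-convexity)."""
    A_pinv = np.linalg.pinv(A)
    s = A_pinv @ x                        # minimum-l2-norm starting point
    sigma = 2.0 * np.max(np.abs(s))
    while sigma > sigma_min:
        for _ in range(L):                # a few gradient steps per sigma
            delta = s * np.exp(-s**2 / sigma**2)
            s = s - mu * delta            # move toward a sparser iterate
            s = s - A_pinv @ (A @ s - x)  # project back onto A s = x
        sigma *= sigma_decrease
    return s
```

For sufficiently sparse solutions and well-conditioned A, the projected gradient steps track the sparsest solution as σ decreases.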
Sparse Recovery using Smoothed $\ell^0$ (SL0): Convergence Analysis
TLDR
The convergence properties of SL0 are studied, and it is shown that under a certain sparsity constraint stated in terms of the Asymmetric Restricted Isometry Property (ARIP), and with a suitable choice of parameters, convergence of SL0 to the sparsest solution is guaranteed.
An Improved Smoothed $\ell^0$ Approximation Algorithm for Sparse Representation
TLDR
An upper bound on the run-time estimation error is given and a reliable stopping criterion is developed, which is helpful in avoiding the problems due to the underlying discontinuities of the l0 cost function.
Non-negative sparse decomposition based on constrained smoothed ℓ0 norm
TLDR
A new approach based on the Constrained Smoothed ℓ0 norm (CSL0) is presented for solving sparse decomposition problems under a non-negativity constraint, and the performance of the new sparse approach is evaluated on both simulated and real data.
New Improved Algorithms for Compressive Sensing Based on $\ell_{p}$ Norm
TLDR
The improved version of the ℓp-RLS algorithm offers better performance than the basic version, although this is achieved at the cost of increased computational effort.
Sparse Signal Recovery Using Iterative Proximal Projection
TLDR
This paper considers minimization of a nonsmooth, nonconvex sparsity-promoting function subject to an error constraint, and applies an alternating-minimization penalty method that yields an iterative proximal-projection approach.
Reconstruction of block-sparse signals by using an l2/p-regularized least-squares algorithm
TLDR
Simulation results are presented which show that for large-size data the proposed algorithm yields improved reconstruction performance and requires a reduced amount of computation relative to several known algorithms.
A Fast Sparse Recovery Algorithm for Compressed Sensing Using Approximate l0 Norm and Modified Newton Method
TLDR
A fast sparse recovery algorithm based on an approximate ℓ0 norm (FAL0) and a modified Newton method, which improves the practicality of compressed sensing theory; it achieves nearly the same accuracy as competing algorithms while recovering signals more efficiently under the same conditions.
Iteratively re-weighted least squares for sparse signal reconstruction from noisy measurements
  • R. Carrillo, K. Barner
  • Mathematics, Computer Science
    2009 43rd Annual Conference on Information Sciences and Systems
  • 2009
TLDR
This paper studies an iterative reweighted least squares (IRLS) approach to finding sparse solutions of underdetermined systems of equations, based on a smooth approximation of the ℓ0 norm; the method is also extended to finding sparse solutions from noisy measurements.
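The IRLS idea behind this entry can be sketched as a sequence of closed-form weighted minimum-ℓ2 problems whose weights come from an ℓ0-like smooth surrogate of the previous iterate. A hedged NumPy sketch for the noiseless case (names, the ε-annealing schedule, and defaults are illustrative, not this paper's exact algorithm):

```python
import numpy as np

def irls_l0(A, x, n_iter=30, eps=1.0, eps_min=1e-8):
    """Reweighted-l2 sketch of IRLS: each pass solves
    min_s sum_i s_i^2 / w_i  subject to  A s = x  in closed form,
    s = W A^T (A W A^T)^{-1} x with W = diag(w), using l0-style
    weights w_i = s_i^2 + eps and annealing eps toward zero."""
    s = np.linalg.pinv(A) @ x            # minimum-norm initialization
    for _ in range(n_iter):
        w = s**2 + eps                   # diagonal of W
        AW = A * w                       # A @ diag(w), via broadcasting
        lam = np.linalg.solve(AW @ A.T, x)
        s = w * (A.T @ lam)              # weighted min-l2 solution
        eps = max(eps * 0.1, eps_min)    # shrink the smoothing parameter
    return s
```

The eps floor keeps A W Aᵀ invertible once most weights have collapsed toward zero.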
Sparse Decomposition over non-full-rank dictionaries
TLDR
This paper considers non-full-rank dictionaries (which are not even necessarily overcomplete), extends the definition of SD over these dictionaries, and presents an approach that makes it possible to apply previously developed SD algorithms to the non-full-rank case.
Fast sparse recovery via non-convex optimization
  • Laming Chen, Yuantao Gu
  • Mathematics, Computer Science
    2015 IEEE Global Conference on Signal and Information Processing (GlobalSIP)
  • 2015
TLDR
Simulation results verify theoretical rate of convergence, and demonstrate that the algorithm outperforms its convex counterpart in various aspects including more nonzero entries allowed, less running time required, and better denoising performance exhibited.

References

SHOWING 1-10 OF 64 REFERENCES
Fast Sparse Representation Based on Smoothed l0 Norm
TLDR
It is experimentally shown that the proposed algorithm for SCA and atomic decomposition on overcomplete dictionaries is about two orders of magnitude faster than the state-of-the-art ℓ1-magic solver, while providing the same (or better) accuracy.
Maximal Sparsity Representation via ℓ1 Minimization
Finding a sparse representation of signals is desired in many applications. For a representation dictionary D and a given signal S ∈ span{D}, we are interested in finding the sparsest vector such that…
Stable recovery of sparse overcomplete representations in the presence of noise
TLDR
This paper establishes the possibility of stable recovery under a combination of sufficient sparsity and favorable structure of the overcomplete system, and shows that similar stability is also available using the basis pursuit and matching pursuit algorithms.
An Iterative Thresholding Algorithm for Linear Inverse Problems with a Sparsity Constraint
We consider linear inverse problems where the solution is assumed to have a sparse expansion on an arbitrary preassigned orthonormal basis. We prove that replacing the usual quadratic regularizing…
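In the finite-dimensional case, the iterative thresholding scheme of this reference reduces to soft-thresholded Landweber iterations, commonly known as ISTA. A minimal NumPy sketch under that simplification (names and parameters are illustrative):

```python
import numpy as np

def ista(A, y, lam=0.1, n_iter=500):
    """Iterative soft-thresholding for
    min_s 0.5*||A s - y||^2 + lam*||s||_1:
    a gradient step on the quadratic term, then the soft-threshold
    (proximal) step that promotes sparsity."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L, L >= ||A^T A||_2
    s = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = s - step * (A.T @ (A @ s - y))     # Landweber gradient step
        s = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)
    return s
```

When A is orthonormal (the paper's basis setting), a single iteration already lands on the soft-thresholded coefficients of y.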
Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
  • D. Donoho, Michael Elad
  • Computer Science, Medicine
    Proceedings of the National Academy of Sciences of the United States of America
  • 2003
TLDR
This article obtains parallel results in a more general setting, where the dictionary D can arise from two or several bases, frames, or even less structured systems, and sketches three applications: separating linear features from planar ones in 3D data, noncooperative multiuser encoding, and identification of over-complete independent component models.
Sparse signal reconstruction from limited data using FOCUSS: a re-weighted minimum norm algorithm
TLDR
A view of the algorithm as a novel optimization method combining desirable characteristics of both classical optimization and learning-based algorithms is provided, and mathematical results on conditions for uniqueness of sparse solutions are also given.
For most large underdetermined systems of linear equations the minimal 1-norm solution is also the sparsest solution
We consider linear equations y = Φx where y is a given vector in ℝⁿ and Φ is a given n × m matrix with n < m, and we wish to solve for x. There is a constant ρ > 0 so that for large n, and for all Φ's except a negligible fraction, the following property holds…
Analysis of Sparse Representation and Blind Source Separation
TLDR
The recoverability analysis shows that this two-stage cluster-then-ℓ1-optimization approach to sparse representation of a data matrix can handle situations in which the sources overlap to some degree in the analyzed BSS problem.
Atomic Decomposition by Basis Pursuit
TLDR
Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions.
A Coding Theory Approach to Noisy Compressive Sensing Using Low Density Frames
TLDR
This work explicitly constructs a class of measurement matrices inspired by coding theory, referred to as low-density frames, and develops decoding algorithms that produce an accurate estimate x̂ even in the presence of additive noise, implemented with O(Md_v²) complexity.