Sparse Recovery via Partial Regularization: Models, Theory and Algorithms

@article{Lu2015SparseRV,
  title   = {Sparse Recovery via Partial Regularization: Models, Theory and Algorithms},
  author  = {Zhaosong Lu and Xiaorui Li},
  journal = {ArXiv},
  year    = {2015},
  volume  = {abs/1511.07293}
}
In the context of sparse recovery, it is known that most existing regularizers, such as $\ell_1$, suffer from a bias incurred by the leading entries (in magnitude) of the associated vector. To neutralize this bias, we propose a class of models with partial regularizers for recovering a sparse solution of a linear system. We show that every local minimizer of these models is sufficiently sparse, or the magnitude of all its nonzero entries is above a uniform constant depending only on the…
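As a rough illustration of the idea in the abstract, the sketch below computes a partial $\ell_1$ regularizer that leaves the $r$ largest-magnitude entries unpenalized, which is what removes the bias that full $\ell_1$ places on large coefficients. The function names and the least-squares objective form are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def partial_l1(x, r):
    """Partial l1 regularizer: sum |x_i| over all but the r largest-magnitude
    entries. With r = 0 this reduces to the ordinary l1 norm."""
    mags = np.sort(np.abs(np.asarray(x, dtype=float)))  # ascending order
    return float(mags[: max(len(mags) - r, 0)].sum())

def penalized_objective(A, b, x, r, lam):
    """Illustrative objective: least-squares data fit plus a partial l1
    penalty weighted by lam (an assumed form, not the paper's)."""
    residual = A @ x - b
    return 0.5 * float(residual @ residual) + lam * partial_l1(x, r)
```

For example, `partial_l1([3, -1, 0.5], r=1)` skips the leading entry `3` and returns `1.5`, whereas the full $\ell_1$ norm of the same vector is `4.5`.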

Citations

Publications citing this paper.
Showing 1–8 of 8 citations.

Logistic Regression Confined by Cardinality-Constrained Sample and Feature Selection.

• IEEE transactions on pattern analysis and machine intelligence
• 2019

Efficient Recovery of Low-Rank Matrix via Double Nonconvex Nonsmooth Rank Minimization

Hengmin Zhang, +3 authors Jian Yang
• IEEE Transactions on Neural Networks and Learning Systems
• 2019


• ArXiv
• 2018

Nonconvex penalties with analytical solutions for one-bit compressive sensing

• Signal Processing
• 2018

A successive difference-of-convex approximation method for a class of nonconvex nonsmooth optimization problems

• Math. Program.
• 2017

Robust Signal Recovery With Highly Coherent Measurement Matrices

• IEEE Signal Processing Letters
• 2017

Truncated ℓ1-2 Models for Sparse Recovery and Rank Minimization

• SIAM J. Imaging Sciences
• 2017

References

Publications referenced by this paper.
Showing 1–10 of 43 references.

Analysis of multi-stage convex relaxation for sparse regularization

T. Zhang
• J. Mach. Learn. Res., 11:1081–1107
• 2010

Nearly unbiased variable selection under minimax concave penalty


Enhancing Sparsity by Reweighted ℓ1 Minimization


Use of the Zero-Norm with Linear Models and Kernel Methods

• J. Mach. Learn. Res.
• 2003

Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties

• 2001

Sharp RIP Bound for Sparse Signal and Low-Rank Matrix Recovery

• ArXiv
• 2013

Sparse Representation of a Polytope and Recovery of Sparse Signals and Low-Rank Matrices

• IEEE Transactions on Information Theory
• 2013

Decoding by linear programming

• IEEE Transactions on Information Theory
• 2005

Regression Shrinkage and Selection via the Lasso


Nonlinear analysis, differential equations and control

A. J. Hoffman
• J. Res. Nat. Bur. Stand., 49(4):263–265
• 1952