• Corpus ID: 238744292

# The springback penalty for robust signal recovery

@article{An2021TheSP,
  title={The springback penalty for robust signal recovery},
  author={Congpei An and Hao-Ning Wu and Xiaoming Yuan},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.06754}
}
• Congpei An, Hao-Ning Wu, Xiaoming Yuan
• Published 13 October 2021
• Computer Science, Mathematics
• ArXiv
We propose a new penalty, named the springback penalty, for constructing models to recover an unknown signal from incomplete and inaccurate measurements. Mathematically, the springback penalty is a weakly convex function, and it bears various theoretical and computational advantages of both the benchmark convex ℓ1 penalty and many of its non-convex surrogates that have been well studied in the literature. For the recovery model using the springback penalty, we establish the exact and stable…
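The abstract does not spell out the penalty's form. As a hedged sketch, assuming the springback penalty takes the weakly convex form ‖x‖₁ − (α/2)‖x‖₂² with a parameter α > 0 (the names `springback_penalty` and `alpha` are this sketch's, not the paper's), evaluating it is straightforward:

```python
import numpy as np

def springback_penalty(x, alpha=0.5):
    """Assumed springback penalty: ||x||_1 - (alpha/2) * ||x||_2^2.

    Weakly convex for alpha > 0: adding back (alpha/2)*||x||_2^2
    recovers the convex l1 norm, which is the sense in which the
    penalty sits between l1 and its concave surrogates.
    """
    x = np.asarray(x, dtype=float)
    return np.abs(x).sum() - 0.5 * alpha * (x ** 2).sum()

# alpha = 0 reduces to the plain l1 norm.
x = np.array([1.0, -2.0, 0.0])
print(springback_penalty(x, alpha=0.0))  # 3.0
```

With α = 0 the penalty coincides with ℓ1; larger α subtracts more of the ℓ2² bulk, sharpening the preference for sparse vectors.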
## 1 Citation


Enhanced total variation minimization for stable image reconstruction
• Congpei An, Hao-Ning Wu, Xiaoming Yuan
• Computer Science, Engineering
ArXiv
• 2022
The backward diffusion process from the earlier image-enhancement literature with TV regularization is revisited, and the resulting enhanced TV minimization model is shown to have tighter reconstruction error bounds than various TV-based models when the noise level is significant and the number of measurements is limited.
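As background for the TV-based models this citation compares, the (1-D, anisotropic) total variation of a discrete signal is simply the ℓ1 norm of its finite differences; a minimal sketch:

```python
import numpy as np

def total_variation_1d(x):
    """Anisotropic 1-D total variation: sum_i |x[i+1] - x[i]|.

    Piecewise-constant signals have small TV, which is why TV
    regularization favors blocky structure in reconstruction.
    """
    x = np.asarray(x, dtype=float)
    return np.abs(np.diff(x)).sum()

print(total_variation_1d([0.0, 0.0, 1.0, 1.0]))  # one jump of height 1 -> 1.0
print(total_variation_1d([0.0, 1.0, 0.0, 1.0]))  # oscillation -> 3.0
```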

## References

SHOWING 1-10 OF 41 REFERENCES
Minimization of ℓ1-2 for Compressed Sensing
• Mathematics, Computer Science
SIAM J. Sci. Comput.
• 2015
A sparsity-oriented simulated annealing procedure with non-Gaussian random perturbations is proposed, and the almost-sure convergence of the combined algorithm (DCASA) to a global minimum is proved.
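The ℓ1-2 metric in this reference's title is the difference ‖x‖₁ − ‖x‖₂, which vanishes exactly on 1-sparse vectors; a small sketch of evaluating it:

```python
import numpy as np

def l1_minus_l2(x):
    """The l1-2 sparsity metric ||x||_1 - ||x||_2.

    It is zero on 1-sparse vectors (and at the origin), which is
    why it promotes sparsity more aggressively than l1 alone.
    """
    x = np.asarray(x, dtype=float)
    return np.abs(x).sum() - np.sqrt((x ** 2).sum())

print(l1_minus_l2([3.0, 0.0, 0.0]))  # 0.0 for a 1-sparse vector
print(l1_minus_l2([1.0, 1.0]))       # 2 - sqrt(2), about 0.586
```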
Compressed Sensing Recovery via Nonconvex Shrinkage Penalties
• Mathematics, Computer Science
ArXiv
• 2015
This work proves that given data and a measurement matrix from a broad class of matrices, one can choose parameters for these classes of shrinkages to guarantee exact recovery of the sparsest solution.
Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed ℓq Minimization
• Mathematics, Computer Science
SIAM J. Numer. Anal.
• 2013
This paper starts with a preliminary yet novel analysis of unconstrained ℓq minimization, covering convergence, error bounds, and local convergence behavior, and then extends the algorithm and analysis to the recovery of low-rank matrices.
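As a rough illustration of the IRLS idea (not this paper's exact scheme), smoothed ℓq minimization is typically handled by alternating a ridge-like solve with the weight update wᵢ = (xᵢ² + ε)^(q/2 − 1); the function name and parameter defaults below are this sketch's assumptions:

```python
import numpy as np

def irls_lq(A, y, q=0.5, lam=1e-3, eps=1e-6, iters=50):
    """Sketch of IRLS for min_x 0.5*||Ax - y||^2 + lam * sum_i (x_i^2 + eps)^(q/2).

    Each pass freezes the per-coordinate weights
    w_i = (x_i^2 + eps)^(q/2 - 1), obtained by linearizing the
    smoothed l_q term at the current iterate, and re-solves the
    resulting weighted ridge system.
    """
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        w = (x ** 2 + eps) ** (q / 2.0 - 1.0)
        # Stationarity with frozen weights: (A^T A + lam*q*diag(w)) x = A^T y.
        H = A.T @ A + lam * q * np.diag(w)
        x = np.linalg.solve(H, A.T @ y)
    return x
```

With A the identity the iteration decouples into scalar fixed-point updates: large entries of y are barely shrunk, while zero entries stay exactly zero.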
Restricted isometry properties and nonconvex compressive sensing
• Mathematics
• 2007
The recently emerged field known as compressive sensing has produced powerful results showing the ability to recover sparse signals from surprisingly few linear measurements, using ℓ1 minimization.
A unified approach to model selection and sparse recovery using regularized least squares
• Mathematics
• 2009
Model selection and sparse recovery are two important problems for which many regularization methods have been proposed. We study the properties of regularization methods in both problems under the
Minimization of Transformed L1 Penalty: Closed Form Representation and Iterative Thresholding Algorithms
• Mathematics, Computer Science
ArXiv
• 2014
Here, an explicit fixed-point representation for the TL1-regularized minimization problem is developed, and the proposed TL1 iterative thresholding algorithm with adaptive subcritical and supercritical thresholds consistently performs best in sparse signal recovery with and without measurement noise.
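The transformed ℓ1 (TL1) penalty referenced here is usually written coordinate-wise as ρₐ(t) = (a + 1)|t| / (a + |t|); a minimal sketch of evaluating it, assuming that standard form:

```python
import numpy as np

def tl1_penalty(x, a=1.0):
    """Transformed l1 (TL1) penalty, summed coordinate-wise:
    rho_a(t) = (a + 1) * |t| / (a + |t|).

    The parameter a interpolates between an l0-like count
    (a -> 0) and the l1 norm (a -> infinity).
    """
    x = np.abs(np.asarray(x, dtype=float))
    return ((a + 1.0) * x / (a + x)).sum()

print(tl1_penalty([1.0], a=1.0))  # 2 * 1 / 2 = 1.0
```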
Gradient Projection for Sparse Reconstruction: Application to Compressed Sensing and Other Inverse Problems
• Computer Science
IEEE Journal of Selected Topics in Signal Processing
• 2007
This paper proposes gradient projection algorithms for the bound-constrained quadratic programming (BCQP) formulation of these problems and tests variants of this approach that select the line-search parameters in different ways, including techniques based on the Barzilai-Borwein method.
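The BCQP formulation comes from splitting x = u − v with u, v ≥ 0, so the ℓ1 term becomes linear and the constraint set is the nonnegative orthant. A minimal projected-gradient sketch of that idea (a fixed step size rather than the paper's line searches; names are this sketch's):

```python
import numpy as np

def gpsr_basic(A, y, tau, iters=200):
    """Projected gradient for min 0.5*||A(u-v) - y||^2 + tau*sum(u+v),
    subject to u >= 0, v >= 0, with x = u - v.

    Projection onto the nonnegative orthant is just max(., 0),
    which is what makes the BCQP formulation attractive.
    """
    n = A.shape[1]
    # Conservative fixed step: Lipschitz constant of the joint
    # gradient in (u, v) is 2 * ||A||_2^2.
    step = 1.0 / (2.0 * np.linalg.norm(A, 2) ** 2)
    u = np.zeros(n)
    v = np.zeros(n)
    for _ in range(iters):
        g = A.T @ (A @ (u - v) - y)              # gradient of the quadratic in x
        u = np.maximum(u - step * (g + tau), 0.0)
        v = np.maximum(v - step * (-g + tau), 0.0)
    return u - v
```

With A the identity this reduces to soft-thresholding y at level tau, a quick sanity check on the formulation.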
Stable signal recovery from incomplete and inaccurate measurements
• Physics, Mathematics
• 2005
Suppose we wish to recover a vector x_0 ∈ R^m (e.g., a digital signal or image) from incomplete and contaminated observations y = Ax_0 + e; A is an n by m matrix with far fewer rows than columns (n «
Stable sparse approximations via nonconvex optimization
• Mathematics, Computer Science
2008 IEEE International Conference on Acoustics, Speech and Signal Processing
• 2008
These results indicate that, depending on the restricted isometry constants and the noise level, ℓp minimization with certain values of p < 1 provides better theoretical guarantees in terms of stability and robustness than ℓ1 minimization does.
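The sparsity-promoting effect of p < 1 is easy to see numerically: between a sparse and a spread-out vector of equal ℓ1 norm, the ℓp quasi-norm with p < 1 scores the sparse one lower, while ℓ1 cannot distinguish them. A small sketch:

```python
import numpy as np

def lp_p(x, p):
    """sum_i |x_i|^p -- the l_p 'norm' raised to the p-th power (0 < p <= 1)."""
    return (np.abs(np.asarray(x, dtype=float)) ** p).sum()

sparse = [2.0, 0.0]
spread = [1.0, 1.0]
print(lp_p(sparse, 1.0), lp_p(spread, 1.0))  # 2.0 2.0  (l1 ties)
print(lp_p(sparse, 0.5), lp_p(spread, 0.5))  # about 1.414 vs 2.0
```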
On sparse reconstruction from Fourier and Gaussian measurements
• Mathematics
• 2008
This paper improves upon best-known guarantees for exact reconstruction of a sparse signal f from a small universal sample of Fourier measurements. The method for reconstruction that has recently