Refitting Solutions Promoted by $\ell_{12}$ Sparse Analysis Regularizations with Block Penalties

@inproceedings{Deledalle2019RefittingSP,
  title={Refitting Solutions Promoted by $\ell_{12}$ Sparse Analysis Regularizations with Block Penalties},
  author={Charles-Alban Deledalle and Nicolas Papadakis and Joseph Salmon and Samuel Vaiter},
  booktitle={SSVM},
  year={2019}
}
In inverse problems, the use of an $\ell_{12}$ analysis regularizer induces a bias in the estimated solution. We propose a general refitting framework for removing this artifact while keeping the information of interest contained in the biased solution. This is done through refitting block penalties that act only on the co-support of the estimate. Based on an analysis of related works in the literature, we propose a new penalty that is well suited for refitting purposes. We also…
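
To fix ideas, here is a schematic of the setting; the notation below ($\Phi$ the forward operator, $\Gamma$ the analysis operator, blocks indexed by $b$) is assumed for this sketch rather than quoted from the paper. The biased estimate solves

$$\hat{x} \in \arg\min_x \tfrac{1}{2}\|\Phi x - y\|_2^2 + \lambda \sum_b \|(\Gamma x)_b\|_2,$$

and refitting re-estimates the data fit while preserving the structure identified by $\hat{x}$, in the simplest (hard-constrained) form

$$\tilde{x} \in \arg\min_{x \,:\, (\Gamma x)_b = 0 \ \forall b \in \mathcal{I}} \tfrac{1}{2}\|\Phi x - y\|_2^2, \qquad \mathcal{I} = \{b : (\Gamma \hat{x})_b = 0\},$$

where $\mathcal{I}$ is the co-support of $\hat{x}$; the refitting block penalties discussed in the paper refine this hard constraint on the remaining blocks.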

Citations

Block based refitting in $\ell_{12}$ sparse regularisation
In many linear regression problems, including ill-posed inverse problems in image restoration, the data exhibit some sparse structures that can be used to regularize the inversion. To this end, a…
Block-Based Refitting in $\ell_{12}$ Sparse Regularization
This work introduces a new penalty that is well suited for refitting purposes and presents a new algorithm to obtain the refitted solution along with the original (biased) solution for any convex refitting block penalty.

References

Showing 1–10 of 14 references
On Lasso refitting strategies
A well-known drawback of $\ell_1$-penalized estimators is the systematic shrinkage of the large coefficients towards zero. A simple remedy is to treat Lasso as a model-selection procedure and to perform a…
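
As a concrete illustration of that two-step strategy, here is a minimal sketch of "Lasso as model selection, then OLS on the selected support"; the synthetic data and the regularization level alpha=0.1 are illustrative placeholders, not values from the reference.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p, s = 100, 50, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 3.0                                  # sparse ground truth
y = X @ beta + 0.5 * rng.standard_normal(n)

lasso = Lasso(alpha=0.1).fit(X, y)              # step 1: biased, shrunken estimate
support = np.flatnonzero(lasso.coef_)           # treat Lasso as model selection

refit = LinearRegression().fit(X[:, support], y)  # step 2: OLS on the selected support
beta_refit = np.zeros(p)
beta_refit[support] = refit.coef_               # unshrunk coefficients, same support
```

The refitted coefficients keep the support chosen by the Lasso but no longer suffer its systematic shrinkage.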
Bias Reduction in Variational Regularization
The two-step debiasing method for variational regularization is shown to be well defined, to optimally reduce bias in a certain setting, and to give results comparable to those obtained with Bregman iterations.
CLEAR: Covariant LEAst-Square Refitting with Applications to Image Restoration
A new framework is proposed to remove parts of the systematic errors affecting popular restoration algorithms, with a special focus on image processing tasks, along with an approach that refits the results of standard methods towards the input data.
Trust, but verify: benefits and pitfalls of least-squares refitting in high dimensions
Least-squares refitting is widely used in high-dimensional regression to reduce the prediction bias of $\ell_1$-penalized estimators (e.g., Lasso and Square-Root Lasso). We present theoretical and…
Least squares after model selection in high-dimensional sparse models
In this paper we study post-model-selection estimators which apply ordinary least squares (OLS) to the model selected by first-step penalized estimators, typically the Lasso. It is well known that the Lasso…
Exponential Screening and optimal rates of sparse estimation
In high-dimensional linear regression, the goal pursued here is to estimate an unknown regression function using linear combinations of a suitable set of covariates. One of the key assumptions for…
A First-Order Primal-Dual Algorithm for Convex Problems with Applications to Imaging
  • A. Chambolle, T. Pock
  • Journal of Mathematical Imaging and Vision, 2010
A first-order primal-dual algorithm for non-smooth convex optimization problems with known saddle-point structure can achieve $O(1/N^2)$ convergence on problems where the primal or the dual objective is uniformly convex, and it can show linear convergence, i.e. $O(\omega^N)$ for some $\omega \in (0,1)$, on smooth problems.
On Debiasing Restoration Algorithms: Applications to Total-Variation and Nonlocal-Means
The debiasing technique can be used for any locally affine estimator, including $\ell_1$ regularization, anisotropic total variation, and some nonlocal filters.
An Iterative Regularization Method for Total Variation-Based Image Restoration
We introduce a new iterative regularization procedure for inverse problems based on the use of Bregman distances, with particular focus on problems arising in image processing. We are motivated by…
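
To make the "add back the residual" mechanism of these Bregman iterations concrete, here is a minimal sketch for TV denoising; it reuses the hypothetical pdhg_tv_denoise helper from the primal-dual sketch above, and the iteration counts are arbitrary.

```python
import numpy as np

def bregman_tv(f, lam=1.0, n_outer=5):
    """Iterative TV regularization via Bregman updates: each pass denoises the
    data plus the accumulated residual, progressively restoring lost contrast."""
    v = np.zeros_like(f, dtype=float)
    x = f.astype(float).copy()
    for _ in range(n_outer):
        x = pdhg_tv_denoise(f + v, lam=lam)  # inner ROF-type solve (sketched earlier)
        v += f - x                           # Bregman step: add back the residual
    return x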
A Multiscale Image Representation Using Hierarchical (BV, $L^2$) Decompositions
A new multiscale image decomposition which offers a hierarchical, adaptive representation for the different features in general images is proposed, and the questions of convergence, energy decomposition, localization, and adaptivity are discussed.