• Corpus ID: 239616552

# Block based refitting in $\ell_{12}$ sparse regularisation

@inproceedings{Deledalle2019BlockBR,
  title={Block based refitting in $\ell_{12}$ sparse regularisation},
  author={Charles-Alban Deledalle and Nicolas Papadakis and Joseph Salmon and Samuel Vaiter},
  year={2019}
}
• Published 22 October 2019
• Computer Science, Mathematics
In many linear regression problems, including ill-posed inverse problems in image restoration, the data exhibit some sparse structures that can be used to regularize the inversion. To this end, a classical path is to use $\ell_{12}$ block-based regularization. While efficient at retrieving the inherent sparsity patterns of the data – the support – the estimated solutions are known to suffer from a systematic bias. We propose a general framework for removing this artifact by refitting the solution…
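The bias the abstract describes, and its removal by refitting, can be sketched on synthetic data (a minimal illustration with an identity design, not the authors' actual method or experiments; all names and data below are hypothetical). Block soft-thresholding, the proximal operator of the $\ell_{12}$ penalty, shrinks every surviving block toward zero; least squares restricted to the recovered support removes that shrinkage.

```python
import numpy as np

def prox_l12(x, groups, tau):
    """Proximal operator of the l12 (group) penalty: each block of
    coefficients is soft-thresholded as a whole."""
    out = np.zeros_like(x)
    for g in groups:
        norm_g = np.linalg.norm(x[g])
        if norm_g > tau:
            out[g] = (1.0 - tau / norm_g) * x[g]  # uniform shrinkage of the block
    return out

# Synthetic denoising example: one active block, one block of pure noise.
rng = np.random.default_rng(0)
x_true = np.array([3.0, -2.0, 1.5, 0.0, 0.0, 0.0])
y = x_true + 0.1 * rng.standard_normal(6)

groups = [np.arange(0, 3), np.arange(3, 6)]
x_biased = prox_l12(y, groups, tau=1.0)  # support is right, amplitude is shrunken
support = x_biased != 0
x_refit = np.zeros_like(y)
x_refit[support] = y[support]            # least-squares refit on the support
```

Here the thresholded estimate keeps the correct support but systematically underestimates the active block's amplitude; the refit restores it.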

## References

Showing 1–10 of 56 references
Refitting Solutions Promoted by $\ell_{12}$ Sparse Analysis Regularizations with Block Penalties
• Computer Science • SSVM • 2019
This work proposes a new penalty that is well suited for refitting purposes and presents an efficient algorithmic method to obtain the refitted solution along with the original (biased) solution for any convex refitting block penalty.
CLEAR: Covariant LEAst-Square Refitting with Applications to Image Restoration
• Computer Science • SIAM J. Imaging Sci. • 2017
A new framework is proposed to remove parts of the systematic errors affecting popular restoration algorithms, with a special focus on image processing tasks, and an approach re-fitting the results of standard methods towards the input data is developed.
Bias Reduction in Variational Regularization
• Mathematics • Journal of Mathematical Imaging and Vision • 2017
The two-step debiasing method for variational regularization is shown to be well-defined and to optimally reduce bias in a certain setting, with results comparable to the optimal ones obtained with Bregman iterations.
Group sparsity with overlapping partition functions
• Computer Science • 19th European Signal Processing Conference • 2011
Two schemes are developed, one primal and another primal-dual, originating from the non-smooth convex optimization realm, to efficiently solve a wide class of inverse problems regularized using this overlapping group sparsity prior.
Robust Sparse Analysis Regularization
• Mathematics, Computer Science • IEEE Transactions on Information Theory • 2013
This paper gives a sufficient condition to ensure that a signal is the unique solution of the $\ell_1$-analysis regularization in the noiseless case, and introduces a stronger sufficient condition for the robustness of the sign pattern.
Model Consistency of Partly Smooth Regularizers
• Mathematics, Computer Science • IEEE Transactions on Information Theory • 2018
This paper unifies and generalizes a large body of literature where model consistency was known to hold, and shows that under the deterministic model selection conditions, the forward–backward proximal splitting algorithm used to solve the penalized least-square regression problem is guaranteed to identify the model manifold after a finite number of iterations.
On Lasso refitting strategies
• Computer Science, Mathematics • Bernoulli • 2019
This work formalizes the notion of refitting and provides oracle bounds for arbitrary refitting procedures of the Lasso solution; it defines a sign-consistent refitting as any refitting procedure preserving the signs of the first-step Lasso solution, and provides oracle inequalities for such estimators.
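The refitting idea summarized above can be sketched with a plain ISTA Lasso solver followed by ordinary least squares on the recovered support (a hypothetical illustration on synthetic data, not the paper's estimators or bounds):

```python
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Plain ISTA for the Lasso; a minimal reference solver, not optimized."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2      # 1 / Lipschitz constant of the gradient
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        beta = soft_threshold(beta + step * X.T @ (y - X @ beta), step * lam)
    return beta

def refit_on_support(X, y, beta):
    """Simplest refitting: ordinary least squares restricted to the support
    of the first-step Lasso solution (signs are typically preserved when
    the support is correctly recovered)."""
    S = np.flatnonzero(beta)
    out = np.zeros_like(beta)
    if S.size:
        out[S] = np.linalg.lstsq(X[:, S], y, rcond=None)[0]
    return out

# Synthetic sparse regression: the Lasso estimate is shrunken, the refit is not.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 10))
beta_true = np.zeros(10)
beta_true[:2] = [2.0, -3.0]
y = X @ beta_true + 0.1 * rng.standard_normal(50)

beta_lasso = lasso_ista(X, y, lam=20.0)
beta_refit = refit_on_support(X, y, beta_lasso)
```

The refit keeps the support selected by the Lasso but discards its amplitude shrinkage, which is exactly the two-step structure the oracle bounds in the reference analyze.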
A First-Order Primal-Dual Algorithm for Convex Problems with Applications to Imaging
• Mathematics, Computer Science • Journal of Mathematical Imaging and Vision • 2010
A first-order primal-dual algorithm for non-smooth convex optimization problems with known saddle-point structure can achieve O(1/N²) convergence on problems where the primal or the dual objective is uniformly convex, and it can show linear convergence, i.e. O(ω^N) for some ω ∈ (0, 1), on smooth problems.
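A hedged sketch of such a primal-dual iteration, applied here to 1D total-variation denoising as a standard test problem (the operator, step sizes, and data are illustrative assumptions, not the paper's experiments):

```python
import numpy as np

def tv_denoise_pdhg(y, lam, tau=0.45, sigma=0.45, n_iter=300):
    """First-order primal-dual iteration for 1D total-variation denoising,
        min_x  0.5 * ||x - y||^2 + lam * ||D x||_1,
    with D the forward-difference operator (||D||^2 <= 4, so the step
    condition tau * sigma * ||D||^2 < 1 holds for the defaults)."""
    D = lambda v: np.diff(v)                                            # K v
    Dt = lambda z: np.concatenate(([-z[0]], z[:-1] - z[1:], [z[-1]]))   # K^T z
    x = y.copy()
    x_bar = x.copy()
    z = np.zeros(y.size - 1)
    for _ in range(n_iter):
        # dual step: gradient ascent, then projection onto {|z_i| <= lam}
        z = np.clip(z + sigma * D(x_bar), -lam, lam)
        # primal step: prox of the quadratic data-fidelity term
        x_new = (x - tau * Dt(z) + tau * y) / (1.0 + tau)
        x_bar = 2.0 * x_new - x                                         # extrapolation, theta = 1
        x = x_new
    return x

# Noisy piecewise-constant signal: the iteration should lower the objective.
rng = np.random.default_rng(0)
signal = np.repeat([0.0, 1.0, -0.5], 30)
y = signal + 0.2 * rng.standard_normal(signal.size)
x = tv_denoise_pdhg(y, lam=0.5)
```

The dual update uses the prox of the conjugate of lam·‖·‖₁ (a projection onto an ℓ∞ ball) and the primal update the prox of the quadratic data term, which is the alternating structure the reference analyzes.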
How to SAIF-ly Boost Denoising Performance
• Engineering • IEEE Transactions on Image Processing • 2013
Experiments illustrate that the proposed spatially adaptive iterative filtering (SAIF) strategy can significantly relax the base algorithm's sensitivity to its tuning (smoothing) parameters, and effectively boost the performance of several existing denoising filters to generate state-of-the-art results under both simulated and practical conditions.