Automatic parameter selection for the TGV regularizer in image restoration under Poisson noise

@article{Serafino2022AutomaticPS,
  title={Automatic parameter selection for the TGV regularizer in image restoration under Poisson noise},
  author={Daniela di Serafino and Monica Pragliola},
  journal={arXiv preprint arXiv:2205.13439},
  year={2022}
}
We address the image restoration problem under Poisson noise corruption. The Kullback-Leibler divergence, typically adopted as the data-fidelity term in the variational framework in this case, is coupled with the second-order Total Generalized Variation (TGV^2) regularizer. TGV^2 is known to preserve both smooth and piecewise-constant features in the image; however, its behavior depends on a suitable setting of the parameters arising in its expression. We propose a…
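For orientation, the variational model sketched in the abstract typically takes the following form in standard notation (a sketch of the model class, not the paper's exact formulation):

```latex
% Generalized Kullback--Leibler fidelity coupled with a TGV^2 prior
\min_{u \geq 0} \;\; \mathrm{KL}(Hu + b;\, f) \;+\; \mathrm{TGV}^2_{\alpha_0,\alpha_1}(u),
\qquad
\mathrm{KL}(z;\, f) = \sum_i \Big( z_i - f_i + f_i \log \frac{f_i}{z_i} \Big),

% Second-order TGV (Bredies--Kunisch--Pock): w is an auxiliary vector field,
% \mathcal{E}(w) its symmetrized gradient
\mathrm{TGV}^2_{\alpha_0,\alpha_1}(u)
  = \min_{w} \; \alpha_1 \, \| \nabla u - w \|_1 \;+\; \alpha_0 \, \| \mathcal{E}(w) \|_1 .
```

Here H models blur, b a nonnegative background, and f the observed counts; alpha_0 and alpha_1 are precisely the TGV parameters whose "suitable setting" the paper seeks to automate.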


References

Showing 1–10 of 34 references
Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
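The reference above is the Boyd et al. ADMM monograph. As a concrete illustration, here is a minimal ADMM solver for the lasso problem, one of the worked examples in that monograph (variable names, defaults, and stopping rule are my own simplifications):

```python
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=300):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 with ADMM.

    Splitting: x carries the quadratic term, z carries the l1 term,
    u is the scaled dual variable enforcing x = z.
    """
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    # The x-update is a ridge solve; its system matrix is fixed across iterations.
    AtA = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(AtA, Atb + rho * (z - u))                     # x-update
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)   # soft-threshold
        u = u + x - z                                                     # dual ascent
    return z
```

With `A = I` the lasso solution is the elementwise soft-thresholding of `b`, which gives a quick sanity check of the iteration.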
Image quality assessment: from error visibility to structural similarity
A structural similarity index is developed and its promise is demonstrated through a set of intuitive examples, as well as comparison to both subjective ratings and state-of-the-art objective methods on a database of images compressed with JPEG and JPEG2000.
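The SSIM index from this reference is standard in restoration benchmarks. The sketch below computes a single-window variant over the whole image; note that the original method averages local SSIM values over an 11x11 sliding Gaussian window, so this is a simplification for illustration only:

```python
import numpy as np

def global_ssim(x, y, data_range=255.0):
    """Single-window SSIM (Wang et al., 2004), whole image as one window.

    c1, c2 use the paper's stabilization constants K1=0.01, K2=0.03.
    """
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()          # cross-covariance
    return ((2 * mx * my + c1) * (2 * cxy + c2)) / \
           ((mx**2 + my**2 + c1) * (vx + vy + c2))
```

By construction the index equals 1 for identical images and decreases as structural distortion grows.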
Directional TGV-Based Image Restoration under Poisson Noise
This article focuses on images corrupted by Poisson noise, extending DTGV regularization to restoration models whose data-fitting term is the generalized Kullback–Leibler divergence, and proposes an improved technique for identifying the main texture direction.
Efficient gradient projection methods for edge-preserving removal of Poisson noise (2009)
Nearly Exact Discrepancy Principle for Low-Count Poisson Image Restoration
The effectiveness of variational methods for restoring images corrupted by Poisson noise strongly depends on the suitable selection of the regularization parameter balancing the effect of the…
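To make the discrepancy-principle idea behind this reference concrete: for Poisson data, a classical rule selects the regularization parameter so that twice the KL discrepancy matches its approximate expected value N (the number of pixels). The toy below uses a deliberately crude mean-shrinkage "denoiser" in place of a real TGV-regularized solver, and bisection on the parameter; only the selection rule itself is the point:

```python
import numpy as np

def kl_div(u, f):
    """Generalized Kullback-Leibler divergence KL(u; f) = sum(u - f + f*log(f/u))."""
    f_safe = np.where(f > 0, f, 1.0)
    return np.sum(u - f + np.where(f > 0, f * np.log(f_safe / u), 0.0))

def discrepancy_lambda(f, n_bisect=60):
    """Pick lambda so that 2*KL(u(lambda); f) ~ N (Poisson discrepancy principle).

    u(lambda) = (f + lambda*mean(f)) / (1 + lambda) is a toy shrinkage
    denoiser standing in for an actual variational solver.
    """
    n = f.size
    denoise = lambda lam: (f + lam * f.mean()) / (1.0 + lam)
    lo, hi = 0.0, 1.0
    while 2.0 * kl_div(denoise(hi), f) < n and hi < 1e6:
        hi *= 2.0  # expand until the target discrepancy is bracketed
    for _ in range(n_bisect):
        mid = 0.5 * (lo + hi)
        if 2.0 * kl_div(denoise(mid), f) < n:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The bisection works because the KL discrepancy increases monotonically as the denoised image is pulled away from the data toward the mean.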
Automatic fidelity and regularization terms selection in variational image restoration
TGV-based restoration of Poissonian images with automatic estimation of the regularization parameter
The Total Generalized Variation regularization introduced in [SIAM J. Imag. Sci., 3(3), 492–526, 2010], which has demonstrated its ability to preserve sharp features as well as smooth transitions, is considered, and an automatic strategy for setting the value of the regularization parameter is introduced.
Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach Part I: Methodology and Experiments
A general empirical Bayesian method for setting regularisation parameters in imaging problems that are convex w.r.t. the unknown image. The method uses the same basic operators as proximal optimisation algorithms, namely gradient and proximal operators, and is therefore straightforward to apply to problems currently solved by proximal optimisation techniques.
Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors
The optimization problem behind the MAP estimation is analyzed, and hyperparameter combinations that lead to a globally or locally convex optimization problem are identified.
Hierarchical Bayesian models and sparsity: ℓ2-magic
This work reformulates sparse recovery as an inverse problem in the Bayesian framework, expressing the sparsity belief through a hierarchical prior model. It shows that the maximum a posteriori (MAP) solution computed by a recently proposed iterative alternating sequential (IAS) algorithm converges linearly to the unique minimum for any matrix, and quadratically on the complement of the support of the minimizer.