• Corpus ID: 239768507

# Learning convex regularizers satisfying the variational source condition for inverse problems

@article{Mukherjee2021LearningCR,
title={Learning convex regularizers satisfying the variational source condition for inverse problems},
author={Subhadip Mukherjee and Carola-Bibiane Sch{\"o}nlieb and Martin Burger},
journal={ArXiv},
year={2021},
volume={abs/2110.12520}
}
• Published 24 October 2021
• Computer Science, Engineering
• ArXiv
Variational regularization has remained one of the most successful approaches for reconstruction in imaging inverse problems for several decades. With the emergence and astonishing success of deep learning in recent years, a considerable amount of research has gone into data-driven modeling of the regularizer in the variational setting. Our work extends a recently proposed method, referred to as adversarial convex regularization (ACR), that seeks to learn data-driven convex regularizers via…
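The variational setup the abstract refers to can be sketched as minimizing a data-fidelity term plus a weighted convex regularizer. Below is a minimal, illustrative gradient-descent solver (not the paper's ACR model; the regularizer here is a hand-coded squared norm, and all names are my own):

```python
# Minimal sketch of variational reconstruction:
#   minimize ||A x - y||^2 + lam * R(x),  with the toy convex choice R(x) = ||x||^2.
# Illustrative only -- ACR would replace R with a learned convex network.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def reconstruct(A, y, lam=0.1, step=0.05, iters=500):
    n = len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [ai - yi for ai, yi in zip(matvec(A, x), y)]  # residual A x - y
        # grad of ||A x - y||^2 is 2 A^T r; grad of lam * ||x||^2 is 2 lam x
        At_r = [sum(A[i][j] * r[i] for i in range(len(A))) for j in range(n)]
        x = [xj - step * (2.0 * gj + 2.0 * lam * xj) for xj, gj in zip(x, At_r)]
    return x
```

For `A` equal to the identity this objective has the closed-form minimizer `y / (1 + lam)`, which the iteration converges to; that makes the sketch easy to sanity-check.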

## References

SHOWING 1-10 OF 13 REFERENCES
Adversarial Regularizers in Inverse Problems
• Computer Science, Mathematics
NeurIPS
• 2018
This work proposes a new framework for applying data-driven approaches to inverse problems, using a neural network as a regularization functional, that can be applied even if only unsupervised training data is available.
Total Deep Variation for Linear Inverse Problems
• Computer Science, Mathematics
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
• 2020
This paper proposes a novel learnable general-purpose regularizer exploiting recent architectural design patterns from deep learning and casts the learning problem as a discrete sampled optimal control problem, for which the adjoint state equations and an optimality condition are derived.
Improved Training of Wasserstein GANs
• Computer Science, Mathematics
NIPS
• 2017
This work proposes an alternative to clipping weights: penalize the norm of gradient of the critic with respect to its input, which performs better than standard WGAN and enables stable training of a wide variety of GAN architectures with almost no hyperparameter tuning.
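The gradient penalty described above can be written in closed form for the simplest possible critic, a linear map f(x) = ⟨w, x⟩, whose input gradient is the constant vector w. This toy version (my own illustration; real WGAN-GP implementations penalize autodiff gradients at points interpolated between real and generated samples) shows the shape of the penalty term:

```python
import math

def gradient_penalty(w, lam=10.0):
    # WGAN-GP penalizes (||grad_x f(x)||_2 - 1)^2 to enforce a soft 1-Lipschitz
    # constraint; for the linear critic f(x) = <w, x>, grad_x f = w everywhere,
    # so the penalty depends only on the weight vector's norm.
    grad_norm = math.sqrt(sum(wi * wi for wi in w))
    return lam * (grad_norm - 1.0) ** 2
```

The penalty vanishes exactly when the critic has unit gradient norm, which is the behavior the soft constraint is designed to encourage.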
Convergence rates of convex variational regularization
• Mathematics
• 2004
The aim of this paper is to provide quantitative estimates for the minimizers of non-quadratic regularization problems in terms of the regularization parameter and the noise level, respectively.
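The standard form of these estimates (my paraphrase of the usual statement, not a quote from the paper) measures the error in the Bregman distance of the regularizer. Under the classical source condition, with the parameter choice coupled to the noise level, one obtains a linear rate:

```latex
% Source condition and resulting Bregman-distance rate (standard notation):
% x^\dagger is the true solution, x_\alpha^\delta the regularized minimizer,
% \delta the noise level, and D_R^p the Bregman distance of R at x^\dagger.
\exists\, w:\quad p = A^{*}w \in \partial R(x^{\dagger})
\quad\Longrightarrow\quad
D_R^{p}\bigl(x_{\alpha}^{\delta}, x^{\dagger}\bigr) = \mathcal{O}(\delta)
\quad \text{for } \alpha \sim \delta,
\qquad
D_R^{p}(x, x^{\dagger}) := R(x) - R(x^{\dagger}) - \langle p,\, x - x^{\dagger}\rangle .
```

The variational source condition in the main paper's title is a weaker, inequality-based generalization of the range condition p = A*w shown here.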
Deep Convolutional Neural Network for Inverse Problems in Imaging
• Computer Science, Mathematics
IEEE Transactions on Image Processing
• 2017
The proposed network outperforms total variation-regularized iterative reconstruction for the more realistic phantoms and requires less than a second to reconstruct a 512×512 image on the GPU.
NETT: Solving Inverse Problems with Deep Neural Networks
• Computer Science, Mathematics
ArXiv
• 2018
A complete convergence analysis is established for the proposed NETT (Network Tikhonov) approach to inverse problems, which considers data consistent solutions having small value of a regularizer defined by a trained neural network, and proposes a possible strategy for training the regularizer.
Modern regularization methods for inverse problems
• Computer Science, Mathematics
Acta Numerica
• 2018
The aim of this paper is to provide a reasonably comprehensive overview of this shift towards modern nonlinear regularization methods, including their analysis, applications and issues for future research.
Input Convex Neural Networks
• Computer Science, Mathematics
ICML
• 2017
This paper presents the input convex neural network architecture. These are scalar-valued (potentially deep) neural networks with constraints on the network parameters such that the output of the network is a convex function of (some of) the inputs.
Solving inverse problems using data-driven models
• Computer Science
Acta Numerica
• 2019
This survey paper aims to give an account of some of the main contributions in data-driven inverse problems.
Regularization Methods in Banach Spaces
• Computer Science, Mathematics
Radon Series on Computational and Applied Mathematics
• 2012
This work investigates regularization methods aimed at finding stable approximate solutions for linear and nonlinear operator equations in Banach spaces using general Lp-norms or the BV-norm.