# Ground Truth Free Denoising by Optimal Transport

```bibtex
@article{Dittmer2020GroundTF,
  title={Ground Truth Free Denoising by Optimal Transport},
  author={S{\"o}ren Dittmer and Carola-Bibiane Sch{\"o}nlieb and Peter Maass},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.01575}
}
```

We present a learned unsupervised denoising method for arbitrary types of data, which we explore on images and one-dimensional signals. Training is based solely on samples of noisy data and examples of noise, which, critically, do not need to come in pairs. We need only the assumption that the noise is additive and independent (although we describe how this can be extended). The method rests on a Wasserstein Generative Adversarial Network setting, which utilizes two critics and one…
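The abstract is truncated before the roles of the two critics are spelled out, so the following is only an illustrative sketch of the core idea it states: under the additive-independence assumption x = y + n, a denoiser can be fit so that the residual x − G(x) is distributed like the unpaired noise samples. The scalar denoiser G(x) = a·x, the helper `w1`, and the grid search are all hypothetical simplifications (the paper trains a generator adversarially, not by grid search); in 1-D the Wasserstein-1 distance between empirical samples is exactly the mean absolute difference of the sorted samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D setting under the paper's additive-independence assumption:
# noisy observations x = y + n, plus an *unpaired* set of noise samples.
y = rng.uniform(-1, 1, 5000)          # clean signal (never used for fitting)
x = y + rng.normal(0, 0.3, 5000)      # noisy observations
noise = rng.normal(0, 0.3, 5000)      # unpaired noise examples

def w1(a, b):
    """Exact 1-D Wasserstein-1 distance between two empirical samples."""
    return np.abs(np.sort(a) - np.sort(b)).mean()

# Hypothetical scalar denoiser G(x) = a * x; choose a so that the residual
# x - G(x) looks like the noise samples (the job a noise critic would do
# in the adversarial setting).
grid = np.linspace(0.0, 1.0, 101)
scores = [w1(x - a * x, noise) for a in grid]
best_a = grid[np.argmin(scores)]
print(best_a)
```

Note that residual matching alone does not pin down a good denoiser (e.g. a ≈ 1 makes the residual vanish entirely); presumably this is why the method needs its second critic.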


## One Citation

Shared Prior Learning of Energy-Based Models for Image Reconstruction

- Computer Science, Engineering · SIAM J. Imaging Sci.
- 2021

We propose a novel learning-based framework for image reconstruction particularly designed for training without ground truth data, which has three major building blocks: energy-based learning, a…

## References

Showing 1–10 of 32 references

Unsupervised Learning with Stein's Unbiased Risk Estimator

- Computer Science, Mathematics · ArXiv
- 2018

It is shown that Stein's Unbiased Risk Estimator (SURE) and its generalizations can be used to train convolutional neural networks (CNNs) for a range of image denoising and recovery problems without any ground truth data.
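The reference above relies on the classical SURE identity for Gaussian noise: for a denoiser f applied to y = x + n with n ~ N(0, σ²I), the quantity ‖f(y) − y‖² − nσ² + 2σ² div f(y) is an unbiased estimate of the true error ‖f(y) − x‖², computable without the clean signal x. A minimal check with a linear shrinkage denoiser f(y) = a·y (whose divergence is simply a per coordinate) — the shrinkage factor and data distribution are illustrative choices, not from the cited work:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.5
n = 100_000
x = rng.uniform(-2, 2, n)              # ground truth (used only to verify SURE)
y = x + rng.normal(0, sigma, n)        # noisy observation

a = 0.8                                 # linear shrinkage denoiser f(y) = a * y
f_y = a * y
divergence = a * n                      # sum over i of d f_i / d y_i for f(y) = a*y

# SURE: unbiased estimate of the per-sample MSE, no ground truth needed.
sure = (np.sum((f_y - y) ** 2) - n * sigma**2 + 2 * sigma**2 * divergence) / n
true_mse = np.mean((f_y - x) ** 2)
print(sure, true_mse)
```

For large n the two printed values agree closely, which is what lets SURE serve as a training loss in place of the supervised MSE.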

Adversarial Regularizers in Inverse Problems

- Computer Science, Mathematics · NeurIPS
- 2018

This work proposes a new framework for applying data-driven approaches to inverse problems, using a neural network as a regularization functional, that can be applied even if only unsupervised training data is available.

AmbientGAN: Generative models from lossy measurements

- Computer Science · ICLR
- 2018

This work considers the task of learning an implicit generative model given only lossy measurements of samples from the distribution of interest, and proposes a new method of training Generative Adversarial Networks (GANs) which is called AmbientGAN.

Improved Training of Wasserstein GANs

- Computer Science, Mathematics · NIPS
- 2017

This work proposes an alternative to clipping weights: penalize the norm of gradient of the critic with respect to its input, which performs better than standard WGAN and enables stable training of a wide variety of GAN architectures with almost no hyperparameter tuning.
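The penalty described above is λ·E[(‖∇f(x̂)‖ − 1)²], evaluated at points x̂ sampled on segments between real and generated samples. A toy illustration with a 1-D quadratic critic f(x) = w·x + c·x², whose gradient w + 2c·x is available in closed form, so no autodiff framework is needed; the critic parameters and the sample distributions are made up for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 1-D critic f(x) = w*x + c*x**2 with analytic gradient
# df/dx = w + 2*c*x, standing in for a neural critic.
w, c, lam = 1.5, 0.3, 10.0

real = rng.normal(1.0, 0.2, 256)       # stand-in for real data
fake = rng.normal(-1.0, 0.2, 256)      # stand-in for generator output

# Interpolate uniformly between real and fake samples, as in the
# gradient-penalty formulation.
eps = rng.uniform(0.0, 1.0, 256)
x_hat = eps * real + (1 - eps) * fake

grad = w + 2 * c * x_hat               # analytic df/dx at the interpolates
penalty = lam * np.mean((np.abs(grad) - 1.0) ** 2)
print(penalty)
```

In an actual WGAN-GP, `grad` would come from automatic differentiation of the critic network, and `penalty` would be added to the critic's loss.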

Noise2Noise: Learning Image Restoration without Clean Data

- Computer Science, Mathematics · ICML
- 2018

It is shown that under certain common circumstances, it is possible to learn to restore signals without ever observing clean ones, at performance close to or equal to that of training with clean exemplars.
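The intuition behind the claim above is that a least-squares fit to targets corrupted by zero-mean noise recovers, in expectation, the same minimizer as a fit to clean targets. A tiny demonstration with a linear model (the line, noise level, and sample count are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
t = rng.uniform(0, 1, n)
clean = 2.0 * t + 1.0                          # underlying signal
noisy_target = clean + rng.normal(0, 0.5, n)   # zero-mean corruption of the *target*

# Least-squares fit of slope and intercept against clean vs. noisy targets.
A = np.column_stack([t, np.ones(n)])
coef_clean, *_ = np.linalg.lstsq(A, clean, rcond=None)
coef_noisy, *_ = np.linalg.lstsq(A, noisy_target, rcond=None)
print(coef_clean, coef_noisy)
```

The two coefficient vectors agree up to sampling noise, mirroring the Noise2Noise observation that noisy targets suffice when the corruption has zero mean.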

Generative Adversarial Nets

- Computer Science · NIPS
- 2014

We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a…

Variational Networks: Connecting Variational Methods and Deep Learning

- Computer Science · GCPR
- 2017

Surprisingly, in numerical experiments on image reconstruction problems it turns out that giving up exact minimization leads to a consistent performance increase, in particular in the case of convex models.

Training Deep Learning based Denoisers without Ground Truth Data

- Computer Science, Mathematics · NeurIPS
- 2018

This work demonstrates that the proposed Stein's Unbiased Risk Estimator (SURE)-based method, using only noisy input data, can train CNN-based denoising networks whose performance is close to that of the original MSE-based deep learning denoisers trained with ground truth data.

Neural Adaptive Image Denoiser

- Computer Science · 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2018

The Neural AIDE algorithm, even with a plain fully connected architecture, is shown to attain competitive denoising performance on benchmark datasets compared to strong baselines, and to robustly correct a mismatched noise level in supervised learning via fine-tuning.

Wasserstein GAN

- Mathematics, Computer Science · ArXiv
- 2017

This paper is concerned with unsupervised learning: what it means to learn a probability distribution, and how to define a parametric family of densities.