Corpus ID: 237940257

Wasserstein Patch Prior for Image Superresolution

@article{Hertrich2021WassersteinPP,
  title={Wasserstein Patch Prior for Image Superresolution},
  author={Johannes Hertrich and Antoine Houdard and Claudia Redenbach},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.12880}
}
In this paper, we introduce a Wasserstein patch prior for superresolution of two- and three-dimensional images. Here, we assume that, in addition to the low-resolution observation, we are given a reference image which has a similar patch distribution as the ground truth of the reconstruction. This assumption is fulfilled, e.g., when working with texture images or material data. Then, the proposed regularizer penalizes the W2-distance of the patch distribution of the reconstruction to the patch… 
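The regularizer described above can be illustrated numerically. The following numpy sketch compares the empirical patch distributions of two images via a sliced approximation of the Wasserstein-2 distance (random 1D projections, where optimal transport reduces to quantile matching). All function names, the patch size, the stride, and the sliced approximation itself are illustrative choices for this sketch, not the paper's implementation, which uses the full W2-distance.

```python
import numpy as np

def extract_patches(img, size=4, stride=2):
    """Collect all size-by-size patches of a 2D image as flat vectors."""
    h, w = img.shape
    return np.stack([
        img[i:i + size, j:j + size].ravel()
        for i in range(0, h - size + 1, stride)
        for j in range(0, w - size + 1, stride)
    ])

def sliced_w2(x, y, n_proj=64, rng=None):
    """Sliced approximation of the W2 distance between two empirical
    patch distributions (rows of x and y are flattened patches)."""
    rng = np.random.default_rng(rng)
    q = np.linspace(0.0, 1.0, 100)  # common quantile grid
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=x.shape[1])
        theta /= np.linalg.norm(theta)
        # project both point clouds onto a random direction; in 1D the
        # optimal transport plan simply matches quantiles to quantiles
        qx = np.quantile(x @ theta, q)
        qy = np.quantile(y @ theta, q)
        total += np.mean((qx - qy) ** 2)
    return np.sqrt(total / n_proj)
```

In a reconstruction, a distance like this (or its exact counterpart) would be evaluated between the patches of the current iterate and the patches of the reference image, and added to the data-fidelity term.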


WPPNets: Unsupervised CNN Training with Wasserstein Patch Priors for Image Superresolution
TLDR
WPPNets are introduced: CNNs trained with a new unsupervised loss function for image superresolution of material microstructures, which enables their use in real-world applications where neither a large database of registered data nor the exact forward operator is given.
WPPNets and WPPFlows: The Power of Wasserstein Patch Priors for Superresolution
TLDR
This paper proposes to learn two kinds of neural networks in an unsupervised way based on WPP loss functions, and shows how convolutional neural networks (CNNs) can be incorporated.
PatchNR: Learning from Small Data by Patch Normalizing Flow Regularization
TLDR
By investigating the distribution of patches versus that of the whole image class, it is proved that the variational model is indeed a MAP approach, and the model can be generalized to conditional patchNRs if additional supervised information is available.
Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint
TLDR
This paper considers stochastic normalizing flows from a Markov chain point of view, replacing transition densities by general Markov kernels and establishing proofs via Radon-Nikodym derivatives, which makes it possible to incorporate distributions without densities in a sound way.
Generalized Normalizing Flows via Markov Chains
TLDR
This chapter considers stochastic normalizing flows as a pair of Markov chains fulfilling some properties and shows how many state-of-the-art models for data generation fit into this framework.

References

SHOWING 1-10 OF 51 REFERENCES
Wasserstein Generative Models for Patch-based Texture Synthesis
TLDR
A framework to train a generative model for texture image synthesis from a single example, exploiting the local representation of images via the space of patches, that is, square sub-images of fixed size, to learn a fully convolutional network for texture generation.
Accelerating GMM-based patch priors for image restoration: Three ingredients for a 100× speed-up
TLDR
The resulting algorithm, which is called the fast-EPLL (FEPLL), attains a dramatic speed-up of two orders of magnitude over EPLL while incurring a negligible drop in the restored image quality (less than 0.5 dB).
From learning models of natural image patches to whole image restoration
TLDR
A generic framework is proposed which allows for whole-image restoration using any patch-based prior for which a MAP (or approximate MAP) estimate can be calculated, and a generic, surprisingly simple Gaussian mixture prior, learned from a set of natural images, is presented.
RAISR: Rapid and Accurate Image Super Resolution
TLDR
This work illustrates how this effective sharpening algorithm, in addition to being of independent interest, can be used as a preprocessing step to induce the learning of more effective upscaling filters with built-in sharpening and contrast enhancement effect.
A non-local algorithm for image denoising
  • A. Buades, B. Coll, J. Morel
  • Computer Science
    2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05)
  • 2005
TLDR
A new measure, the method noise, is proposed, to evaluate and compare the performance of digital image denoising methods, and a new algorithm, the nonlocal means (NL-means), based on a nonlocal averaging of all pixels in the image is proposed.
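The nonlocal averaging idea summarized above can be sketched compactly: each pixel is replaced by a weighted average of other pixels, with weights derived from the similarity of the patches surrounding them. The following numpy sketch is a minimal, unoptimized toy version (function name, patch size, search window, and filtering parameter h are illustrative choices, not the authors' reference implementation, which averages over all pixels rather than a small search window).

```python
import numpy as np

def nl_means(img, patch=3, search=7, h=0.1):
    """Toy non-local means: each pixel becomes a weighted average of
    nearby pixels, weighted by the similarity of surrounding patches."""
    pad = patch // 2
    padded = np.pad(img, pad, mode="reflect")
    rows, cols = img.shape
    out = np.zeros_like(img, dtype=float)
    half = search // 2
    for i in range(rows):
        for j in range(cols):
            ref = padded[i:i + patch, j:j + patch]  # patch around (i, j)
            weights, values = [], []
            for di in range(-half, half + 1):
                for dj in range(-half, half + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        cand = padded[ii:ii + patch, jj:jj + patch]
                        dist2 = np.mean((ref - cand) ** 2)
                        weights.append(np.exp(-dist2 / h ** 2))
                        values.append(img[ii, jj])
            w = np.asarray(weights)
            out[i, j] = np.dot(w, values) / w.sum()
    return out
```

Because similar patches receive large weights regardless of spatial distance within the search window, repetitive structures (edges, textures) are averaged with their own copies, which is what distinguishes the method from purely local smoothing.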
Deep Learning for Image Super-Resolution: A Survey
TLDR
A survey on recent advances of image super-resolution techniques using deep learning approaches in a systematic way, which can roughly group the existing studies of SR techniques into three major categories: supervised SR, unsupervised SR, and domain-specific SR.
Optimal Patch Assignment for Statistically Constrained Texture Synthesis
TLDR
A new model for patch-based texture synthesis that controls the distribution of patches in the synthesized texture is introduced and it is shown that this model statistically constrains the output texture content, while inheriting the structure-preserving property of patch-based methods.
BM3D Frames and Variational Image Deblurring
TLDR
Simulation experiments show that the decoupled algorithm derived from the GNE formulation demonstrates the best numerical and visual results and shows superiority with respect to the state of the art in the field, confirming a valuable potential of BM3D-frames as an advanced image modeling tool.
Single Image Super-Resolution Using a Joint GMM Method
TLDR
This paper approaches the single-image SR problem using a joint GMM learnt from concatenated vectors of high- and low-resolution patches, sampled from a large database of pairs of high-resolution images and their corresponding low-resolution versions.
Texture synthesis by non-parametric sampling
  • Alexei A. Efros, T. Leung
  • Computer Science
    Proceedings of the Seventh IEEE International Conference on Computer Vision
  • 1999
TLDR
A non-parametric method for texture synthesis that aims at preserving as much local structure as possible and produces good results for a wide variety of synthetic and real-world textures.
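The non-parametric sampling idea can be shown in a 1D toy form: the signal is grown one value at a time, and each new value is sampled from positions in the exemplar whose preceding context best matches the current one. This is only a sketch of the principle (function name, context length, and tolerance are illustrative), not the authors' 2D patch-based algorithm.

```python
import numpy as np

def grow_1d(sample, length, ctx=3, tol=0.1, rng=None):
    """Toy 1D non-parametric synthesis: extend a signal by sampling the
    successor of near-best context matches in the exemplar."""
    rng = np.random.default_rng(rng)
    out = list(sample[:ctx])  # seed with the start of the exemplar
    # all context windows in the exemplar and the value following each
    windows = np.array([sample[i:i + ctx] for i in range(len(sample) - ctx)])
    nxt = np.array([sample[i + ctx] for i in range(len(sample) - ctx)])
    while len(out) < length:
        context = np.array(out[-ctx:])
        d = np.mean((windows - context) ** 2, axis=1)
        # candidate set: matches within (1 + tol) of the best distance
        cand = np.where(d <= d.min() * (1 + tol) + 1e-12)[0]
        out.append(nxt[rng.choice(cand)])
    return np.array(out)
```

Sampling among several near-best matches, rather than always taking the single best one, is what keeps the output from degenerating into a verbatim copy of the exemplar while still preserving local structure.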
...