Corpus ID: 195218869

Algorithmic Guarantees for Inverse Imaging with Untrained Network Priors

@inproceedings{Jagatap2019AlgorithmicGF,
  title={Algorithmic Guarantees for Inverse Imaging with Untrained Network Priors},
  author={Gauri Jagatap and Chinmay Hegde},
  booktitle={NeurIPS},
  year={2019}
}
Deep neural networks as image priors have been recently introduced for problems such as denoising, super-resolution and inpainting, with promising performance gains over hand-crafted image priors such as sparsity and low-rank. [...] Key Method: We model images to lie in the range of an untrained deep generative network with a fixed seed.
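For concreteness, a minimal PyTorch-style sketch of this key idea: reconstruct a signal from linear measurements by optimizing the weights of a small untrained generator whose input seed is drawn once and then held fixed. The architecture, measurement operator, and hyperparameters below are illustrative assumptions, not the paper's exact setup.

```python
import torch

torch.manual_seed(0)
n, m = 1024, 300                       # signal dimension, number of measurements
A = torch.randn(m, n) / m ** 0.5       # Gaussian measurement operator (illustrative)
x_true = torch.randn(n)                # placeholder ground-truth signal
y = A @ x_true                         # observed linear measurements y = A x

G = torch.nn.Sequential(               # small untrained "generative" network
    torch.nn.Linear(64, 256), torch.nn.ReLU(),
    torch.nn.Linear(256, n),
)
z = torch.randn(64)                    # fixed seed: drawn once, never updated

opt = torch.optim.Adam(G.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    x_hat = G(z)                       # estimate constrained to the range of G
    loss = ((A @ x_hat - y) ** 2).sum()
    loss.backward()                    # gradients flow to the weights only
    opt.step()
```

Because the seed is fixed, the only degrees of freedom are the network weights, so the network architecture itself acts as the image prior.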
Phase Retrieval using Untrained Neural Network Priors
TLDR
This paper considers the non-linear inverse problem of compressive phase retrieval (CPR), modeling images to lie in the range of an untrained deep generative network with a fixed seed, and presents two approaches for solving CPR: gradient descent and projected gradient descent.
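A hedged sketch of the projected-gradient-descent variant for magnitude measurements y = |Ax|, in the same PyTorch style: the projection onto the range of the network is approximated by a few inner steps of weight optimization, and all names and step sizes are illustrative.

```python
import torch

def net_pgd_phase_retrieval(A, y, G, z, outer=50, inner=20, eta=0.5):
    """Sketch of PGD for compressive phase retrieval: alternate a gradient
    step on the magnitude loss 0.5 * || |Ax| - y ||^2 with an approximate
    projection onto the range of the untrained network G (fixed seed z)."""
    x = G(z).detach()
    opt = torch.optim.Adam(G.parameters(), lr=1e-3)
    for _ in range(outer):
        Ax = A @ x
        grad = A.T @ ((Ax.abs() - y) * torch.sign(Ax))  # gradient step
        v = x - eta * grad
        for _ in range(inner):                          # approximate projection
            opt.zero_grad()
            ((G(z) - v) ** 2).sum().backward()
            opt.step()
        x = G(z).detach()
    return x
```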
Low Shot Learning with Untrained Neural Networks for Imaging Inverse Problems
TLDR
This work considers solving linear inverse problems when given a small number of example images drawn from the same distribution as the image of interest, and shows how one can pre-train a neural network with these few examples to improve reconstruction results in compressed sensing and semantic image recovery problems such as colorization.
On Architecture Selection for Linear Inverse Problems with Untrained Neural Networks
TLDR
This paper seeks to broaden the applicability and understanding of untrained neural network priors by investigating the interaction between architecture selection, measurement models, and signal types, identifying which hyperparameters tend to be more important and which are robust to deviations from the optimum.
High Dynamic Range Imaging Using Deep Image Priors
  • Gauri Jagatap, C. Hegde
  • Computer Science
    ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2020
TLDR
Two different approaches to high dynamic range (HDR) imaging are considered, gamma encoding and modulo encoding, and a combination of deep image prior and total variation (TV) regularization is proposed for reconstructing low-light images.
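For reference, the total variation term that such approaches add to the DIP data-fidelity loss; below is a minimal anisotropic version, and the combined objective in the comment is a generic DIP+TV form, not necessarily the paper's exact one.

```python
import torch

def tv_loss(img):
    """Anisotropic total variation of an image tensor of shape (C, H, W)."""
    dh = (img[:, 1:, :] - img[:, :-1, :]).abs().sum()
    dw = (img[:, :, 1:] - img[:, :, :-1]).abs().sum()
    return dh + dw

# Generic combined objective for a DIP estimate x_hat = G(z):
#   loss = ((forward(x_hat) - y) ** 2).sum() + lam * tv_loss(x_hat)
```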
Regularizing linear inverse problems with convolutional neural networks
TLDR
This paper demonstrates that signal recovery with an untrained convolutional network outperforms standard l1 and total variation minimization for magnetic resonance imaging (MRI), and shows that, similar to standard compressive sensing guarantees, a number of measurements on the order of the number of model parameters suffices for recovering an image from compressive measurements.
DAEs for Linear Inverse Problems: Improved Recovery with Provable Guarantees
TLDR
This work uses denoising autoencoders (DAEs) as priors and a projected gradient descent algorithm for recovering the original signal, and finds that the algorithm speeds up recovery by two orders of magnitude, improves the quality of reconstruction by an order of magnitude, and does not require tuning hyperparameters.
Compressive sensing with un-trained neural networks: Gradient descent finds the smoothest approximation
TLDR
It is shown that, without any further regularization, an untrained convolutional neural network can approximately reconstruct signals and images that are sufficiently structured from a near-minimal number of random measurements.
Unrolled Wirtinger Flow with Deep Priors for Phaseless Imaging
We introduce a deep learning (DL) based network for imaging from measurement intensities. The network architecture uses a recurrent structure that unrolls the Wirtinger Flow (WF) algorithm with a [...]
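The iteration being unrolled is the standard Wirtinger Flow update for intensity measurements y = |Ax|^2; a hedged sketch follows, in which the step size mu (and, typically, a learned prior or denoising step) would become a per-layer parameter of the unrolled network.

```python
import torch

def wf_step(x, A, y, mu):
    """One Wirtinger Flow update for intensity measurements y = |A x|^2;
    A and x may be complex-valued."""
    Ax = A @ x
    grad = A.conj().T @ ((Ax.abs() ** 2 - y) * Ax) / len(y)
    return x - mu * grad
```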
Denoising and Regularization via Exploiting the Structural Bias of Convolutional Generators
TLDR
A step towards demystifying this experimental phenomenon is taken by attributing this effect to particular architectural choices of convolutional networks, namely convolutions with fixed interpolating filters, and it is proved that early-stopped gradient descent acts as a denoiser/regularizer.
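Concretely, "fixed interpolating filters" refers to non-learned upsampling between the convolutional layers of the generator, for example bilinear interpolation; a minimal illustration:

```python
import torch

# Fixed (non-learned) bilinear upsampling, as used between the conv layers
# of DIP-style generators; the filter is an interpolation kernel, not a
# trainable parameter.
up = torch.nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False)

x = torch.randn(1, 8, 16, 16)   # (batch, channels, H, W) feature map
print(up(x).shape)              # torch.Size([1, 8, 32, 32])
```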
Provably Convergent Algorithms for Solving Inverse Problems Using Generative Models
TLDR
This work establishes a simple nonconvex algorithmic approach that theoretically enjoys linear convergence guarantees for certain linear and nonlinear inverse problems, and empirically improves upon conventional techniques such as back-propagation.

References

Showing 1-10 of 39 references
Compressed Sensing with Deep Image Prior and Learned Regularization
TLDR
It is proved that single-layer DIP networks with constant-fraction over-parameterization will perfectly fit any signal through gradient descent, despite the non-convexity of the problem, which provides justification for early stopping.
Solving Linear Inverse Problems Using Gan Priors: An Algorithm with Provable Guarantees
  • Viraj Shah, C. Hegde
  • Computer Science, Mathematics
    2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2018
TLDR
This work proposes a projected gradient descent (PGD) algorithm for effective use of GAN priors for linear inverse problems, and provides theoretical guarantees on the rate of convergence of this algorithm.
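For contrast with the untrained-prior sketches above, a hedged sketch of PGD with a pre-trained generative prior: the generator's weights stay frozen and the projection instead searches over the latent code. Names and hyperparameters are illustrative.

```python
import torch

def pgd_gan_prior(A, y, G, z0, outer=50, inner=30, eta=0.5):
    """Sketch of PGD with a pre-trained generator G: gradient step on
    ||Ax - y||^2, then approximate projection onto range(G) by
    optimizing the latent code z with G's weights frozen."""
    z = z0.clone().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=1e-2)
    x = G(z).detach()
    for _ in range(outer):
        v = x - eta * (A.T @ (A @ x - y))   # gradient step on the data term
        for _ in range(inner):              # approximate projection
            opt.zero_grad()
            ((G(z) - v) ** 2).sum().backward()
            opt.step()
        x = G(z).detach()
    return x
```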
Regularizing linear inverse problems with convolutional neural networks
TLDR
This paper demonstrates that signal recovery with an untrained convolutional network outperforms standard l1 and total variation minimization for magnetic resonance imaging (MRI), and shows that, similar to standard compressive sensing guarantees, a number of measurements on the order of the number of model parameters suffices for recovering an image from compressive measurements.
Deep Image Prior
TLDR
It is shown that a randomly-initialized neural network can be used as a handcrafted prior with excellent results in standard inverse problems such as denoising, superresolution, and inpainting.
One Network to Solve Them All — Solving Linear Inverse Problems Using Deep Projection Models
TLDR
This work proposes a general framework to train a single deep neural network that solves arbitrary linear inverse problems, and demonstrates superior performance over traditional methods using a wavelet sparsity prior while achieving performance comparable to specially-trained networks on tasks including compressive sensing and pixel-wise inpainting.
Deep Decoder: Concise Image Representations from Untrained Non-convolutional Networks
TLDR
This paper proposes a simple untrained image model, called the deep decoder: a deep neural network that can generate natural images from very few weight parameters, using a simple architecture with no convolutions and fewer parameters than the output dimensionality.
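A hedged sketch of a deep-decoder-style generator: 1x1 channel-mixing convolutions, fixed bilinear upsampling, ReLU, and channel-wise normalization, with no learned spatial convolutions. The channel counts and depth here are illustrative, and BatchNorm2d is used as a stand-in for the paper's channel normalization.

```python
import torch

def deep_decoder(channels=(64, 64, 64, 3)):
    """Deep-decoder-style generator: only 1x1 convolutions (per-pixel
    channel mixing), fixed bilinear upsampling, ReLU, and channel-wise
    normalization; no spatial convolution filters are learned."""
    layers = []
    for c_in, c_out in zip(channels[:-2], channels[1:-1]):
        layers += [
            torch.nn.Conv2d(c_in, c_out, kernel_size=1),
            torch.nn.Upsample(scale_factor=2, mode='bilinear',
                              align_corners=False),
            torch.nn.ReLU(),
            torch.nn.BatchNorm2d(c_out),  # stand-in for channel normalization
        ]
    layers += [torch.nn.Conv2d(channels[-2], channels[-1], kernel_size=1),
               torch.nn.Sigmoid()]
    return torch.nn.Sequential(*layers)

G = deep_decoder()
z = torch.randn(1, 64, 16, 16)   # fixed random input tensor
print(G(z).shape)                # torch.Size([1, 3, 64, 64])
```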
One-dimensional Deep Image Prior for Time Series Inverse Problems
TLDR
The main finding is that properly tuned one-dimensional convolutional architectures provide an excellent Deep Image Prior for various types of temporal signals including audio, biological signals, and sensor measurements.
Deep Compressed Sensing
TLDR
Borrowing insights from the compressed sensing (CS) perspective, a novel way of improving generative adversarial networks (GANs) using gradient information from the discriminator is developed, and it is shown that GANs can be viewed as a special case in this family of models.
prDeep: Robust Phase Retrieval with a Flexible Deep Network
TLDR
This work uses the regularization-by-denoising framework and a convolutional neural network denoiser to create prDeep, a new phase retrieval algorithm that is both robust and broadly applicable, and tests and validates it in simulation to demonstrate that it is robust to noise and can handle a variety of system models.
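The core of the regularization-by-denoising (RED) framework is a fixed-point update in which an off-the-shelf denoiser supplies the regularization gradient; a hedged one-step sketch follows, where `denoiser` is any image denoiser (a CNN in prDeep) and `grad_f` is the gradient of the data-fidelity term, both passed in as hypothetical callables.

```python
def red_step(x, grad_f, denoiser, mu=0.1, lam=0.5):
    """One regularization-by-denoising (RED) step: the RED prior gradient
    is lam * (x - denoiser(x)), added to the data-fidelity gradient."""
    return x - mu * (grad_f(x) + lam * (x - denoiser(x)))
```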
Convolutional Dictionary Learning via Local Processing
TLDR
This work shows how one can efficiently solve the convolutional sparse pursuit problem and train the filters involved, while operating locally on image patches, and provides an intuitive algorithm that can leverage standard techniques from the sparse representations field.