Score-Guided Intermediate Layer Optimization: Fast Langevin Mixing for Inverse Problems

@inproceedings{daras2022score,
  title={Score-Guided Intermediate Layer Optimization: Fast Langevin Mixing for Inverse Problems},
  author={Giannis Daras and Yuval Dagan and Alexandros G. Dimakis and Constantinos Daskalakis},
  booktitle={International Conference on Machine Learning},
  year={2022}
}
We prove fast mixing and characterize the stationary distribution of the Langevin Algorithm for inverting random weighted DNN generators. This result extends the work of Hand and Voroninski from efficient inversion to efficient posterior sampling. In practice, to allow for increased expressivity, we propose to do posterior sampling in the latent space of a pre-trained generative model. To achieve that, we train a score-based model in the latent space of a StyleGAN-2 and we use it to solve… 
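The posterior-sampling idea above can be illustrated with a minimal sketch of an unadjusted Langevin sampler in latent space. Here a hypothetical linear "generator" G(z) = Wz with a standard Gaussian prior stands in for StyleGAN-2 and the learned latent score model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: a linear "generator" G(z) = W @ z and a Gaussian latent prior.
d_latent, d_obs = 8, 4
W = rng.normal(size=(d_obs, d_latent))
y = rng.normal(size=d_obs)   # observed measurements
sigma = 1.0                  # measurement noise level

def score_log_posterior(z):
    # grad_z log p(z | y) = grad_z log p(y | z) + grad_z log p(z)
    likelihood_score = W.T @ (y - W @ z) / sigma**2
    prior_score = -z         # standard Gaussian prior on the latent
    return likelihood_score + prior_score

def langevin_sample(steps=2000, eta=1e-3):
    z = rng.normal(size=d_latent)
    for _ in range(steps):
        z = z + eta * score_log_posterior(z) + np.sqrt(2 * eta) * rng.normal(size=d_latent)
    return z

z_hat = langevin_sample()    # an approximate sample from p(z | y)
```

With a deep generator, the closed-form likelihood score would be replaced by backpropagation through the network and the prior score by the trained score model.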


Differentiable Gaussianization Layers for Inverse Problems Regularized by Deep Generative Models

This work proposes to reparameterize and Gaussianize the latent tensors using novel differentiable, data-dependent layers in which custom operators are defined by solving optimization problems, constraining inverse problems to produce high-fidelity, in-distribution solutions.

Removing Structured Noise with Diffusion Models

It is shown that the powerful paradigm of posterior sampling with diffusion models can be extended to include rich, structured noise models, and a joint conditional reverse diffusion process with learned scores for both the noise and the signal-generating distribution is proposed.

Theoretical Perspectives on Deep Learning Methods in Inverse Problems

This paper surveys some of the prominent theoretical developments in the use of deep learning methods in inverse problems such as denoising, compressive sensing, inpainting, and super-resolution, focusing in particular on generative priors, untrained neural network priors and unfolding algorithms.

Quantized Compressed Sensing with Score-Based Generative Models

This work proposes an unsupervised data-driven approach called quantized compressed sensing with SGM (QCS-SGM), where the prior distribution is modeled by a pre-trained SGM, and an annealed pseudo-likelihood score is introduced and combined with the prior score of SGM to perform posterior sampling.
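The score combination just described can be sketched as annealed Langevin dynamics where the pseudo-likelihood score is added to a prior score, with its variance inflated by the current noise level. The linear, unquantized measurement model and the simple Gaussian prior score below are stand-ins for the quantized forward model and the pre-trained SGM:

```python
import numpy as np

rng = np.random.default_rng(1)
d, m = 6, 3
A = rng.normal(size=(m, d))
x_true = rng.normal(size=d)
y = A @ x_true               # noiseless linear measurements (quantization omitted)

def prior_score(x, sigma_t):
    # Stand-in for a pre-trained SGM's score: a standard Gaussian smoothed to level sigma_t.
    return -x / (1.0 + sigma_t**2)

def annealed_likelihood_score(x, sigma_t):
    # Pseudo-likelihood score with variance inflated by sigma_t, so the guidance
    # is gentle at high noise levels and sharpens as the annealing proceeds.
    return A.T @ (y - A @ x) / (sigma_t**2 + 1e-3)

def annealed_langevin(sigmas, steps_per_level=50):
    x = rng.normal(size=d)
    for sigma_t in sigmas:
        eta = 0.05 * sigma_t**2
        for _ in range(steps_per_level):
            score = prior_score(x, sigma_t) + annealed_likelihood_score(x, sigma_t)
            x = x + eta * score + np.sqrt(2 * eta) * rng.normal(size=d)
    return x

x_hat = annealed_langevin(np.geomspace(1.0, 0.05, 10))
```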

Consistent Diffusion Models: Mitigating Sampling Drift by Learning to be Consistent

This work proposes to enforce a consistency property, which states that predictions of the model on its own generated data are consistent across time, and shows that if the score is learned perfectly on some non-drifted points and the consistency property is enforced everywhere, then the scores are learned accurately everywhere.

Robust Unsupervised StyleGAN Image Restoration

This work makes StyleGAN image restoration robust: a single set of hyperparameters works across a wide range of degradation levels, which makes it possible to handle combinations of several degradations, without the need to retune.

Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers

We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling. Several recent works have analyzed stochastic samplers using tools such as Girsanov's theorem.

Image Deblurring with Domain Generalizable Diffusion Models

This work investigates the generalization ability of icDPMs in deblurring and proposes a simple but effective guidance that significantly alleviates artifacts and improves out-of-distribution performance.

A deep generative prior for high-resolution isotropic MR head slices

This work trained a StyleGAN3-T model on head MR slices with T1- and T2-weighted contrasts from public data, restricting the training corpus to slices from 1 mm isotropic volumes corresponding to three standard radiological views, with a fixed set of pre-processing steps.

A Theoretical Justification for Image Inpainting using Denoising Diffusion Probabilistic Models

A modified RePaint algorithm is proposed that provably recovers the underlying true sample and enjoys a linear rate of convergence, which is the first linear convergence result for a diffusion based image inpainting algorithm.

Intermediate Layer Optimization for Inverse Problems using Deep Generative Models

This work proposes Intermediate Layer Optimization (ILO), a novel optimization algorithm for solving inverse problems with deep generative models that outperforms state-of-the-art methods introduced in StyleGAN-2 and PULSE for a wide range of inverse problems.

Score-based Generative Modeling in Latent Space

The Latent Score-based Generative Model (LSGM) is proposed, a novel approach that trains SGMs in a latent space, relying on the variational autoencoder framework, and achieves state-of-the-art likelihood on the binarized OMNIGLOT dataset.

Provable Compressed Sensing With Generative Priors via Langevin Dynamics

This work considers the compressed sensing problem and introduces the use of stochastic gradient Langevin dynamics (SGLD) for compressed sensing with a generative prior and proves the convergence of SGLD to the true signal.
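A minimal sketch of SGLD for compressed sensing: minibatched measurement rows supply the stochastic gradient, and an identity "generator" stands in for a deep generative prior (all names and sizes below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
m, d = 40, 5
A = rng.normal(size=(m, d))  # measurement matrix
x_true = rng.normal(size=d)
y = A @ x_true               # noiseless measurements

def G(z):
    # Identity stand-in for a deep generative prior G: latent -> signal.
    return z

def sgld(steps=4000, eta=1e-3, batch=8):
    z = rng.normal(size=d)
    for _ in range(steps):
        idx = rng.choice(m, size=batch, replace=False)
        # Stochastic gradient of 0.5 * ||y - A G(z)||^2 on a random minibatch of rows,
        # rescaled by m / batch to keep it unbiased for the full-batch gradient.
        grad = -(m / batch) * A[idx].T @ (y[idx] - A[idx] @ G(z))
        z = z - eta * grad + np.sqrt(2 * eta) * rng.normal(size=d)
    return z

z_hat = sgld()               # concentrates near x_true for this well-posed toy problem
```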

Learning to Efficiently Sample from Diffusion Probabilistic Models

This paper introduces an exact dynamic programming algorithm that finds the optimal discrete time schedule for any pre-trained DDPM; it exploits the fact that the ELBO decomposes into separate KL terms and discovers the time schedule that maximizes the training ELBO exactly.
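The dynamic program itself can be sketched over a generic additive cost table: given `cost[s][t]`, a hypothetical stand-in for the per-jump KL terms, choose K jumps from timestep 0 to T minimizing total cost. The quadratic toy cost below makes even spacing optimal:

```python
import numpy as np

def best_schedule(cost, K):
    """Pick 0 = t_0 < t_1 < ... < t_K = T minimizing sum of cost[t_i][t_{i+1}].

    cost is a (T+1) x (T+1) matrix; only entries with s < t are used.
    Classic O(K * T^2) dynamic program over (jumps used, current timestep).
    """
    T = cost.shape[0] - 1
    dp = np.full((K + 1, T + 1), np.inf)
    parent = np.zeros((K + 1, T + 1), dtype=int)
    dp[0][0] = 0.0
    for k in range(1, K + 1):
        for t in range(1, T + 1):
            for s in range(t):
                c = dp[k - 1][s] + cost[s][t]
                if c < dp[k][t]:
                    dp[k][t] = c
                    parent[k][t] = s
    # Backtrack the optimal schedule ending at T with exactly K jumps.
    sched, k, t = [T], K, T
    while k > 0:
        t = parent[k][t]
        sched.append(t)
        k -= 1
    return sched[::-1], dp[K][T]

# Toy cost: a jump of length L costs L**2, so even spacing is optimal.
T, K = 8, 4
cost = np.array([[float((t - s) ** 2) for t in range(T + 1)] for s in range(T + 1)])
sched, total = best_schedule(cost, K)  # sched == [0, 2, 4, 6, 8], total == 16.0
```

In the paper's setting the cost entries would come from the per-jump KL terms of the decomposed ELBO rather than a toy formula.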

Gotta Go Fast When Generating Data with Score-Based Models

This work carefully devises an SDE solver with adaptive step sizes tailored, piece by piece, to score-based generative models; it generates data 2 to 10 times faster than Euler–Maruyama while achieving better or equal sample quality.

Tackling the Generative Learning Trilemma with Denoising Diffusion GANs

Denoising diffusion generative adversarial networks (denoising diffusion GANs) are introduced, which model each denoising step using a multimodal conditional GAN; this is the first model that reduces the sampling cost of diffusion models to an extent that allows them to be applied inexpensively to real-world applications.

A Style-Based Generator Architecture for Generative Adversarial Networks

An alternative generator architecture for generative adversarial networks is proposed, borrowing from style transfer literature, that improves the state-of-the-art in terms of traditional distribution quality metrics, leads to demonstrably better interpolation properties, and also better disentangles the latent factors of variation.

Fast Mixing of Multi-Scale Langevin Dynamics under the Manifold Hypothesis

This work demonstrates how the manifold hypothesis allows for the considerable reduction of mixing time, from exponential in the ambient dimension to depending only on the (much smaller) intrinsic dimension of the data.

Large Scale GAN Training for High Fidelity Natural Image Synthesis

It is found that applying orthogonal regularization to the generator renders it amenable to a simple "truncation trick," allowing fine control over the trade-off between sample fidelity and variety by reducing the variance of the generator's input.

Improved Techniques for Training Score-Based Generative Models

This work provides a new theoretical analysis of learning and sampling from score models in high dimensional spaces, explaining existing failure modes and motivating new solutions that generalize across datasets.