Corpus ID: 237605555

Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint

@article{Hagemann2021StochasticNF,
  title={Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint},
  author={Paul Hagemann and Johannes Hertrich and Gabriele Steidl},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.11375}
}
To overcome topological constraints and improve the expressiveness of normalizing flow architectures, Wu, Köhler and Noé introduced stochastic normalizing flows, which combine deterministic, learnable flow transformations with stochastic sampling methods. In this paper, we consider stochastic normalizing flows from a Markov chain point of view. In particular, we replace transition densities by general Markov kernels and establish proofs via Radon-Nikodym derivatives, which allows us to incorporate…
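The Markov-chain view described in the abstract can be made concrete with a small sketch: deterministic invertible layers (whose log-Jacobians would enter the Radon-Nikodym-type density bookkeeping) alternate with stochastic MCMC kernels. The NumPy sketch below is purely illustrative and not the authors' implementation; the toy target, the affine layer parameters, the finite-difference gradient, and the use of the final target (rather than intermediate densities) in every MCMC step are all simplifying assumptions.

```python
# Illustrative sketch of a stochastic normalizing flow as a Markov chain:
# deterministic invertible layers alternate with stochastic (MALA) kernels.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Unnormalized log-density of a toy 1D target (mixture of two Gaussians)."""
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def affine_layer(x, scale, shift):
    """Deterministic invertible layer; returns y and log|det Jacobian| per sample."""
    return scale * x + shift, np.log(np.abs(scale)) * np.ones_like(x)

def mala_kernel(x, log_density, step=0.1, n_steps=5):
    """Metropolis-adjusted Langevin kernel targeting `log_density`
    (gradient approximated by central finite differences)."""
    def grad(z, eps=1e-4):
        return (log_density(z + eps) - log_density(z - eps)) / (2 * eps)
    for _ in range(n_steps):
        prop = x + step * grad(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
        log_q_fwd = -((prop - x - step * grad(x)) ** 2) / (4 * step)
        log_q_bwd = -((x - prop - step * grad(prop)) ** 2) / (4 * step)
        log_alpha = log_density(prop) - log_density(x) + log_q_bwd - log_q_fwd
        accept = np.log(rng.random(x.shape)) < log_alpha
        x = np.where(accept, prop, x)
    return x

# Forward pass of the chain: latent samples -> deterministic layer -> MCMC layer -> ...
x = rng.standard_normal(1000)                   # samples from the latent N(0, 1)
x, _ = affine_layer(x, scale=1.5, shift=0.0)    # learnable deterministic transformation
x = mala_kernel(x, log_target)                  # stochastic sampling step toward the target
x, _ = affine_layer(x, scale=1.0, shift=0.5)    # another deterministic layer
x = mala_kernel(x, log_target)                  # final stochastic refinement
```

In a trained stochastic normalizing flow the affine parameters would be learned and each stochastic kernel would target an intermediate density interpolating between latent and target; the sketch fixes both for brevity.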

Generalized Normalizing Flows via Markov Chains
TLDR
This chapter considers stochastic normalizing flows as a pair of Markov chains fulfilling some properties and shows how many state-of-the-art models for data generation fit into this framework.
A Unified Approach to Variational Autoencoders and Stochastic Normalizing Flows via Markov Chains
TLDR
This paper considers stochastic normalizing flows as a pair of Markov chains fulfilling some properties and shows that many state-of-the-art models for data generation fit into this framework.
Continuous Generative Neural Networks
TLDR
This work presents conditions on the convolutional filters and on the nonlinearity that guarantee that a CGNN is injective, and allows for deriving Lipschitz stability estimates for (possibly nonlinear) infinite-dimensional inverse problems with unknowns belonging to the manifold generated by a CGNN.
PatchNR: Learning from Small Data by Patch Normalizing Flow Regularization
TLDR
By investigating the distribution of patches versus that of the whole image class, it is proved that the variational model is indeed a MAP approach, and the model can be generalized to conditional patchNRs if additional supervised information is available.
Conditional Invertible Neural Networks for Medical Imaging
TLDR
This work applies generative flow-based models based on invertible neural networks to two challenging medical imaging tasks, i.e., low-dose computed tomography and accelerated magnetic resonance imaging, and shows that the choice of a radial distribution can improve the quality of reconstructions.
WPPNets: Unsupervised CNN Training with Wasserstein Patch Priors for Image Superresolution
TLDR
WPPNets are introduced, which are CNNs trained by a new unsupervised loss function for image superresolution of materials microstructures; this enables their use in real-world applications where neither a large database of registered data nor the exact forward operator is given.
WPPNets and WPPFlows: The Power of Wasserstein Patch Priors for Superresolution
TLDR
This paper proposes to learn two kinds of neural networks in an unsupervised way based on WPP loss functions, and shows how convolutional neural networks (CNNs) can be incorporated.

References

SHOWING 1-10 OF 60 REFERENCES
Stochastic Normalizing Flows
TLDR
Stochastic Normalizing Flows (SNFs), arbitrary sequences of deterministic invertible functions and stochastic sampling blocks, are proposed; their representational power, sampling efficiency, and asymptotic correctness are illustrated on several benchmarks, including applications to sampling molecular systems in equilibrium.
SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows
TLDR
SurVAE Flows bridge the gap between normalizing flows and VAEs with surjective transformations, wherein the transformations are deterministic in one direction -- thereby allowing exact likelihood computation, and stochastic in the reverse direction -- hence providing a lower bound on the corresponding likelihood.
Residual Flows for Invertible Generative Modeling
TLDR
The resulting approach, called Residual Flows, achieves state-of-the-art performance on density estimation amongst flow-based models, and outperforms networks that use coupling blocks at joint generative and discriminative modeling.
Variational Inference with Normalizing Flows
TLDR
It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provides a clear improvement in performance and applicability of variational inference.
Neural Spline Flows
TLDR
This work proposes a fully-differentiable module based on monotonic rational-quadratic splines, which enhances the flexibility of both coupling and autoregressive transforms while retaining analytic invertibility, and demonstrates that neural spline flows improve density estimation, variational inference, and generative modeling of images.
Composing Normalizing Flows for Inverse Problems
TLDR
This work proposes a framework for approximate inference that estimates the target conditional as a composition of two flow models that leads to a stable variational inference training procedure that avoids adversarial training.
Deep Unsupervised Learning using Nonequilibrium Thermodynamics
TLDR
This work develops an approach to systematically and slowly destroy structure in a data distribution through an iterative forward diffusion process, then learns a reverse diffusion process that restores structure in data, yielding a highly flexible and tractable generative model of the data.
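The forward "structure-destroying" process summarized above is typically a sequence of small Gaussian perturbations. As an illustration only (the linear beta schedule and the toy data below are assumptions, not the cited paper's setup), such a forward diffusion can be sketched as:

```python
# Minimal sketch of an iterative forward Gaussian diffusion that gradually
# destroys structure in data; the linear beta schedule is an assumption.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(500) * 0.1 + 3.0       # toy "data": samples concentrated near 3.0
betas = np.linspace(1e-4, 0.05, 200)           # assumed noise schedule
for beta in betas:
    # each step shrinks the signal and adds a small amount of Gaussian noise
    x = np.sqrt(1.0 - beta) * x + np.sqrt(beta) * rng.standard_normal(x.shape)
# after many steps, x is approximately distributed as N(0, 1)
```

The generative model referenced in the TLDR is the learned reverse of this process, which the sketch omits.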
Annealed Flow Transport Monte Carlo
TLDR
A novel Monte Carlo algorithm that builds upon AIS and SMC and combines them with normalizing flows (NFs) for improved performance is proposed, and a continuous-time scaling limit of the population version of AFT is given by a Feynman–Kac measure.
Riemann manifold Langevin and Hamiltonian Monte Carlo methods
TLDR
The methodology proposed automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density, and substantial improvements in the time‐normalized effective sample size are reported when compared with alternative sampling approaches.
Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows
TLDR
It is proved that a flow must become arbitrarily numerically noninvertible in order to approximate the target closely, and Continuously Indexed Flows (CIFs) are proposed, which replace the single bijection used by normalising flows with a continuously indexed family of bijections.
...