# Learned imaging with constraints and uncertainty quantification

```bibtex
@article{Herrmann2019LearnedIW,
  title   = {Learned imaging with constraints and uncertainty quantification},
  author  = {F. Herrmann and Ali Siahkoohi and Gabrio Rizzuti},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1909.06473}
}
```

We outline new approaches to incorporate ideas from deep learning into wave-based least-squares imaging. The aim, and main contribution, of this work is the combination of handcrafted constraints with deep convolutional neural networks, as a way to harness their remarkable ease of generating natural images. The mathematical basis underlying our method is the expectation-maximization framework, where data are divided into batches and coupled to additional "latent" unknowns. These unknowns are…

## 23 Citations

### Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization

- Computer Science, ArXiv
- 2020

This work proposes training a deep network that "pushes forward" Gaussian random inputs into the model space (representing, for example, density or velocity) as if they were sampled from the actual posterior distribution. The network is trained by solving a variational optimization problem based on the Kullback-Leibler divergence between the posterior and the network output distributions.
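The variational objective described in this abstract can be sketched as follows (notation ours, not the paper's: $g_w$ is the invertible network with weights $w$ that pushes Gaussian samples $z$ into model space, $d$ the observed data):

$$
\min_{w}\;\mathrm{KL}\!\left(p_{g_w}\,\middle\|\,p_{\mathrm{post}}\right)
= \mathbb{E}_{z\sim\mathcal{N}(0,I)}\!\left[-\log p_{\mathrm{post}}\!\left(g_w(z)\mid d\right)-\log\left|\det\nabla_z g_w(z)\right|\right]+\mathrm{const},
$$

where the constant collects the $w$-independent entropy of the Gaussian base distribution; invertibility of $g_w$ is what makes the log-determinant term tractable.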

### Weak deep priors for seismic imaging

- Geology, SEG Technical Program Expanded Abstracts 2020
- 2020

A weak version of deep priors is proposed, which relaxes the requirement that reflectivity models lie in the network range and lets the unknowns deviate from the network output according to a Gaussian distribution, at a computational cost affordable in large-scale imaging problems.

### A deep-learning based Bayesian approach to seismic imaging and uncertainty quantification

- Computer Science, ArXiv
- 2020

This work proposes to use the functional form of a randomly initialized convolutional neural network as an implicit structured prior, which is shown to promote natural images and exclude images with unnatural noise.
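The deep-prior idea behind this line of work can be illustrated with a minimal toy: instead of optimizing the image directly, reparameterize it as the output of a small randomly initialized network $g_w(z)$ with a fixed input $z$, and fit only the weights $w$ to the noisy observation. The sketch below (our own illustration, not the paper's code) uses a two-layer network with hand-coded backpropagation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Noisy observation of a smooth 1-D signal (stand-in for a seismic image).
n = 64
t = np.linspace(0.0, 1.0, n)
y = np.sin(2 * np.pi * t) + 0.3 * rng.normal(size=n)

# Deep-prior reparameterization: image = g_w(z), with z fixed, w fitted.
z = rng.normal(size=16)               # fixed latent input
W1 = 0.1 * rng.normal(size=(32, 16))  # randomly initialized weights
W2 = 0.1 * rng.normal(size=(n, 32))

def forward(W1, W2):
    h = np.maximum(W1 @ z, 0.0)       # ReLU hidden layer
    return W2 @ h, h

lr = 1e-3
x0, _ = forward(W1, W2)
loss0 = 0.5 * np.sum((x0 - y) ** 2)   # initial misfit
for _ in range(500):
    x, h = forward(W1, W2)
    dx = x - y                        # gradient of 0.5 * ||x - y||^2 w.r.t. x
    dW2 = np.outer(dx, h)
    dh = W2.T @ dx
    dh[h <= 0.0] = 0.0                # backprop through the ReLU
    dW1 = np.outer(dh, z)
    W2 -= lr * dW2                    # plain gradient descent on the weights
    W1 -= lr * dW1
x_fit, _ = forward(W1, W2)
loss = 0.5 * np.sum((x_fit - y) ** 2)
```

Early stopping of this weight-space descent is what acts as the implicit regularizer: the network output drifts toward the data but visits natural-looking signals first.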

### Uncertainty quantification in imaging and automatic horizon tracking: a Bayesian deep-prior based approach

- Mathematics, SEG Technical Program Expanded Abstracts 2020
- 2020

This work is fundamentally based on a special reparameterization of reflectivity, known as a "deep prior", and verifies that the estimated confidence intervals for horizon tracking coincide with geologically complex regions, such as faults.

### Preconditioned training of normalizing flows for variational inference in inverse problems

- Computer Science, ArXiv
- 2021

A preconditioning scheme is proposed involving a conditional normalizing flow capable of sampling directly from a low-fidelity posterior distribution, which is used to speed up training of the high-fidelity objective: minimizing the Kullback-Leibler divergence between the predicted and the desired high-fidelity posterior density for the indirect measurements at hand.

### Deep Bayesian inference for seismic imaging with tasks

- Mathematics
- 2021

A systematic approach is proposed to translate uncertainty due to noise in the data into confidence intervals of automatically tracked horizons in the image, handling large-scale Bayesian inference problems with computationally expensive forward operators, as in seismic imaging.

### Post-Stack Inversion with Uncertainty Estimation through Bayesian Deep Image Prior

- Geology, Mathematics, 82nd EAGE Annual Conference & Exhibition
- 2021

Post-stack inversion is a well-known and well-studied geophysical deconvolution problem. Like most geophysical inverse problems, post-stack inversion is ill-posed and ill-conditioned, leading…

### Wave-equation-based inversion with amortized variational Bayesian inference

- Geology
- 2022

Solving inverse problems involving measurement noise and modeling errors requires regularization in order to avoid data overfit. Geophysical inverse problems, in which the Earth's highly…

### Reliable amortized variational inference with physics-based latent distribution correction

- Computer Science, GEOPHYSICS
- 2023

This work aims to increase the resilience of amortized variational inference in the presence of moderate data distribution shifts via a correction to the conditional normalizing flow’s latent distribution that improves the approximation to the posterior distribution for the data at hand.

### Point-to-set distance functions for weakly supervised segmentation

- Environmental Science, ArXiv
- 2020

A new algorithm is presented to include object-size information via constraints on the network output, implemented via projection-based point-to-set distance functions. It avoids the need to adapt penalty functions to different constraints, as well as issues related to constraining properties typically associated with non-differentiable functions.

## 14 References

### Compressed Sensing with Deep Image Prior and Learned Regularization

- Computer Science, ArXiv
- 2018

It is proved that single-layer DIP networks with constant fraction over-parameterization will perfectly fit any signal through gradient descent, despite being a non-convex problem, which provides justification for early stopping.

### Regularization by Architecture: A Deep Prior Approach for Inverse Problems

- Mathematics, Journal of Mathematical Imaging and Vision
- 2019

The main contribution is to introduce the idea of viewing these approaches as the optimization of Tikhonov functionals rather than optimizing networks.

### Parametric convolutional neural network-domain full-waveform inversion

- Geology, GEOPHYSICS
- 2019

Reparameterization of the initial velocity model by the weights of a convolutional neural network (CNN) is performed to automatically capture the salient features in the initial model as a priori information.

### Stochastic Seismic Waveform Inversion Using Generative Adversarial Networks as a Geological Prior

- Geology, Mathematical Geosciences
- 2019

It is shown that approximate MALA sampling allows efficient Bayesian inversion of model parameters obtained from a prior represented by a deep generative model, obtaining a diverse set of realizations that reflect the observed seismic response.

### Uncertainty quantification for inverse problems with weak partial-differential-equation constraints

- Mathematics, Computer Science, GEOPHYSICS
- 2018

This work considers inverse problems with partial-differential-equation (PDE) constraints, which are applicable to many seismic problems, and develops a bilinear posterior distribution with weak PDE constraints that is more amenable to uncertainty quantification because of its special structure.

### Projection methods and applications for seismic nonlinear inverse problems with multiple constraints

- Geology, GEOPHYSICS
- 2019

This work has developed an optimization framework that allows us to add multiple pieces of prior information in the form of constraints to full-waveform inversion (FWI), and obtains better FWI results compared with a quadratic penalty method.
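The projection framework this reference describes can be illustrated with a minimal sketch: when each piece of prior information is a convex set with an easy projection, a point satisfying all of them can be found by cyclically projecting onto each set in turn (POCS). The sets below (a box of bound constraints and an ℓ2 ball) are our own toy stand-ins for the physical constraints used in FWI:

```python
import numpy as np

def proj_box(x, lo, hi):
    """Project onto bound constraints lo <= x_i <= hi (elementwise clip)."""
    return np.clip(x, lo, hi)

def proj_ball(x, r):
    """Project onto the centered l2 ball of radius r (rescale if outside)."""
    nrm = np.linalg.norm(x)
    return x if nrm <= r else x * (r / nrm)

# Cyclic projections: alternate the two projections until the iterate
# settles in the intersection of the two convex sets.
x = np.array([3.0, -2.0, 0.5])
for _ in range(100):
    x = proj_ball(proj_box(x, -1.0, 1.0), 1.2)
```

Note that plain alternating projection converges to *some* point in the intersection; recovering the exact Euclidean projection onto the intersection requires a correction scheme such as Dykstra's algorithm.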

### Alternating Back-Propagation for Generator Network

- Computer Science, AAAI
- 2017

It is shown that the alternating back-propagation algorithm can learn realistic generator models of natural images, video sequences, and sounds and can also be used to learn from incomplete or indirect training data.

### Bayesian Learning via Stochastic Gradient Langevin Dynamics

- Computer Science, ICML
- 2011

In this paper we propose a new framework for learning from large scale datasets based on iterative learning from small mini-batches. By adding the right amount of noise to a standard stochastic…
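The SGLD update this reference introduces adds properly scaled Gaussian noise to a minibatch stochastic-gradient step, so the iterates become (approximate) posterior samples. A minimal sketch on a conjugate Gaussian toy problem (our own illustration, with hypothetical variable names):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy posterior: mean mu of N observations x_i ~ N(mu, 1), prior mu ~ N(0, 1).
N = 1000
data = rng.normal(1.0, 1.0, size=N)
post_mean = data.sum() / (N + 1)        # closed-form conjugate posterior mean
post_std = 1.0 / np.sqrt(N + 1)

def stoch_grad(mu, batch):
    # Unbiased minibatch estimate of grad log p(mu | data):
    # prior term  -mu  plus rescaled sum of likelihood gradients.
    return -mu + (N / len(batch)) * np.sum(batch - mu)

eps = 1e-4                              # step size (held constant here)
mu = 0.0
samples = []
for t in range(10000):
    batch = rng.choice(data, size=100, replace=False)
    noise = rng.normal(0.0, np.sqrt(eps))   # injected Langevin noise
    mu = mu + 0.5 * eps * stoch_grad(mu, batch) + noise
    if t >= 1000:                       # discard burn-in iterates
        samples.append(mu)
samples = np.asarray(samples)
```

With a constant step size the stationary distribution is slightly biased (inflated by minibatch gradient noise); the paper's analysis uses a decreasing step-size schedule to make the chain exact in the limit.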

### Total Variation Regularization Strategies in Full-Waveform Inversion

- Geology, SIAM J. Imaging Sci.
- 2018

An extended full-waveform inversion formulation that includes general convex constraints on the model is proposed that allows the model to steer free from parasitic local minima while keeping the estimated physical parameters laterally continuous and in a physically realistic range.

### Entropy-SGD: Biasing Gradient Descent Into Wide Valleys

- Computer Science, ICLR
- 2017

This paper proposes a new optimization algorithm called Entropy-SGD for training deep neural networks that is motivated by the local geometry of the energy landscape and compares favorably to state-of-the-art techniques in terms of generalization error and training time.