Invertible Neural Networks versus MCMC for Posterior Reconstruction in Grazing Incidence X-Ray Fluorescence

@inproceedings{Andrle2021InvertibleNN,
  title={Invertible Neural Networks versus MCMC for Posterior Reconstruction in Grazing Incidence X-Ray Fluorescence},
  author={Anna Andrle and Nando Farchmin and Paul Hagemann and Sebastian Heidenreich and Victor Soltwisch and Gabriele Steidl},
  booktitle={SSVM},
  year={2021}
}
Grazing incidence X-ray fluorescence is a non-destructive technique for analyzing the geometry and compositional parameters of nanostructures appearing e.g. in computer chips. In this paper, we propose to reconstruct the posterior parameter distribution given a noisy measurement generated by the forward model by an appropriately learned invertible neural network. This network resembles the transport map from a reference distribution to the posterior. We demonstrate by numerical comparisons that… 
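As a minimal sketch of the idea described in the abstract (our illustration, not the authors' implementation): a conditional INN built from affine coupling blocks can be trained by maximum likelihood on simulated (parameter, measurement) pairs, so that inverting the network on Gaussian latent samples yields approximate posterior samples. All dimensions, the toy linear forward model, and the hyperparameters below are assumptions.

```python
# Minimal sketch of a conditional invertible neural network (INN) for
# posterior sampling. NOT the authors' code: dimensions, the toy linear
# forward model and all hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

DIM_X, DIM_Y = 4, 8  # hypothetical parameter / measurement dimensions

def subnet(c_in, c_out):
    return nn.Sequential(nn.Linear(c_in, 64), nn.ReLU(), nn.Linear(64, c_out))

class CouplingBlock(nn.Module):
    """Affine coupling: one half of x is transformed, conditioned on y."""
    def __init__(self):
        super().__init__()
        self.half = DIM_X // 2
        self.s = subnet(self.half + DIM_Y, DIM_X - self.half)  # log-scales
        self.t = subnet(self.half + DIM_Y, DIM_X - self.half)  # shifts

    def forward(self, x, y):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        h = torch.cat([x1, y], dim=1)
        s = torch.tanh(self.s(h))                       # bounded for stability
        z2 = x2 * torch.exp(s) + self.t(h)
        return torch.cat([x1, z2], dim=1), s.sum(dim=1)  # output, log|det J|

    def inverse(self, z, y):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        h = torch.cat([z1, y], dim=1)
        s = torch.tanh(self.s(h))
        return torch.cat([z1, (z2 - self.t(h)) * torch.exp(-s)], dim=1)

class ConditionalINN(nn.Module):
    def __init__(self, n_blocks=4):
        super().__init__()
        self.blocks = nn.ModuleList(CouplingBlock() for _ in range(n_blocks))

    def forward(self, x, y):
        logdet = torch.zeros(x.shape[0])
        for b in self.blocks:
            x, ld = b(x, y)
            x = x.flip(1)                 # permute so all coordinates mix
            logdet = logdet + ld
        return x, logdet

    def sample(self, y, n):                # y: one measurement, shape (1, DIM_Y)
        z = torch.randn(n, DIM_X)          # reference (latent) distribution
        for b in reversed(self.blocks):
            z = b.inverse(z.flip(1), y.expand(n, -1))
        return z                           # approximate posterior samples

# Maximum-likelihood training on simulated pairs; a real application would
# draw x from the prior and y from the GIXRF forward model plus noise.
A = torch.randn(DIM_X, DIM_Y)              # toy linear "forward model"
model = ConditionalINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(1000):
    x = torch.rand(128, DIM_X)
    y = x @ A + 0.05 * torch.randn(128, DIM_Y)
    z, logdet = model(x, y)
    loss = (0.5 * z.pow(2).sum(dim=1) - logdet).mean()  # NLL up to a constant
    opt.zero_grad(); loss.backward(); opt.step()
```

After training, `model.sample(y_obs, 10000)` would return draws approximating the posterior for a given measurement; the choice of affine couplings with alternating permutations is one common INN architecture, not necessarily the one used in the paper.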
Shape- and Element-Sensitive Reconstruction of Periodic Nanostructures with Grazing Incidence X-ray Fluorescence Analysis and Machine Learning
TLDR
This novel approach enables the element-sensitive and non-destructive characterization of nanostructures made of silicon nitride and silicon oxide with sub-nm resolution.
Conditional Invertible Neural Networks for Medical Imaging
TLDR
This work applies generative flow-based models based on invertible neural networks to two challenging medical imaging tasks, i.e., low-dose computed tomography and accelerated magnetic resonance imaging, and shows that the choice of a radial distribution can improve the quality of reconstructions.
Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint
TLDR
This paper considers stochastic normalizing flows from a Markov chain point of view, replacing transition densities by general Markov kernels and establishing proofs via Radon-Nikodym derivatives which allows to incorporate distributions without densities in a sound way.
WPPNets and WPPFlows: The Power of Wasserstein Patch Priors for Superresolution
TLDR
This paper proposes to learn two kinds of neural networks in an unsupervised way based on WPP loss functions, and shows how convolutional neural networks (CNNs) can be incorporated.
Efficient approximation of high-dimensional exponentials by tensor networks
TLDR
The composition of a generic holonomic function with a high-dimensional function corresponds to a differential equation that can be used in the method, and this equation can be modified to adapt the norm in the a posteriori error estimates to the problem at hand.
Generalized Normalizing Flows via Markov Chains
TLDR
This chapter considers stochastic normalizing flows as a pair of Markov chains fulfilling some properties and shows how many state-of-the-art models for data generation fit into this framework.
The digital transformation and novel calibration approaches
G. Kok, tm - Technisches Messen, 2022
TLDR
How modern techniques like artificial intelligence, digital twins, digital calibration certificates and the introduction of the new definition of the SI system of units affect national metrology institutes is discussed.

References

Showing 1-10 of 19 references
Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization
TLDR
This work proposes an approach characterized by training a deep network that "pushes forward" Gaussian random inputs into the model space (representing, for example, density or velocity) as if they were sampled from the actual posterior distribution, designed to solve a variational optimization problem based on the Kullback-Leibler divergence between the posterior and the network output distributions.
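In our notation (a sketch of the objective this summary describes, assuming a standard Gaussian reference distribution): with $T_\theta$ the network pushing latent samples forward into model space, the variational problem reads

```latex
\min_{\theta}\; \mathrm{KL}\bigl(T_{\theta\#}\mathcal{N}(0,I) \,\big\|\, p(\,\cdot \mid y)\bigr)
= \mathbb{E}_{z \sim \mathcal{N}(0,I)}
  \Bigl[-\log p\bigl(T_\theta(z) \mid y\bigr)
        - \log\bigl|\det \nabla_z T_\theta(z)\bigr|\Bigr] + \mathrm{const},
```

where the constant is the log-evidence and drops out of the optimization, so only the unnormalized posterior density needs to be evaluated.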
Grazing incidence x-ray fluorescence based characterization of nanostructures for element sensitive profile reconstruction
For the reliable fabrication of the current and next generation of nanostructures it is essential to be able to determine their material composition and dimensional parameters. Using the grazing…
Preconditioned training of normalizing flows for variational inference in inverse problems
TLDR
This work proposes a preconditioning scheme involving a conditional normalizing flow (NF) capable of sampling from a low-fidelity posterior distribution directly, and demonstrates that considerable speed-ups are achievable compared to training NFs from scratch.
Improved reconstruction of critical dimensions in extreme ultraviolet scatterometry by modeling systematic errors
Scatterometry is a non-imaging indirect optical method that is frequently used to reconstruct the critical dimensions (CD) of periodic nanostructures, e.g. structured wafer surfaces in semiconductor…
Grazing incidence-x-ray fluorescence for a dimensional and compositional characterization of well-ordered 2D and 3D nanostructures.
TLDR
A soft x-ray fluorescence-based methodology that allows both a dimensional reconstruction of nanostructures and a characterization of their composition to be addressed at the same time, deriving both dimensional and compositional parameters in a quantitative manner.
HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference
TLDR
The power of the HINT method for density estimation and Bayesian inference on a novel data set of 2D shapes in Fourier parameterization, which enables consistent visualization of samples for different dimensionalities, is demonstrated.
Element sensitive reconstruction of nanostructured surfaces with finite elements and grazing incidence soft X-ray fluorescence.
TLDR
It is shown that the combination of GIXRF and finite-element simulations paves the way for a versatile characterization of nanoscale-structured surfaces, capable of reconstructing the geometric line shape of a structured surface with high elemental sensitivity.
emcee: The MCMC Hammer
We introduce a stable, well tested Python implementation of the affine-invariant ensemble sampler for Markov chain Monte Carlo (MCMC) proposed by Goodman & Weare (2010). The code is open source and…
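For orientation, a minimal usage sketch of this sampler (toy standard-normal target, not the paper's GIXRF likelihood; walker counts and step numbers are arbitrary choices):

```python
# Minimal emcee usage sketch: sample a 5-dimensional standard normal.
import numpy as np
import emcee

def log_prob(x):
    return -0.5 * np.sum(x**2)  # log-density up to an additive constant

ndim, nwalkers = 5, 32
p0 = np.random.randn(nwalkers, ndim)                  # initial walker positions
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 2000)
samples = sampler.get_chain(discard=500, flat=True)   # drop burn-in, flatten
print(samples.shape)                                  # (48000, 5)
```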
Stabilizing invertible neural networks using mixture models
TLDR
This analysis indicates that changing the latent distribution from a standard normal one to a Gaussian mixture model resolves the issue of exploding Lipschitz constants and leads to significantly improved sampling quality in multimodal applications.
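As a sketch of the modification this summary refers to (illustrative dimensions and parameters, not the paper's setup): the standard-normal latent is replaced by a Gaussian mixture, which in PyTorch amounts to swapping the base distribution used for sampling and for the log-density term of the likelihood loss.

```python
# Sketch: a Gaussian mixture model as the latent (base) distribution of an
# INN, replacing the standard normal. K, DIM and the parameters are assumed.
import torch
from torch import distributions as D

K, DIM = 5, 4
mixture = D.Categorical(torch.ones(K))             # uniform component weights
components = D.Independent(
    D.Normal(torch.randn(K, DIM), torch.ones(K, DIM)), 1)
gmm = D.MixtureSameFamily(mixture, components)

z = gmm.sample((128,))     # latent draws to push through the inverse network
log_pz = gmm.log_prob(z)   # replaces the -0.5*||z||^2 term in the ML loss
```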
Analyzing Inverse Problems with Invertible Neural Networks
TLDR
It is argued that a particular class of neural networks, so-called Invertible Neural Networks (INNs), is well suited for this task, and it is verified experimentally that INNs are a powerful analysis tool to find multi-modalities in parameter space, to uncover parameter correlations, and to identify unrecoverable parameters.
...