Inference and De-Noising of Non-Gaussian Particle Distribution Functions: A Generative Modeling Approach

@inproceedings{Donaghy2021InferenceAD,
  title={Inference and De-Noising of Non-Gaussian Particle Distribution Functions: A Generative Modeling Approach},
  author={John Donaghy and Kai Germaschewski},
  booktitle={LOD},
  year={2021}
}
The particle-in-cell numerical method of plasma physics balances a trade-off between computational cost and intrinsic noise. Inference on data produced by these simulations generally consists of binning the data to recover the particle distribution function, from which physical processes may be investigated. In addition to containing noise, the distribution function is temporally dynamic and can be non-Gaussian and multi-modal, which makes it difficult to model. Here we demonstrate the…
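The binning step described in the abstract can be illustrated with a short sketch. The two-beam mixture below is a hypothetical stand-in for the non-Gaussian, multi-modal distributions the paper targets, not the authors' setup; particle counts, bin counts, and the mixture parameters are all illustrative assumptions.

```python
# A minimal sketch (not the authors' code) of the inference problem: particles
# from a particle-in-cell step are binned into a histogram to estimate the
# velocity-space distribution function f(v). With few particles per cell the
# estimate is visibly noisy.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-stream-like setup: a mixture of two Gaussian beams.
n_particles = 2_000  # few particles per cell -> visible sampling noise
v = np.concatenate([
    rng.normal(-2.0, 0.7, n_particles // 2),
    rng.normal(+2.0, 0.7, n_particles // 2),
])

# Standard binned estimate of the distribution function.
counts, edges = np.histogram(v, bins=64, range=(-6, 6), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Compare with the analytic mixture density to see the binning noise.
truth = 0.5 * (np.exp(-0.5 * ((centers + 2) / 0.7) ** 2)
               + np.exp(-0.5 * ((centers - 2) / 0.7) ** 2)) / (0.7 * np.sqrt(2 * np.pi))
print(f"mean abs. binning error: {np.abs(counts - truth).mean():.4f}")
```

A generative model fit directly to the particle samples, as the paper proposes, sidesteps this binning noise by producing a smooth, exactly normalized density estimate.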

References

Showing 1-10 of 21 references.
Auto-Encoding Variational Bayes
Introduces a stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case (a minimal sketch of this objective appears after the reference list).
Deep Unsupervised Clustering with Gaussian Mixture Variational Autoencoders
Studies a variant of the variational autoencoder with a Gaussian mixture as its prior distribution and shows that the minimum information constraint heuristic, known to mitigate over-regularization in VAEs, also improves unsupervised clustering performance.
Spatial coupling of gyrokinetic simulations, a generalized scheme based on first-principles
Presents a scheme that spatially couples two gyrokinetic codes from first principles, using a five-dimensional (5D) grid to communicate the distribution function between the two codes.
NICE: Non-linear Independent Components Estimation
We propose a deep learning framework for modeling complex high-dimensional densities called Non-linear Independent Component Estimation (NICE). It is based on the idea that a good representation is one in which the data has a distribution that is easy to model.
Density estimation using Real NVP
Extends the space of probabilistic models using real-valued non-volume-preserving (real NVP) transformations, a set of powerful, invertible, learnable transformations, resulting in an unsupervised learning algorithm with exact log-likelihood computation, exact sampling, exact inference of latent variables, and an interpretable latent space (a minimal coupling-layer sketch appears after this reference list).
A tight-coupling scheme sharing minimum information across a spatial interface between gyrokinetic turbulence codes
Introduces a new scheme that tightly couples kinetic turbulence codes across a spatial interface and finds that using a composite kinetic distribution function and fields with global boundary conditions, as if the coupled codes were one, makes the coupling problem tractable.
Masked Autoregressive Flow for Density Estimation
Describes Masked Autoregressive Flow, an approach for increasing the flexibility of an autoregressive model based on modelling the random numbers that the model uses internally when generating data.
First coupled GENE–XGC microturbulence simulations
Covering the core and the edge region of a tokamak, respectively, the two gyrokinetic turbulence codes Gyrokinetic Electromagnetic Numerical Experiment (GENE) and X-point Gyrokinetic Code (XGC) have been coupled for the first time.
Glow: Generative Flow with Invertible 1x1 Convolutions
Proposes Glow, a simple type of generative flow using an invertible 1x1 convolution, and demonstrates that a generative model optimized towards the plain log-likelihood objective is capable of efficient, realistic-looking synthesis and manipulation of large images.
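As a companion to the references above, here is the minimal sketch of the auto-encoding variational Bayes objective promised in the first entry: a Gaussian encoder and decoder trained by maximizing the evidence lower bound (ELBO) via the reparameterization trick. The layer sizes, the squared-error reconstruction term, and the latent dimension are illustrative assumptions, not details from any of the cited works.

```python
# A minimal VAE sketch (after Kingma & Welling): reparameterized sampling plus
# an analytic KL term against a standard-normal prior gives the negative ELBO.
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, dim=2, latent=2, hidden=64):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * latent))
        self.dec = nn.Sequential(nn.Linear(latent, hidden), nn.ReLU(),
                                 nn.Linear(hidden, dim))

    def forward(self, x):
        mu, log_var = self.enc(x).chunk(2, dim=1)
        z = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)  # reparameterize
        x_hat = self.dec(z)
        recon = ((x - x_hat) ** 2).sum(dim=1)      # Gaussian reconstruction term
        kl = 0.5 * (mu ** 2 + log_var.exp() - 1 - log_var).sum(dim=1)
        return (recon + kl).mean()                 # negative ELBO

loss = VAE()(torch.randn(8, 2))
loss.backward()  # gradients flow through the stochastic layer
```

And the coupling-layer sketch referenced in the Real NVP entry: one affine coupling transform in the style of NICE/Real NVP, whose triangular Jacobian yields the exact log-determinant that makes maximum-likelihood training of flows tractable. Names and sizes are again illustrative, not the paper's architecture.

```python
# A minimal affine coupling layer: split x into (x1, x2) and transform x2
# conditioned on x1. The Jacobian is triangular, so log|det| is a simple sum.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)               # keep scales bounded for stability
        y2 = x2 * torch.exp(s) + t      # invertible: x2 = (y2 - t) * exp(-s)
        log_det = s.sum(dim=1)          # exact log-det of the triangular Jacobian
        return torch.cat([x1, y2], dim=1), log_det

x = torch.randn(8, 2)                  # e.g. 2D phase-space samples (v_x, v_y)
z, log_det = AffineCoupling(dim=2)(x)
print(z.shape, log_det.shape)          # torch.Size([8, 2]) torch.Size([8])
```

A full flow would stack several such layers, permuting dimensions between them, and maximize the exact log-likelihood of the particle samples under a standard-normal base distribution.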