A probabilistic generative model for semi-supervised training of coarse-grained surrogates and enforcing physical constraints through virtual observables

@article{Rixner2021APG,
  title={A probabilistic generative model for semi-supervised training of coarse-grained surrogates and enforcing physical constraints through virtual observables},
  author={Maximilian Rixner and Phaedon-Stelios Koutsourelakis},
  journal={arXiv preprint arXiv:2006.01789},
  year={2021}
}
Related papers

Physics-Aware, Deep Probabilistic Modeling of Multiscale Dynamics in the Small Data Regime
A probabilistic perspective that simultaneously identifies predictive, lower-dimensional coarse-grained (CG) variables and their dynamics, and demonstrates how domain knowledge, often available in the form of physical constraints, can be incorporated via the novel concept of virtual observables.

Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling
Introduces a variational autoencoder (VAE) architecture in which part of the latent space is grounded by physics, and proposes a regularized learning method that controls the effect of the trainable components and preserves the intended semantics of the physics-based latent variables.
