# Embedded-model flows: Combining the inductive biases of model-free deep learning and explicit probabilistic modeling

```bibtex
@article{Silvestri2021EmbeddedmodelFC,
  title   = {Embedded-model flows: Combining the inductive biases of model-free deep learning and explicit probabilistic modeling},
  author  = {Gianluigi Silvestri and Emily Fertig and David A. Moore and Luca Ambrogioni},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2110.06021}
}
```

Normalizing flows have shown great success as general-purpose density estimators. However, many real-world applications require the use of domain-specific knowledge, which normalizing flows cannot readily incorporate. We propose embedded-model flows (EMF), which alternate general-purpose transformations with structured layers that embed domain-specific inductive biases. These layers are automatically constructed by converting user-specified differentiable probabilistic models into equivalent…
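A minimal numpy sketch of the idea described above, not the paper's implementation (the names `model_layer` and `affine_layer` are hypothetical): a structured layer can be obtained by reparameterizing a simple probabilistic model, here a Gaussian random-walk prior, as a bijection of white noise, and alternated with a generic trainable bijector while log-determinants accumulate across layers.

```python
import numpy as np

# Hypothetical sketch of an embedded-model flow: alternate a generic
# trainable bijector with a structured bijector derived from a
# probabilistic model. The model layer encodes a Gaussian random-walk
# prior x_t = x_{t-1} + z_t, reparameterized as a bijection of white
# noise via a cumulative sum.

def model_layer(z):
    # Jacobian is lower-triangular with unit diagonal, so log|det J| = 0
    return np.cumsum(z), 0.0

def affine_layer(z, log_scale, shift):
    # generic elementwise affine bijector (parameters would be learned)
    return z * np.exp(log_scale) + shift, np.sum(log_scale)

rng = np.random.default_rng(0)
z = rng.standard_normal(5)                            # base noise
h, ld1 = affine_layer(z, np.zeros(5), np.zeros(5))    # generic layer
x, ld2 = model_layer(h)                               # model-derived layer
total_log_det = ld1 + ld2   # log|det J| accumulates across the stack
```

With the identity parameters used here the generic layer is a no-op, so the output is exactly the random walk implied by the embedded model; training would adjust `log_scale` and `shift`.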


## One Citation

Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling

- Computer Science, NeurIPS
- 2021

This work proposes a VAE architecture in which part of the latent space is grounded in physics, coupled with a set of regularizers that control the effect of the learned components and preserve the intended semantics of the physics-based latent variables.

## References

Showing 1–10 of 46 references

Joint Distributions for TensorFlow Probability

- Computer Science, ArXiv
- 2020

Describes JointDistributions, a family of declarative representations of directed graphical models in TensorFlow Probability that are usable by inference algorithms.

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models

- Computer Science, Mathematics, ICLR
- 2019

This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density, and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving state-of-the-art results among exact-likelihood methods with efficient sampling.
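An illustrative sketch of the estimator named above (not FFJORD itself, where the same trick is applied to the Jacobian of a neural ODE's dynamics): Hutchinson's estimator approximates tr(A) as E[vᵀAv] for random probe vectors v with E[vvᵀ] = I, requiring only matrix-vector products.

```python
import numpy as np

# Hutchinson's trace estimator: tr(A) = E[v^T A v] for random v with
# E[v v^T] = I. In FFJORD the matvec is a Jacobian-vector product of the
# flow's dynamics, so the full Jacobian is never materialized.

def hutchinson_trace(matvec, dim, n_samples, rng):
    total = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=dim)  # Rademacher probe vector
        total += v @ matvec(v)
    return total / n_samples

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
estimate = hutchinson_trace(lambda v: A @ v, dim=50, n_samples=10000, rng=rng)
exact = np.trace(A)  # the estimate converges to this as n_samples grows
```

Rademacher probes are a common choice because they minimize the estimator's variance among zero-mean unit-covariance distributions with independent entries.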

Pyro: Deep Universal Probabilistic Programming

- Computer Science, J. Mach. Learn. Res.
- 2019

Pyro uses stochastic variational inference algorithms and probability distributions built on top of PyTorch, a modern GPU-accelerated deep learning framework, to accommodate complex or model-specific algorithmic behavior.

Automatic Differentiation Variational Inference

- Computer Science, J. Mach. Learn. Res.
- 2017

Automatic differentiation variational inference (ADVI) is developed: the scientist provides only a probabilistic model and a dataset, and the algorithm automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models.

Deep Probabilistic Programming

- Computer Science, ICLR
- 2017

Edward, a Turing-complete probabilistic programming language, is proposed, which makes it easy to fit the same model using a variety of composable inference methods, ranging from point estimation to variational inference to MCMC.

Improved Variational Inference with Inverse Autoregressive Flow

- Computer Science, NIPS
- 2016

A new type of normalizing flow, inverse autoregressive flow (IAF), is proposed that, in contrast to earlier published flows, scales well to high-dimensional latent spaces and significantly improves upon diagonal Gaussian approximate posteriors.

Masked Autoregressive Flow for Density Estimation

- Mathematics, Computer Science, NIPS
- 2017

This work describes an approach for increasing the flexibility of an autoregressive model, based on modelling the random numbers that the model uses internally when generating data, which is called Masked Autoregressive Flow.
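A toy sketch of the affine autoregressive transforms underlying MAF and IAF (the conditioner functions here are hypothetical stand-ins; a real MAF learns them with masked networks): each μᵢ and log σᵢ depends only on x₍<i₎, so density evaluation inverts x → u in one pass while sampling is sequential, and log|det J| is just the sum of the log σᵢ.

```python
import numpy as np

# Toy affine autoregressive flow: mu_i and log sigma_i depend only on
# x_{<i}. Density evaluation (MAF direction) needs one pass over observed
# x; sampling (IAF direction) is sequential because x_i needs x_{<i}.

def ar_forward(u, mu_fn, log_sigma_fn):
    # sampling direction: generate x one coordinate at a time
    x = np.empty_like(u)
    for i in range(len(u)):
        x[i] = u[i] * np.exp(log_sigma_fn(x[:i])) + mu_fn(x[:i])
    return x

def ar_inverse(x, mu_fn, log_sigma_fn):
    # density direction: every u_i depends only on the observed x
    u = np.empty_like(x)
    log_det = 0.0
    for i in range(len(x)):
        log_sigma = log_sigma_fn(x[:i])
        u[i] = (x[i] - mu_fn(x[:i])) * np.exp(-log_sigma)
        log_det -= log_sigma  # log|det| of the inverse map
    return u, log_det

# hypothetical stand-in conditioners (a real MAF learns these)
mu_fn = lambda prev: 0.5 * prev.sum()
log_sigma_fn = lambda prev: 0.1 * len(prev)

rng = np.random.default_rng(0)
u0 = rng.standard_normal(4)
x = ar_forward(u0, mu_fn, log_sigma_fn)
u1, log_det = ar_inverse(x, mu_fn, log_sigma_fn)  # round trip recovers u0
```

MAF and IAF use the same transform in opposite directions: MAF makes density evaluation cheap at the cost of sequential sampling, IAF the reverse.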

Automatic structured variational inference

- Computer Science, AISTATS
- 2021

This work introduces a fully automatic method for constructing structured variational families, inspired by the closed-form updates in conjugate models, and validates this automatic variational method on a wide range of high-dimensional inference problems, including those with deep learning components.

Automatic variational inference with cascading flows

- Computer Science, ICML
- 2021

Cascading flows are introduced: a new family of variational programs that can be constructed automatically from an input probabilistic program and can also be amortized automatically, achieving much higher performance than both normalizing flows and ASVI on a large set of structured inference problems.

E(n) Equivariant Normalizing Flows for Molecule Generation in 3D

- Computer Science, ArXiv
- 2021

It is demonstrated that E-NFs considerably outperform baselines and existing methods from the literature on particle systems such as DW4 and LJ13, and on molecules from QM9 in terms of log-likelihood.