# Embedded-model flows: Combining the inductive biases of model-free deep learning and explicit probabilistic modeling

```bibtex
@article{Silvestri2021EmbeddedmodelFC,
  title   = {Embedded-model flows: Combining the inductive biases of model-free deep learning and explicit probabilistic modeling},
  author  = {Gianluigi Silvestri and Emily Fertig and David A. Moore and Luca Ambrogioni},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2110.06021}
}
```

Normalizing flows have shown great success as general-purpose density estimators. However, many real-world applications require the use of domain-specific knowledge, which normalizing flows cannot readily incorporate. We propose embedded-model flows (EMF), which alternate general-purpose transformations with structured layers that embed domain-specific inductive biases. These layers are automatically constructed by converting user-specified differentiable probabilistic models into equivalent…
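The core idea from the abstract, alternating a generic invertible layer with a structured layer derived from a probabilistic model while tracking the change-of-variables log-determinant, can be illustrated with a toy sketch. All function names, the affine layer, and the AR(1)-style structured layer below are illustrative stand-ins, not the paper's implementation:

```python
import numpy as np

def generic_layer(z, scale=np.array([1.5, 0.5])):
    """Elementwise affine layer; log|det J| = sum(log|scale|)."""
    return z * scale, np.sum(np.log(np.abs(scale)))

def structured_layer(z, a=0.8):
    """Lower-triangular map encoding a toy AR(1) prior x2 = z2 + a*z1.
    The Jacobian is unit lower-triangular, so log|det J| = 0."""
    x = z.copy()
    x[1] = z[1] + a * z[0]
    return x, 0.0

def emf_forward(z):
    """Alternate a generic layer with a structured (model-derived) layer,
    accumulating the total log-determinant for density evaluation."""
    z, ld1 = generic_layer(z)
    x, ld2 = structured_layer(z)
    return x, ld1 + ld2

x, logdet = emf_forward(np.array([1.0, 2.0]))
```

The structured layer's triangular form is what carries the domain-specific inductive bias here; in the paper such layers are derived automatically from a user-specified probabilistic program rather than hand-coded.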


## One Citation

Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling

- Computer Science, Mathematics · ArXiv
- 2021

This work proposes a VAE architecture in which part of the latent space is grounded by physics, and couples it with a set of regularizers that control the effect of the learned components and preserve the semantics of the physics-based latent variables as intended.

## References

Showing 1-10 of 45 references

Structured Conditional Continuous Normalizing Flows for Efficient Amortized Inference in Graphical Models

- Computer Science · AISTATS
- 2020

By expressing the structure inversion as a compilation pass in a probabilistic programming language, this work is able to apply it in a novel way to models as complex as convolutional neural networks.

Automatic variational inference with cascading flows

- Computer Science, Mathematics · ICML
- 2021

Cascading flows are introduced: a new family of variational programs that can be constructed automatically from an input probabilistic program, can also be amortized automatically, and achieve much higher performance than both normalizing flows and ASVI on a large set of structured inference problems.

Auto-Encoding Variational Bayes

- Mathematics, Computer Science · ICLR
- 2014

A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
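The key enabler of that scalable algorithm is the reparameterization trick: sampling z ~ N(mu, sigma²) is rewritten as a deterministic, differentiable function of the parameters plus parameter-free noise eps ~ N(0, 1), so gradients can flow through the sampling step. A minimal numpy sketch (a hypothetical standalone function, not the authors' code):

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """Draw z ~ N(mu, exp(log_var)) as mu + sigma * eps with eps ~ N(0, 1).
    The randomness lives in eps, so d z / d mu and d z / d log_var exist."""
    eps = rng.standard_normal(np.shape(mu))
    return mu + np.exp(0.5 * log_var) * eps

rng = np.random.default_rng(0)
# 100k samples with mu = 0 and sigma = 0.5 (log_var = 2 * log 0.5)
z = reparameterize(np.zeros(100_000), np.full(100_000, 2 * np.log(0.5)), rng)
```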

NICE: Non-linear Independent Components Estimation

- Computer Science, Mathematics · ICLR
- 2015

We propose a deep learning framework for modeling complex high-dimensional densities called Non-linear Independent Component Estimation (NICE). It is based on the idea that a good representation is…

Glow: Generative Flow with Invertible 1x1 Convolutions

- Computer Science, Mathematics · NeurIPS
- 2018

Glow, a simple type of generative flow using an invertible 1x1 convolution, is proposed, demonstrating that a generative model optimized towards the plain log-likelihood objective is capable of efficient realistic-looking synthesis and manipulation of large images.
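The appeal of the invertible 1x1 convolution is that it mixes channels while keeping an exact inverse and a cheap log-determinant: for a c×c weight matrix W applied at every pixel of an h×w feature map, log|det J| = h·w·log|det W|. A toy numpy sketch of this bookkeeping (illustrative, not the Glow codebase; the orthogonal initialization via QR is one common choice):

```python
import numpy as np

rng = np.random.default_rng(0)
c, h, w = 3, 4, 4
W = np.linalg.qr(rng.standard_normal((c, c)))[0]  # orthogonal, hence invertible
x = rng.standard_normal((c, h, w))

# A 1x1 convolution is just W applied across channels at each spatial site
y = np.einsum('ij,jhw->ihw', W, x)

# Log-determinant of the full transformation: one |det W| per pixel
logdet = h * w * np.log(abs(np.linalg.det(W)))

# The exact inverse uses W^{-1} in the same way
x_rec = np.einsum('ij,jhw->ihw', np.linalg.inv(W), y)
```

For an orthogonal W the log-determinant is exactly zero; training then moves W away from orthogonality while the formula stays exact.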

Variational Inference with Normalizing Flows

- Computer Science, Mathematics · ICML
- 2015

It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provides a clear improvement in performance and applicability of variational inference.

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models

- Computer Science, Mathematics · ICLR
- 2019

This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
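Hutchinson's estimator replaces an exact trace with random probes: for v with zero mean and identity covariance (e.g. Rademacher ±1 entries), E[vᵀAv] = tr(A). A minimal numpy sketch of the estimator itself (FFJORD applies it to a Jacobian, via vector-Jacobian products, rather than to an explicit matrix):

```python
import numpy as np

def hutchinson_trace(A, n_samples=20_000, rng=None):
    """Unbiased Monte Carlo estimate of tr(A) from Rademacher probes v."""
    rng = rng or np.random.default_rng(0)
    d = A.shape[0]
    v = rng.choice([-1.0, 1.0], size=(n_samples, d))
    # One quadratic form v^T A v per probe; their mean estimates the trace
    return np.mean(np.einsum('ni,ij,nj->n', v, A, v))

A = np.array([[2.0, 0.3],
              [0.1, 5.0]])       # trace = 7.0
est = hutchinson_trace(A)
```

The diagonal terms of vᵀAv are deterministic; only the off-diagonal cross terms contribute variance, which is why the estimator is cheap and well behaved for near-diagonal Jacobians.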

Deep Probabilistic Programming

- Computer Science, Mathematics · ICLR
- 2017

Edward, a Turing-complete probabilistic programming language, is proposed, which makes it easy to fit the same model using a variety of composable inference methods, ranging from point estimation to variational inference to MCMC.

Improved Variational Inference with Inverse Autoregressive Flow

- Mathematics, Computer Science · NIPS
- 2016

A new type of normalizing flow, inverse autoregressive flow (IAF), is proposed that, in contrast to earlier published flows, scales well to high-dimensional latent spaces and significantly improves upon diagonal Gaussian approximate posteriors.
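The reason IAF scales well is that each output dimension x_i = sigma_i(z_<i) · z_i + mu_i(z_<i) depends only on *earlier inputs*, so sampling is fully parallel and log|det J| = Σ_i log sigma_i. A toy sketch with scalar autoregressive functions standing in for the neural networks (names and the specific mu/sigma choices are illustrative):

```python
import numpy as np

def iaf_step(z, mu_fn, sigma_fn):
    """One IAF layer: x_i = sigma_i(z_<i) * z_i + mu_i(z_<i).
    The Jacobian is triangular, so log|det J| = sum_i log sigma_i."""
    mu, sigma = mu_fn(z), sigma_fn(z)
    x = sigma * z + mu
    return x, np.sum(np.log(sigma))

# Toy autoregressive parameter functions: element i looks at z_{i-1}
def mu_fn(z):
    return np.concatenate([[0.0], 0.5 * z[:-1]])

def sigma_fn(z):
    return np.concatenate([[1.0], np.exp(0.1 * z[:-1])])

x, ld = iaf_step(np.array([1.0, 2.0, -1.0]), mu_fn, sigma_fn)
```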

Adam: A Method for Stochastic Optimization

- Computer Science, Mathematics · ICLR
- 2015

This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
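The "adaptive estimates of lower-order moments" are exponential moving averages of the gradient (m) and its elementwise square (v), each bias-corrected for initialization at zero. A compact numpy sketch of one Adam step, following the update rule from the paper (the driver loop minimizing x² is just a usage example):

```python
import numpy as np

def adam_step(theta, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: moving averages of g and g^2, bias correction,
    then a per-coordinate step scaled by 1/sqrt(v_hat)."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g**2
    m_hat = m / (1 - b1**t)          # correct bias from m0 = 0
    v_hat = v / (1 - b2**t)          # correct bias from v0 = 0
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Usage: minimize f(x) = x^2 starting from x = 1
theta, m, v = np.array([1.0]), np.zeros(1), np.zeros(1)
for t in range(1, 2001):
    g = 2 * theta                    # gradient of x^2
    theta, m, v = adam_step(theta, g, m, v, t, lr=0.05)
```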