# Equivariant Flows: exact likelihood generative learning for symmetric densities

```bibtex
@inproceedings{Khler2020EquivariantFE,
  title={Equivariant Flows: exact likelihood generative learning for symmetric densities},
  author={Jonas K{\"o}hler and Leon Klein and Frank No{\'e}},
  booktitle={ICML},
  year={2020}
}
```

Normalizing flows are exact-likelihood generative neural networks which approximately transform samples from a simple prior distribution to samples of the probability distribution of interest. Recent work showed that such generative models can be utilized in statistical mechanics to sample equilibrium states of many-body systems in physics and chemistry. To scale and generalize these results, it is essential that the natural symmetries in the probability density - in physics defined by the…
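The exact-likelihood property described above comes from the change-of-variables formula: if an invertible map transforms prior samples, the model density is the prior density of the inverted sample minus the log-Jacobian of the map. A minimal sketch with a hypothetical 1-D affine flow (an illustration only, not the paper's model):

```python
import math

# Hypothetical 1-D affine flow x = a*z + b with standard normal prior on z.
# Change of variables: log p_X(x) = log p_Z(f^{-1}(x)) - log|df/dz|.
a, b = 2.0, 1.0

def log_prior(z):
    # Standard normal log-density.
    return -0.5 * z * z - 0.5 * math.log(2 * math.pi)

def flow_log_likelihood(x):
    z = (x - b) / a                          # invert the flow
    return log_prior(z) - math.log(abs(a))   # subtract the log-Jacobian

# Sanity check: the resulting density integrates to one (crude Riemann sum).
total = sum(math.exp(flow_log_likelihood(-10 + 0.01 * i)) * 0.01
            for i in range(2000))
```

Real flows stack many such invertible layers and accumulate the log-Jacobian terms, but the bookkeeping is exactly this.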

## 67 Citations

Preserving Properties of Neural Networks by Perturbative Updates

- Computer Science
- 2020

This work presents a novel, general approach to preserve network properties by using parameterized perturbations, and shows how such invertible blocks improve mode separation when applied to normalizing flows and Boltzmann generators.

Training Neural Networks with Property-Preserving Parameter Perturbations

- Computer Science, ArXiv
- 2020

This work presents a novel, general approach of preserving matrix properties by using parameterized perturbations in lieu of directly optimizing the network parameters, and shows how such invertible blocks improve the mixing of coupling layers and thus the mode separation of the resulting normalizing flows.

Equivariant Normalizing Flows for Point Processes and Sets

- Computer Science, ArXiv
- 2020

The proposed model, CONFET, based on continuous normalizing flows, allows arbitrary interactions between points while retaining a tractable likelihood, and shows improved performance on various real and synthetic datasets.

Temperature-steerable flows

- Physics
- 2020

Boltzmann generators approach the sampling problem in many-body physics by combining a normalizing flow and a statistical reweighting method to generate samples of a physical system's equilibrium…

Improved Variational Bayesian Phylogenetic Inference with Normalizing Flows

- Computer Science, NeurIPS
- 2020

A new type of variational Bayesian phylogenetic inference, VBPI-NF, is proposed as a first step toward empowering phylogenetic posterior estimation with deep learning techniques; it significantly improves upon the vanilla VBPI on a benchmark of challenging real-data Bayesian evolutionary inference problems.

Training Invertible Linear Layers through Rank-One Perturbations.

- Computer Science
- 2020

This work presents a novel approach for training invertible linear layers by training rank-one perturbations and adding them to the actual weight matrices infrequently, which allows keeping track of inverses and determinants without ever explicitly computing them.
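The determinant tracking mentioned here rests on the matrix determinant lemma: det(W + u vᵀ) = det(W) · (1 + vᵀ W⁻¹ u), so a rank-one update needs only the current inverse, not a fresh O(n³) factorization. A minimal check on a 2×2 matrix with hand-rolled linear algebra (illustrative sketch, not the paper's code):

```python
# Matrix determinant lemma: det(W + u v^T) = det(W) * (1 + v^T W^{-1} u).

def det2(m):
    # Determinant of a 2x2 matrix.
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def inv2(m):
    # Inverse of a 2x2 matrix.
    d = det2(m)
    return [[m[1][1] / d, -m[0][1] / d], [-m[1][0] / d, m[0][0] / d]]

W = [[3.0, 1.0], [0.0, 2.0]]
u, v = [1.0, -1.0], [0.5, 2.0]

# Direct determinant of the perturbed matrix W + u v^T.
Wp = [[W[i][j] + u[i] * v[j] for j in range(2)] for i in range(2)]
direct = det2(Wp)

# Same quantity via the lemma, using only W's existing inverse.
Winv = inv2(W)
Winv_u = [Winv[i][0] * u[0] + Winv[i][1] * u[1] for i in range(2)]
lemma = det2(W) * (1 + v[0] * Winv_u[0] + v[1] * Winv_u[1])
```

The companion Sherman-Morrison formula updates W⁻¹ itself at the same O(n²) cost, which is what makes infrequent rank-one merges attractive for invertible layers.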

E(n) Equivariant Graph Neural Networks

- Computer Science, ICML
- 2021

This paper introduces E(n) Equivariant Graph Neural Networks (EGNNs), a new model for learning graph neural networks equivariant to rotations, translations, reflections, and permutations, which does not require computationally expensive higher-order representations in intermediate layers while still achieving competitive or better performance.

Beyond permutation equivariance in graph networks

- Computer Science, Mathematics, ArXiv
- 2021

We introduce a novel architecture for graph networks which is equivariant to the Euclidean group in n-dimensions, and is additionally able to deal with affine transformations. Our model is designed…

Machine Learning Force Fields

- Computer Science, Chemistry, Chemical Reviews
- 2021

An overview of applications of ML-FFs and the chemical insights that can be obtained from them is given, along with a step-by-step guide for constructing and testing them from scratch.

Path-Gradient Estimators for Continuous Normalizing Flows

- Computer Science, ICML
- 2022

This work proposes a path-gradient estimator for the considerably more expressive variational family of continuous normalizing flows and outlines an efficient algorithm to calculate this estimator and establishes its superior performance empirically.

## References

Showing 1-10 of 51 references.

Equivariant Flows: sampling configurations for multi-body systems with symmetric energies

- Mathematics, Physics, ArXiv
- 2019

It is demonstrated that a BG that is equivariant with respect to rotations and particle permutations can generalize to sampling nontrivially new configurations where a nonequivariant BG cannot.

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models

- Computer Science, Mathematics, ICLR
- 2019

This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
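Hutchinson's estimator, the key trick named here, replaces an exact trace with a randomized one: tr(A) = E[zᵀ A z] for any random probe z with E[z zᵀ] = I. FFJORD applies it to the Jacobian of a neural ODE; the sketch below just demonstrates the estimator itself on a fixed matrix (illustrative, not FFJORD's code):

```python
import random

# Hutchinson's trace estimator: tr(A) = E[z^T A z] with E[z z^T] = I.
random.seed(0)
A = [[4.0, 1.0, 0.0], [2.0, 3.0, 1.0], [0.0, 1.0, 5.0]]  # true trace = 12
n = len(A)

def hutchinson(A, num_samples):
    total = 0.0
    for _ in range(num_samples):
        z = [random.choice([-1.0, 1.0]) for _ in range(n)]  # Rademacher probe
        Az = [sum(A[i][j] * z[j] for j in range(n)) for i in range(n)]
        total += sum(z[i] * Az[i] for i in range(n))
    return total / num_samples

est = hutchinson(A, 20000)  # unbiased; converges to 12 as samples grow
```

The payoff in FFJORD is that zᵀ(∂f/∂x)z only needs one vector-Jacobian product per probe, so the log-density cost scales linearly rather than quadratically in dimension.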

Boltzmann generators: Sampling equilibrium states of many-body systems with deep learning

- Computer Science, Science
- 2019

Boltzmann generators are trained on the energy function of a many-body system and learn to provide unbiased, one-shot samples from its equilibrium state and can be trained to directly generate independent samples of low-energy structures of condensed-matter systems and protein molecules.

Residual Flows for Invertible Generative Modeling

- Mathematics, NeurIPS
- 2019

The resulting approach, called Residual Flows, achieves state-of-the-art performance on density estimation amongst flow-based models, and outperforms networks that use coupling blocks at joint generative and discriminative modeling.

Hamiltonian Generative Networks

- Computer Science, Physics, ICLR
- 2020

This work introduces the Hamiltonian Generative Network (HGN), the first approach capable of consistently learning Hamiltonian dynamics from high-dimensional observations (such as images) without restrictive domain assumptions. It also demonstrates how a simple modification of the network architecture turns HGN into a powerful normalising flow model, called Neural Hamiltonian Flow (NHF), that uses Hamiltonian dynamics to model expressive densities.

Neural Network Renormalization Group

- Computer Science, Physical Review Letters
- 2018

We present a variational renormalization group (RG) approach based on a reversible generative model with hierarchical architecture. The model performs hierarchical change-of-variables transformations…

Variational Inference with Normalizing Flows

- Computer Science, Mathematics, ICML
- 2015

It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provides a clear improvement in performance and applicability of variational inference.

Monge-Ampère Flow for Generative Modeling

- Computer Science, ArXiv
- 2018

This approach brings insights and techniques from Monge-Ampere equation, optimal transport, and fluid dynamics into reversible flow-based generative models and applies the approach to unsupervised density estimation of the MNIST dataset and variational calculation of the two-dimensional Ising model at the critical point.

Neural Importance Sampling

- Computer Science, ACM Trans. Graph.
- 2019

This work introduces piecewise-polynomial coupling transforms that greatly increase the modeling power of individual coupling layers and derives a gradient-descent-based optimization for the Kullback-Leibler and the χ2 divergence for the specific application of Monte Carlo integration with unnormalized stochastic estimates of the target distribution.

Improving Variational Auto-Encoders using Householder Flow

- Computer Science
- 2016

This paper proposes a volume-preserving VAE that uses a series of Householder transformations, and shows empirically on the MNIST dataset and histopathology data that the proposed flow yields a more flexible variational posterior and highly competitive results compared to other normalizing flows.
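The volume-preserving property comes from the fact that a Householder transformation H = I - 2vvᵀ/‖v‖² is an orthogonal reflection, so |det H| = 1 and a chain of them contributes no log-Jacobian term. A minimal 2-D check (illustrative sketch, not the paper's code):

```python
# Householder transformation H = I - 2 v v^T / ||v||^2 is an orthogonal
# reflection: its determinant is -1, so it preserves volume exactly.
v = [3.0, 4.0]
norm2 = v[0] ** 2 + v[1] ** 2  # ||v||^2 = 25

# Build H entrywise: H[i][j] = delta_ij - 2 v_i v_j / ||v||^2.
H = [[(1.0 if i == j else 0.0) - 2.0 * v[i] * v[j] / norm2 for j in range(2)]
     for i in range(2)]

det_H = H[0][0] * H[1][1] - H[0][1] * H[1][0]  # equals -1
```

Because each layer's Jacobian determinant is ±1, a Householder flow reshapes the posterior without any determinant bookkeeping, at the cost of being restricted to volume-preserving maps.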