Corpus ID: 235351958

Equivariant Manifold Flows

@inproceedings{Katsman2021EquivariantMF,
  title={Equivariant Manifold Flows},
  author={Isay Katsman and Aaron Lou and Derek Lim and Qingxuan Jiang and Ser-Nam Lim and Christopher De Sa},
  booktitle={NeurIPS},
  year={2021}
}
Tractably modelling distributions over manifolds has long been an important goal in the natural sciences. Recent work has focused on developing general machine learning models to learn such distributions. However, for many applications these distributions must respect manifold symmetries—a trait which most previous models disregard. In this paper, we lay the theoretical foundations for learning symmetry-invariant distributions on arbitrary manifolds via equivariant manifold flows. We… 
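
To make the symmetry requirement concrete (a hedged illustration, not the construction from this paper): averaging a learned vector field over a group yields an equivariant field, and flowing a G-invariant prior along a G-equivariant field keeps the resulting density G-invariant. A minimal numpy sketch for the cyclic rotation group C_4 acting on R^2, where raw_field is an arbitrary stand-in for a learned network:

import numpy as np

def rotation(theta):
    # 2x2 rotation matrix.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# The cyclic group C_4 acting on R^2 by rotations of multiples of 90 degrees.
GROUP = [rotation(k * np.pi / 2) for k in range(4)]

def raw_field(x):
    # An arbitrary, non-equivariant vector field standing in for a learned
    # network; any smooth map R^2 -> R^2 would do here.
    return np.array([np.sin(x[1]), x[0] ** 2])

def equivariant_field(x):
    # Symmetrize: f_sym(x) = (1/|G|) sum_g g^{-1} f(g x), which satisfies
    # f_sym(g x) = g f_sym(x) for every g in G (g^{-1} = g.T for rotations).
    return sum(g.T @ raw_field(g @ x) for g in GROUP) / len(GROUP)

# Numerical check of equivariance at a random point.
x = np.random.randn(2)
g = GROUP[1]
assert np.allclose(equivariant_field(g @ x), g @ equivariant_field(x))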

Citations

Implicit Riemannian Concave Potential Maps

TLDR
This work combines ideas from implicit neural layers and optimal transport theory to propose Implicit Riemannian Concave Potential Maps (IRCPMs), a generalisation of existing work on exponential-map flows; IRCPMs make it simple to incorporate symmetries and are less expensive than ODE-based flows.

Equivariant Discrete Normalizing Flows

TLDR
This paper theoretically proves the existence of an equivariant map for compact groups acting on compact spaces, and the proposed G-Residual Flows are proved to be universal, in the sense that any G-equivariant diffeomorphism can be exactly mapped by a G-residual flow.

Equivariant Finite Normalizing Flows

Generative modelling seeks to uncover the underlying factors that give rise to observed data; these factors can often be modeled as the natural symmetries that manifest themselves through invariances and…

Riemannian Neural SDE: Learning Stochastic Representations on Manifolds

In recent years, neural stochastic differential equations (NSDEs) have gained attention for modeling stochastic representations, and have seen success in various types of applications.

Symmetry-Based Representations for Artificial and Biological General Intelligence

TLDR
It is argued that symmetry transformations are a fundamental principle that can guide the search for what makes a good representation, and that symmetry may provide an important general framework that determines the structure of the universe, constrains the nature of natural tasks, and consequently shapes both biological and artificial intelligence.

Energy-Inspired Molecular Conformation Optimization

TLDR
A neural energy minimization formulation casts the prediction problem as an unrolled optimization process in which a neural network is parametrized to learn the gradient of an implicit conformational energy landscape; this formulation not only reinterprets and unifies many existing models but also yields new variants of SE(3)-equivariant neural networks in a principled manner.

References

Showing 1-10 of 53 references

Equivariant Hamiltonian Flows

This paper introduces equivariant Hamiltonian flows, a method for learning expressive densities that are invariant with respect to a known Lie algebra of local symmetry transformations while…
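
As background on the mechanism that entry builds on (a minimal sketch with an assumed toy potential, not the paper's learned Hamiltonian): a Hamiltonian flow transports samples by integrating Hamilton's equations dq/dt = p, dp/dt = -dU/dq with a volume-preserving integrator, so the transported density picks up no log-determinant term.

import numpy as np

def grad_potential(q):
    # Gradient of a toy potential U(q) = 0.5 * |q|^2; a learned energy
    # would replace this in practice.
    return q

def leapfrog(q, p, step=0.05, n_steps=20):
    # Volume-preserving leapfrog integration of dq/dt = p, dp/dt = -dU/dq.
    p = p - 0.5 * step * grad_potential(q)
    for _ in range(n_steps - 1):
        q = q + step * p
        p = p - step * grad_potential(q)
    q = q + step * p
    p = p - 0.5 * step * grad_potential(q)
    return q, p

q1, p1 = leapfrog(np.array([1.0, 0.0]), np.array([0.0, 1.0]))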

Neural Ordinary Differential Equations on Manifolds

TLDR
It is shown how vector fields provide a general framework for parameterizing a flexible class of invertible mappings on manifolds, and it is illustrated how gradient-based learning can be performed.

Riemannian Continuous Normalizing Flows

TLDR
Riemannian continuous normalizing flows are introduced: a model that admits the parametrization of flexible probability measures on smooth manifolds by defining flows as solutions to ordinary differential equations.
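
For intuition, a minimal Euclidean sketch of that idea (the toy field f and the plain Euler steps are assumptions for brevity; the Riemannian construction replaces the divergence with its Riemannian counterpart and integrates on the manifold): points follow dx/dt = f(x, t) while the log-density evolves by the instantaneous change of variables d log p / dt = -div f.

import numpy as np

def f(x, t):
    # Toy time-dependent vector field on R^2.
    return np.tanh(x[::-1]) * (1.0 + t)

def divergence(field, x, t, eps=1e-5):
    # Central finite-difference estimate of div field at (x, t).
    div = 0.0
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        div += (field(x + e, t)[i] - field(x - e, t)[i]) / (2 * eps)
    return div

def flow(x0, log_p0, n_steps=100, t1=1.0):
    # Euler integration of dx/dt = f and d log p/dt = -div f.
    x, log_p, dt = x0.astype(float), log_p0, t1 / n_steps
    for k in range(n_steps):
        t = k * dt
        log_p -= divergence(f, x, t) * dt
        x = x + f(x, t) * dt
    return x, log_p

x1, log_p1 = flow(np.array([0.5, -0.3]), log_p0=0.0)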

Neural Manifold Ordinary Differential Equations

TLDR
This paper introduces Neural Manifold Ordinary Differential Equations, a manifold generalization of Neural ODEs that enables the construction of Manifold Continuous Normalizing Flows (MCNFs), and finds that leveraging continuous manifold dynamics produces a marked improvement for both density estimation and downstream tasks.

Mixed-curvature Variational Autoencoders

TLDR
A Mixed-curvature Variational Autoencoder is developed, an efficient way to train a VAE whose latent space is a product of constant curvature Riemannian manifolds, where the per-component curvature is fixed or learnable.

Equivariant Flows: exact likelihood generative learning for symmetric densities

TLDR
This work provides a sufficient theoretical criterion showing that the distribution generated by equivariant normalizing flows is invariant with respect to these symmetries by design, and proposes building blocks for flows that preserve the symmetries usually found in physical/chemical many-body particle systems.

Normalizing Flows on Tori and Spheres

TLDR
This paper proposes and compares expressive and numerically stable flows on spaces with more complex geometries, such as tori and spheres, building them recursively in the dimension of the space, starting from flows on circles, closed intervals, or spheres.
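
One concrete circle-flow building block in that spirit (a hedged sketch; parametrizing the flow by a single Moebius transformation with parameter w, |w| < 1, is an assumption, and practical models compose several such maps): the map z -> (z - w)/(1 - conj(w) z) sends the unit circle to itself and has a closed-form change of variables.

import numpy as np

def mobius_circle(theta, w):
    # Push an angle on S^1 through a Moebius transformation of the unit
    # circle and return the new angle and log |d theta' / d theta|.
    z = np.exp(1j * theta)
    z_new = (z - w) / (1.0 - np.conj(w) * z)  # stays on S^1 when |w| < 1
    log_det = np.log(1.0 - abs(w) ** 2) - 2.0 * np.log(abs(1.0 - np.conj(w) * z))
    return np.angle(z_new), log_det

theta_new, log_det = mobius_circle(0.7, 0.3 + 0.2j)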

On the Generalization of Equivariance and Convolution in Neural Networks to the Action of Compact Groups

TLDR
It is proved that (given some natural constraints) convolutional structure is not just a sufficient, but also a necessary condition for equivariance to the action of a compact group.

Gauge Equivariant Convolutional Networks and the Icosahedral CNN

TLDR
Gauge-equivariant convolution implemented with a single conv2d call is demonstrated, making it a highly scalable and practical alternative to Spherical CNNs, with substantial improvements over previous methods on the task of segmenting omnidirectional images and global climate patterns.

Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data

TLDR
A general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group with a surjective exponential map is proposed, enabling rapid prototyping and exact conservation of linear and angular momentum.
...