Corpus ID: 231979049

E(n) Equivariant Graph Neural Networks

@inproceedings{satorras2021en,
  title={E(n) Equivariant Graph Neural Networks},
  author={Victor Garcia Satorras and Emiel Hoogeboom and Max Welling},
  booktitle={International Conference on Machine Learning},
  year={2021}
}
This paper introduces E(n) Equivariant Graph Neural Networks (EGNNs), a new model for learning graph neural networks equivariant to rotations, translations, reflections, and permutations. In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while still achieving competitive or better performance. In addition, whereas existing methods are limited to equivariance on 3-dimensional spaces, our model easily scales to higher-dimensional spaces.
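The abstract's core idea — invariant messages built from squared distances, with coordinates updated along relative positions — can be sketched in a few lines. This is a minimal numpy illustration, not the authors' implementation: the layer weights are random placeholders for the learned MLPs φ_e, φ_x, φ_h, and the graph is assumed fully connected.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random placeholder weights for the learned functions (hypothetical, untrained).
D = 4
W_e = rng.normal(size=(2 * D + 1, D))  # edge function phi_e
W_x = rng.normal(size=(D, 1))          # coordinate function phi_x
W_h = rng.normal(size=(2 * D, D))      # node function phi_h

def phi(W, z):
    # tiny one-layer stand-in for a learned MLP
    return np.tanh(z @ W)

def egnn_layer(h, x):
    """One EGNN-style layer on a fully connected graph.

    h: (n, D) invariant node features; x: (n, 3) coordinates.
    Messages depend only on h_i, h_j and squared distances, so they are
    E(n)-invariant; coordinates move along relative positions x_i - x_j,
    so the coordinate update is E(n)-equivariant."""
    n = len(x)
    diff = x[:, None, :] - x[None, :, :]                 # x_i - x_j
    dist2 = (diff ** 2).sum(-1, keepdims=True)           # invariant edge input
    h_i = np.broadcast_to(h[:, None, :], (n, n, D))
    h_j = np.broadcast_to(h[None, :, :], (n, n, D))
    m = phi(W_e, np.concatenate([h_i, h_j, dist2], -1))  # messages m_ij
    off_diag = 1.0 - np.eye(n)[:, :, None]               # drop self-messages
    x_new = x + (diff * phi(W_x, m) * off_diag).sum(1) / (n - 1)
    h_new = phi(W_h, np.concatenate([h, (m * off_diag).sum(1)], -1))
    return h_new, x_new
```

Rotating and translating the input coordinates rotates and translates the output coordinates identically, while the features are unchanged — the equivariance property the paper's title refers to.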


Beyond permutation equivariance in graph networks

We introduce a novel architecture for graph networks which is equivariant to the Euclidean group in n dimensions and is additionally able to deal with affine transformations.

E(n) Equivariant Normalizing Flows

This paper introduces a generative model equivariant to Euclidean symmetries: E(n) Equivariant Normalizing Flows (E-NFs), the first flow that jointly generates molecule features and positions in 3D.

Data efficiency in graph networks through equivariance

It is shown that, when learning from a minimal amount of data, the proposed architecture can perfectly generalise to unseen data in a synthetic problem, whereas a standard model requires much more training data to reach comparable performance.

Symmetry-driven graph neural networks

Two graph network architectures are introduced that are equivariant to several types of transformations affecting the node coordinates; they can be vastly more data-efficient than classical graph architectures, are intrinsically equipped with a better inductive bias, and generalise better.

E(n) Equivariant Normalizing Flows for Molecule Generation in 3D

It is demonstrated that E-NFs considerably outperform baselines and existing methods from the literature on particle systems such as DW4 and LJ13, and on molecules from QM9 in terms of log-likelihood.

UNiTE: Unitary N-body Tensor Equivariant Network with Applications to Quantum Chemistry

This work proposes the unitary N-body tensor equivariant neural network (UNiTE), an architecture for a general class of symmetric tensors called N-body tensors, and introduces a normalization method, Equivariant Normalization, that improves the generalization of the neural network while preserving symmetry.

Equivariant Graph Neural Networks for 3D Macromolecular Structure

This work extends recent work on geometric vector perceptrons and applies equivariant graph neural networks to a wide range of tasks from structural biology and demonstrates that transfer learning can improve performance in learning from macromolecular structure.

ChebLieNet: Invariant Spectral Graph NNs Turned Equivariant by Riemannian Geometry on Lie Groups

ChebLieNet, a group-equivariant method on (anisotropic) manifolds, is introduced, and the existence of (data-dependent) sweet spots for the anisotropic parameters is demonstrated empirically on CIFAR10, opening the door to a better understanding of anisotropies.

Frame Averaging for Invariant and Equivariant Network Design

Many machine learning tasks involve learning functions that are known to be invariant or equivariant to certain symmetries of the input data. However, it is often challenging to design neural network architectures that respect these symmetries.

Self-Supervised Graph Representation Learning via Topology Transformations

We present Topology Transformation Equivariant Representation learning, a general paradigm of self-supervised learning for node representations of graph data.

Group Equivariant Convolutional Networks

Group equivariant Convolutional Neural Networks (G-CNNs) are introduced: a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries and achieves state-of-the-art results on CIFAR10 and rotated MNIST.
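The key mechanism behind G-CNNs can be shown with a toy "lifting" layer for the four-fold rotation group C4: correlating an image with all four 90-degree rotations of one filter. This is an illustrative numpy sketch under that simplification (single channel, 'valid' correlation), not the paper's full construction.

```python
import numpy as np

def corr2(img, psi):
    """Plain 'valid' 2-D cross-correlation of an image with a filter."""
    k = psi.shape[0]
    H, W = img.shape
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + k, j:j + k] * psi)
    return out

def lifting_layer(img, psi):
    """C4 lifting correlation: correlate the image with all four 90-degree
    rotations of one filter, giving one feature map per group element.
    Rotating the input then permutes and rotates the stack of maps,
    rather than producing unrelated responses."""
    return np.stack([corr2(img, np.rot90(psi, r)) for r in range(4)])
```

Rotating the input image by 90 degrees shifts which rotated filter produces which response: channel r of the rotated input equals the rotated channel r-1 of the original, which is exactly the equivariance the blurb describes.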

Isometric Transformation Invariant and Equivariant Graph Convolutional Networks

This paper proposes a set of transformation-invariant and -equivariant models based on graph convolutional networks (GCNs), called IsoGCNs, and demonstrates that the proposed model outperforms state-of-the-art methods on tasks involving geometric and physical data.

Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data

A general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group with a surjective exponential map is proposed, enabling rapid prototyping and exact conservation of linear and angular momentum.

Spectral Networks and Locally Connected Networks on Graphs

This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.
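The spectral construction mentioned here filters a graph signal in the eigenbasis of the graph Laplacian, generalizing the convolution theorem to irregular domains. A minimal sketch of that idea (an arbitrary filter g applied to the spectrum, assuming a symmetric Laplacian):

```python
import numpy as np

def spectral_filter(L, x, g):
    """Filter a graph signal x by a function g acting on the spectrum of
    the graph Laplacian L: returns U g(Lambda) U^T x, where L = U Lambda U^T.
    The eigenvectors U play the role of a graph Fourier basis."""
    lam, U = np.linalg.eigh(L)           # eigendecomposition of symmetric L
    return U @ (g(lam) * (U.T @ x))      # transform, filter, transform back
```

With g ≡ 1 the filter is the identity, and with g(λ) = λ it reproduces multiplication by L itself — two sanity checks that the round trip through the spectral domain is exact.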

General E(2)-Equivariant Steerable CNNs

The theory of Steerable CNNs yields constraints on the convolution kernels which depend on group representations describing the transformation laws of feature spaces, and it is shown that these constraints for arbitrary group representations can be reduced to constraints under irreducible representations.

Covariant Compositional Networks For Learning Graphs

Covariant Compositional Networks are proposed, which achieve covariance by making each activation transform according to a tensor representation of the permutation group; the corresponding tensor aggregation rules that each neuron must implement are derived.

Variational Graph Auto-Encoders

The variational graph auto-encoder (VGAE) is introduced, a framework for unsupervised learning on graph-structured data based on the variational auto-encoder (VAE); it can naturally incorporate node features, which significantly improves predictive performance on a number of benchmark datasets.
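The VGAE forward pass — a GCN encoder producing per-node Gaussian parameters, latents sampled via the reparameterization trick, and an inner-product decoder over the latents — can be sketched as follows. This is a toy numpy illustration with hypothetical random weights, not the reference implementation; the training objective (reconstruction plus KL term) is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def vgae_forward(A, X, W1, W_mu, W_log):
    """Minimal VGAE-style forward pass.

    A: (n, n) adjacency, X: (n, f) node features; W1, W_mu, W_log are
    placeholder weights for a shared GCN layer and the mu/log-variance heads."""
    A_sl = A + np.eye(len(A))                   # add self-loops
    d = A_sl.sum(1) ** -0.5
    A_hat = d[:, None] * A_sl * d[None, :]      # symmetric normalization
    H = np.maximum(A_hat @ X @ W1, 0.0)         # shared first GCN layer (ReLU)
    mu = A_hat @ H @ W_mu                       # per-node Gaussian mean
    logvar = A_hat @ H @ W_log                  # per-node Gaussian log-variance
    Z = mu + np.exp(0.5 * logvar) * rng.normal(size=mu.shape)  # reparameterize
    A_rec = 1.0 / (1.0 + np.exp(-(Z @ Z.T)))    # inner-product decoder: P(edge)
    return Z, A_rec
```

The inner-product decoder guarantees a symmetric matrix of edge probabilities, which is what makes the model a natural fit for undirected link prediction.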

Relevance of Rotationally Equivariant Convolutions for Predicting Molecular Properties

This paper finds that for fixed network depth, adding angular features improves the accuracy on most targets, and beats previous state-of-the-art results on the global electronic properties dipole moment, isotropic polarizability, and electronic spatial extent.

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
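The "fast localized" part of this formulation comes from approximating spectral filters with Chebyshev polynomials of the rescaled Laplacian, which avoids an eigendecomposition and makes the filter K-hop localized. A small numpy sketch of that filtering step (assuming a symmetric normalized Laplacian and dense matrices for clarity):

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(1)
    d_is = np.zeros_like(d, dtype=float)
    nz = d > 0
    d_is[nz] = d[nz] ** -0.5
    return np.eye(len(A)) - d_is[:, None] * A * d_is[None, :]

def cheb_filter(A, x, theta):
    """Apply the localized filter sum_k theta_k T_k(L_tilde) x, where T_k are
    Chebyshev polynomials and L_tilde = 2 L / lambda_max - I rescales the
    spectrum into [-1, 1]. A filter with K coefficients only mixes nodes
    at most K-1 hops apart."""
    L = normalized_laplacian(A)
    lam_max = np.linalg.eigvalsh(L).max()
    L_t = (2.0 / lam_max) * L - np.eye(len(A))
    t_prev, t_cur = x, L_t @ x                       # T_0 x and T_1 x
    out = theta[0] * t_prev
    if len(theta) > 1:
        out = out + theta[1] * t_cur
    for k in range(2, len(theta)):
        t_prev, t_cur = t_cur, 2.0 * (L_t @ t_cur) - t_prev  # recurrence
        out = out + theta[k] * t_cur
    return out
```

The locality claim is easy to verify on a path graph: with two coefficients (a 1-hop filter), a signal placed on one endpoint cannot influence nodes more than one hop away.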

Graphite: Iterative Generative Modeling of Graphs

This work proposes Graphite, an algorithmic framework for unsupervised learning of representations over nodes in large graphs using deep latent variable generative models. It parameterizes variational autoencoders (VAEs) with graph neural networks and uses a novel iterative graph refinement strategy, inspired by low-rank approximations, for decoding.