Corpus ID: 238407870

Geometric and Physical Quantities improve E(3) Equivariant Message Passing

@article{Brandstetter2022GeometricAP,
  title={Geometric and Physical Quantities improve E(3) Equivariant Message Passing},
  author={Johannes Brandstetter and Rob Hesselink and Elise van der Pol and Erik J. Bekkers and Max Welling},
  journal={ArXiv},
  year={2022},
  volume={abs/2110.02905}
}
Including covariant information, such as position, force, velocity or spin, is important in many tasks in computational physics and chemistry. We introduce Steerable E(3) Equivariant Graph Neural Networks (SEGNNs) that generalise equivariant graph networks, such that node and edge attributes are not restricted to invariant scalars, but can contain covariant information, such as vectors or tensors. This model, composed of steerable MLPs, is able to incorporate geometric and physical information…
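The idea of passing covariant (vector-valued) messages can be illustrated with a minimal toy layer, not the SEGNN architecture itself: each message is a relative position vector scaled by an invariant radial function, so the whole layer commutes with rotations.

```python
import numpy as np

def radial_fn(d):
    # hypothetical invariant radial function (stand-in for a learned MLP)
    return np.exp(-d) + 0.5 * d

def vector_message_layer(pos):
    """Aggregate vector messages m_ij = radial_fn(|r_ij|) * r_ij.
    Because the radial part is invariant and r_ij is covariant,
    the layer output rotates with the input."""
    out = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i != j:
                r = pos[i] - pos[j]
                out[i] += radial_fn(np.linalg.norm(r)) * r
    return out

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix
pos = rng.normal(size=(5, 3))
# equivariance: rotate-then-apply equals apply-then-rotate
assert np.allclose(vector_message_layer(pos @ Q.T),
                   vector_message_layer(pos) @ Q.T)
```

The assertion is the defining property of an equivariant layer: applying the group action before or after the layer gives the same result.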

SE(3) Equivariant Graph Neural Networks with Complete Local Frames

Inspired by differential geometry and physics, equivariant local complete frames are introduced for graph neural networks, so that tensor information at given orders can be projected onto the frames; the method is computationally efficient.

MACE: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields

This work introduces MACE, a new equivariant MPNN model that uses higher body order messages and shows that using four-body messages reduces the required number of message passing iterations to just two, resulting in a fast and highly parallelizable model, reaching or exceeding state-of-the-art accuracy on the rMD17, 3BPA, and AcAc benchmark tasks.

Learning Local Equivariant Representations for Large-Scale Atomistic Dynamics

Allegro is introduced, a strictly local equivariant deep learning interatomic potential that simultaneously exhibits excellent accuracy and scalability of parallel computation and remarkable generalization to out-of-distribution data.

Steerable Partial Differential Operators for Equivariant Neural Networks

This work derives a G-steerability constraint that completely characterizes when a PDO between feature vector fields is equivariant, for arbitrary symmetry groups G, and develops a framework for equivariant maps based on Schwartz distributions that unifies classical convolutions and differential operators and gives insight into the relation between the two.

Hierarchical Learning in Euclidean Neural Networks

This work examines the role of higher order (non-scalar) features in Euclidean Neural Networks (e3nn) and finds a natural hierarchy of features by l, reminiscent of a multipole expansion, to ultimately inform design principles and choices of domain applications for e3nn networks.
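The multipole analogy can be made concrete: features of order l = 0, 1, 2 transform like the monopole, dipole, and traceless quadrupole moments of a charge distribution. A small numpy sketch (illustrative, not e3nn code) of those moments and their transformation laws:

```python
import numpy as np

def multipole_moments(q, pos):
    """Monopole (l=0), dipole (l=1) and traceless quadrupole (l=2)
    moments of point charges q at positions pos."""
    mono = q.sum()
    dip = (q[:, None] * pos).sum(axis=0)
    r2 = (pos * pos).sum(axis=1)
    quad = 3.0 * np.einsum('i,ia,ib->ab', q, pos, pos) - np.eye(3) * (q * r2).sum()
    return mono, dip, quad

rng = np.random.default_rng(1)
q = rng.normal(size=4)
pos = rng.normal(size=(4, 3))
mono, dip, quad = multipole_moments(q, pos)

# under an orthogonal transform Q: the monopole is invariant, the dipole
# transforms as a vector, and the quadrupole as a rank-2 tensor
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
m2, d2, q2 = multipole_moments(q, pos @ Q.T)
assert np.isclose(m2, mono)
assert np.allclose(d2, Q @ dip)
assert np.allclose(q2, Q @ quad @ Q.T)
```

This is exactly the hierarchy the paper observes in l-indexed features: higher l carries progressively finer angular detail.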

Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs

To better adapt Transformers to 3D graphs, a novel equivariant graph attention is proposed, which considers both content and geometric information such as relative position contained in irreps features.

Learning Symmetric Embeddings for Equivariant World Models

This work proposes learning symmetric embedding networks (SENs) that encode an input space that transforms in a known manner under these operations, and demonstrates that SENs facilitate the application of equivariant networks to data with complex symmetry representations.

Geometrically Equivariant Graph Neural Networks: A Survey

This work analyzes and classifies existing methods into three groups according to how the message passing and aggregation in GNNs are represented, and summarizes the benchmarks and related datasets to facilitate later research on methodology development and experimental evaluation.

ACMP: Allen-Cahn Message Passing with Attractive and Repulsive Forces for Graph Neural Networks

ACMP provides a deep GNN model that circumvents the common problem of oversmoothing and achieves state-of-the-art performance on real-world node classification tasks on both homophilic and heterophilic datasets.

e3nn: Euclidean Neural Networks

We present e3nn, a generalized framework for creating E(3) equivariant trainable functions, also known as Euclidean neural networks. e3nn naturally operates on geometry and geometric tensors that…

References

Showing 1-10 of 67 references

A Wigner-Eckart Theorem for Group Equivariant Convolution Kernels

By generalizing the famous Wigner-Eckart theorem for spherical tensor operators, it is proved that steerable kernel spaces are fully understood and parameterized in terms of 1) generalized reduced matrix elements, 2) Clebsch-Gordan coefficients, and 3) harmonic basis functions on homogeneous spaces.

A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups

This work provides a completely general algorithm for solving for the equivariant layers of matrix groups and constructs multilayer perceptrons equivariant to multiple groups that have never been tackled before, including the Rubik’s cube group.

Directional Message Passing for Molecular Graphs

This work proposes a message passing scheme analogous to belief propagation that uses directional information by transforming messages based on the angle between them. It employs spherical Bessel functions to construct a theoretically well-founded, orthogonal radial basis that achieves better performance than the currently prevalent Gaussian radial basis functions while using more than 4x fewer parameters.
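The l = 0 spherical Bessel radial basis from the DimeNet paper has a simple closed form, e_n(d) = sqrt(2/c) · sin(nπd/c)/d with cutoff c, and the functions are orthonormal on [0, c] with weight d². A minimal numpy sketch:

```python
import numpy as np

def bessel_basis(d, num_radial=8, cutoff=5.0):
    """l=0 spherical Bessel radial basis (DimeNet):
    e_n(d) = sqrt(2/c) * sin(n*pi*d/c) / d for n = 1..num_radial."""
    d = np.asarray(d, dtype=float)[..., None]
    n = np.arange(1, num_radial + 1)
    return np.sqrt(2.0 / cutoff) * np.sin(n * np.pi * d / cutoff) / d

# orthonormality check with weight d^2 (simple quadrature on a fine grid;
# the integrand vanishes at both endpoints, so the rectangle rule suffices)
d = np.linspace(1e-6, 5.0, 50001)
B = bessel_basis(d, num_radial=4, cutoff=5.0)
h = d[1] - d[0]
gram = (B * d[:, None] ** 2).T @ B * h
assert np.allclose(gram, np.eye(4), atol=1e-3)
```

Orthogonality is what lets a small number of radial functions replace a much larger bank of Gaussian basis functions.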

Vector Neurons: A General Framework for SO(3)-Equivariant Networks

Invariance and equivariance to the rotation group have been widely discussed in the 3D deep learning community for pointclouds. Yet most proposed methods either use complex mathematical tools that…

General E(2)-Equivariant Steerable CNNs

The theory of Steerable CNNs yields constraints on the convolution kernels which depend on group representations describing the transformation laws of feature spaces, and it is shown that these constraints for arbitrary group representations can be reduced to constraints under irreducible representations.

Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data

A general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group with a surjective exponential map is proposed, enabling rapid prototyping and exact conservation of linear and angular momentum.
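The exponential map at the heart of this construction sends a Lie algebra element to a group element; for so(3), exponentiating a skew-symmetric matrix yields a rotation. A quick numpy/scipy check (illustrative only, not the paper's LieConv layer):

```python
import numpy as np
from scipy.linalg import expm

def skew(w):
    """Map a 3-vector w to the skew-symmetric matrix [w]_x in so(3)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

w = np.array([0.3, -1.2, 0.7])            # arbitrary Lie algebra element
R = expm(skew(w))                          # exp: so(3) -> SO(3)
assert np.allclose(R @ R.T, np.eye(3))     # R is orthogonal
assert np.isclose(np.linalg.det(R), 1.0)   # and a proper rotation
# rotation angle equals |w|: trace(R) = 1 + 2*cos(|w|)
assert np.isclose(np.trace(R), 1.0 + 2.0 * np.cos(np.linalg.norm(w)))
```

Surjectivity of this map (every rotation is exp of some algebra element) is the property the paper's lifting-based convolution relies on.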

E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials

The NequIP method achieves state-of-the-art accuracy on a challenging and diverse set of molecules and materials while exhibiting remarkable data efficiency, challenging the widely held belief that deep neural networks require massive training sets.

Coordinate Independent Convolutional Networks - Isometry and Gauge Equivariant Convolutions on Riemannian Manifolds

The generality of the differential geometric formulation of convolutional networks is demonstrated by an extensive literature review which explains a large number of Euclidean CNNs, spherical CNNs and CNNs on general surfaces as specific instances of coordinate independent convolutions.

3D Steerable CNNs: Learning Rotationally Equivariant Features in Volumetric Data

The experimental results confirm the effectiveness of 3D Steerable CNNs for the problem of amino acid propensity prediction and protein structure classification, both of which have inherent SE(3) symmetry.

Tensor Field Networks: Rotation- and Translation-Equivariant Neural Networks for 3D Point Clouds

Tensor field neural networks are introduced, which are locally equivariant to 3D rotations, translations, and permutations of points at every layer, and demonstrate the capabilities of tensor field networks with tasks in geometry, physics, and chemistry.
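The core construction of tensor field networks, filters factored into a learned radial function times spherical harmonics of the direction, can be sketched for l = 1 output, where the real spherical harmonics are proportional to the unit vector itself (the polynomial radial function below is a hypothetical stand-in for a learned network):

```python
import numpy as np

def tfn_filter_l1(r, radial_coeffs):
    """TFN-style l=1 filter: F(r) = R(|r|) * Y_1(r/|r|), with the real l=1
    spherical harmonics taken proportional to the unit vector (normalization
    constant omitted)."""
    d = np.linalg.norm(r, axis=-1, keepdims=True)
    return np.polyval(radial_coeffs, d) * (r / d)

rng = np.random.default_rng(0)
r = rng.normal(size=(7, 3))
coeffs = [0.2, -1.0, 0.5]                  # hypothetical radial weights
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1                          # make Q a proper rotation
# rotating the input rotates the filter output: F(Qr) = Q F(r)
assert np.allclose(tfn_filter_l1(r @ Q.T, coeffs),
                   tfn_filter_l1(r, coeffs) @ Q.T)
```

The same factorization extends to higher l, with Clebsch-Gordan coefficients coupling filter and feature orders.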
...