Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs

@article{Liao2022EquiformerEG,
  title={Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs},
  author={Yi-Lun Liao and Tess E. Smidt},
  journal={ArXiv},
  year={2022},
  volume={abs/2206.11990}
}
3D-related inductive biases like translational invariance and rotational equivariance are indispensable to graph neural networks operating on 3D atomistic graphs such as molecules. Inspired by the success of Transformers in various domains, we study how to incorporate these inductive biases into Transformers. In this paper, we present Equiformer, a graph neural network leveraging the strength of Transformer architectures and incorporating SE(3)/E(3)-equivariant features based on…
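Equivariance of this kind can be checked concretely with the e3nn library, which Equiformer builds on: features are typed by irreducible representations (irreps) of O(3), and every layer must commute with the Wigner-D matrices acting on those features. A minimal sketch (not the authors' code; the irreps layout is illustrative):

import torch
from e3nn import o3

# Hypothetical feature layout: 8 scalar (l=0, even) and 4 vector (l=1, odd) channels
irreps = o3.Irreps("8x0e + 4x1o")
layer = o3.Linear(irreps, irreps)    # an O(3)-equivariant linear map

x = irreps.randn(10, -1)             # features for 10 atoms
R = o3.rand_matrix()                 # random 3D rotation
D = irreps.D_from_matrix(R)          # block-diagonal Wigner-D representation

# Equivariance: rotating the input and then applying the layer gives the
# same result as applying the layer and then rotating the output.
assert torch.allclose(layer(x @ D.T), layer(x) @ D.T, atol=1e-5)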
1 Citation

Holographic-(V)AE: an end-to-end SO(3)-Equivariant (Variational) Autoencoder in Fourier Space

Group-equivariant neural networks have emerged as a data-efficient approach to solving classification and regression tasks while respecting the relevant symmetries of the data. However, little work…

References

Showing 1–10 of 86 references

Equivariant Graph Attention Networks for Molecular Property Prediction

This work proposes an equivariant GNN that operates on Cartesian coordinates to incorporate directionality and implements a novel attention mechanism that acts as a content- and spatially-dependent filter when propagating information between nodes.
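The general shape of such a mechanism, where attention logits mix a content term with a spatial term computed from relative Cartesian coordinates, can be sketched as follows; the module and all names are illustrative assumptions, not the paper's implementation:

import torch
import torch.nn as nn

class GeometricAttention(nn.Module):
    # Toy attention whose logits combine a content term (query-key dot
    # product) with a spatial term computed from relative positions.
    def __init__(self, dim):
        super().__init__()
        self.q, self.k = nn.Linear(dim, dim), nn.Linear(dim, dim)
        self.spatial = nn.Sequential(nn.Linear(3, dim), nn.SiLU(), nn.Linear(dim, 1))

    def forward(self, h, pos, edge_index):
        src, dst = edge_index                                 # edges src -> dst
        content = (self.q(h)[dst] * self.k(h)[src]).sum(-1)   # (E,)
        geom = self.spatial(pos[src] - pos[dst]).squeeze(-1)  # (E,)
        logits = content + geom    # content- and spatially-dependent filter
        alpha = torch.exp(logits - logits.max())
        norm = torch.zeros(h.size(0)).index_add_(0, dst, alpha)
        return alpha / norm[dst].clamp(min=1e-9)   # softmax per destination node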

Relevance of Rotationally Equivariant Convolutions for Predicting Molecular Properties

This paper finds that, for fixed network depth, adding angular features improves accuracy on most targets and beats previous state-of-the-art results on the global electronic properties: dipole moment, isotropic polarizability, and electronic spatial extent.

SE(3)-Transformers: 3D Roto-Translation Equivariant Attention Networks

The SE(3)-Transformer is introduced: a variant of the self-attention module for 3D point clouds that is equivariant under continuous 3D roto-translations and achieves competitive performance on two real-world datasets, ScanObjectNN and QM9.

GemNet: Universal Directional Graph Neural Networks for Molecules

This work shows that GNNs with directed edge embeddings and two-hop message passing are universal approximators for predictions that are invariant to translation and equivariant to permutation and rotation, and proposes the geometric message passing neural network (GemNet).
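In edge-based ("two-hop") message passing, messages live on directed edges, and each edge (j -> i) is updated from the edges (k -> j) arriving at its source node. A minimal sketch with assumed shapes (GemNet additionally conditions on angles and distances, omitted here):

import torch

def two_hop_edge_update(m, edge_index, mlp, num_nodes):
    # m: (E, d) embeddings of directed edges src -> dst; mlp maps 2d -> d.
    src, dst = edge_index
    # Sum the embeddings of all edges k -> j into their destination node j.
    agg = torch.zeros(num_nodes, m.size(1)).index_add_(0, dst, m)
    # Edge j -> i reads the aggregate at its source node j (residual update).
    return m + mlp(torch.cat([m, agg[src]], dim=-1))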

Rethinking Graph Transformers with Spectral Attention

The Spectral Attention Network (SAN) is presented, which uses a learned positional encoding (LPE) that exploits the full Laplacian spectrum to learn the position of each node in a given graph, making it the first fully-connected architecture to perform well on graph benchmarks.
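The simplest version of a Laplacian positional encoding takes the lowest-frequency eigenvectors of the normalized graph Laplacian as per-node coordinates; SAN goes further and learns on eigenvalue/eigenvector pairs, but the sketch below conveys the core idea (dense adjacency assumed for brevity):

import torch

def laplacian_pe(adj, k):
    # adj: (N, N) dense adjacency matrix; returns (N, k) positional encodings.
    deg = adj.sum(-1)
    d_inv_sqrt = deg.clamp(min=1).pow(-0.5)
    lap = torch.eye(adj.size(0)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eigval, eigvec = torch.linalg.eigh(lap)   # eigenvalues in ascending order
    # Drop the trivial constant eigenvector; note that eigenvector signs are
    # arbitrary, which SAN handles explicitly (e.g. by random sign flips).
    return eigvec[:, 1:k + 1]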

Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data

A general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group with a surjective exponential map is proposed, enabling rapid prototyping and exact conservation of linear and angular momentum.
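As a toy instance of the construction, consider SO(2), where the log map is just the wrapped angle difference: the kernel is parameterized on the Lie algebra and evaluated at log(u^{-1} v), which is unchanged when every element is multiplied by a common group element. All names below are illustrative, not the paper's API:

import torch

def lie_conv_so2(theta, f, kernel_mlp):
    # theta: (N,) group elements of SO(2) as angles; f: (N, d) features
    # attached to them; kernel_mlp maps (..., 1) -> (..., 1).
    diff = theta[None, :] - theta[:, None]
    diff = torch.atan2(torch.sin(diff), torch.cos(diff))  # log(u^{-1} v) on SO(2)
    w = kernel_mlp(diff[..., None])                       # (N, N, 1) kernel values
    return (w * f[None, :, :]).mean(dim=1)                # (N, d) output features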

A Generalization of Transformer Networks to Graphs

A graph transformer is proposed with four new properties compared to the standard model, closing the gap between the original transformer, which was designed for the limited case of line graphs, and graph neural networks, which can work with arbitrary graphs.

TorchMD-NET: Equivariant Transformers for Neural Network based Molecular Potentials

The prediction of quantum mechanical properties is historically plagued by a trade-off between accuracy and speed. Machine learning potentials have previously shown great success in this domain…

Equivariant message passing for the prediction of tensorial properties and molecular spectra

This work proposes the polarizable atom interaction neural network (PaiNN), which improves on common molecule benchmarks over previous networks while reducing model size and inference time, and leverages the equivariant atomwise representations obtained by PaiNN for the prediction of tensorial properties.
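Equivariant message passing of this kind carries per-atom vector features alongside scalars and updates them only through operations that preserve their transformation behavior: scaling vectors by invariant scalars and adding scaled edge directions. A toy sketch, not PaiNN's exact update:

import torch

def vector_message_pass(v, pos, edge_index, w_vec, w_dir):
    # v: (N, d, 3) vector features; pos: (N, 3) coordinates.
    # w_vec, w_dir: (E, d) per-edge scalar gates (in PaiNN-style models these
    # come from an MLP on invariant scalar features and interatomic distances).
    src, dst = edge_index
    r = pos[src] - pos[dst]
    r_hat = r / r.norm(dim=-1, keepdim=True).clamp(min=1e-9)             # (E, 3)
    msg = w_vec[..., None] * v[src] + w_dir[..., None] * r_hat[:, None, :]
    return v.index_add(0, dst, msg)   # equivariant: built only from vectors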

Rotation Invariant Graph Neural Networks using Spin Convolutions

A novel approach to modeling angular information between sets of neighboring atoms in a graph neural network is introduced; rotation invariance is achieved for the network's edge messages through the use of a per-edge local coordinate frame and a novel spin convolution over the remaining degree of freedom.
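A per-edge local coordinate frame can be constructed by taking the edge direction as one axis and completing an orthonormal basis; expressing neighboring geometry in this frame removes the global rotation. A minimal construction (the reference-vector choice is an assumption, not the paper's):

import torch

def edge_local_frame(r):
    # r: (E, 3) edge vectors; returns (E, 3, 3) matrices whose rows form an
    # orthonormal frame with the edge direction as the first axis.
    e1 = r / r.norm(dim=-1, keepdim=True).clamp(min=1e-9)
    ref = torch.tensor([0.0, 0.0, 1.0]).expand_as(e1)
    alt = torch.tensor([1.0, 0.0, 0.0]).expand_as(e1)
    # Fall back to the x-axis where the edge is (anti)parallel to z.
    ref = torch.where((e1 * ref).sum(-1, keepdim=True).abs() > 0.99, alt, ref)
    e2 = torch.cross(e1, ref, dim=-1)
    e2 = e2 / e2.norm(dim=-1, keepdim=True).clamp(min=1e-9)
    e3 = torch.cross(e1, e2, dim=-1)
    return torch.stack([e1, e2, e3], dim=-2)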
...