Corpus ID: 231855369

Spherical Message Passing for 3D Graph Networks

@article{Liu2021SphericalMP,
  title={Spherical Message Passing for 3D Graph Networks},
  author={Yi Liu and Limei Wang and Meng Liu and Xuan Zhang and Bora Oztekin and Shuiwang Ji},
  journal={ArXiv},
  year={2021},
  volume={abs/2102.05013}
}
We consider representation learning from 3D graphs in which each node is associated with a spatial position in 3D. This is an underexplored area of research, and a principled framework is currently lacking. In this work, we propose a generic framework, known as the 3D graph network (3DGN), to provide a unified interface at different levels of granularity for 3D graphs. Built on 3DGN, we propose the spherical message passing (SMP) as a novel and specific scheme for realizing the 3DGN framework… 
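To make spherical message passing concrete, here is a minimal, illustrative sketch (not the authors' implementation; all helper names are assumptions) of the three scalars — distance, angle, and torsion — that identify a relative 3D position in schemes like SMP. All three are invariant to global rotations and translations of the molecule.

```python
import math

def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def clamp(x): return max(-1.0, min(1.0, x))

def spherical_features(pi, pj, pk, pl):
    """Distance, angle, and torsion for the edge j->i, using reference
    neighbors k and l of j. These three scalars fix i's position relative
    to j's neighborhood and are invariant to global rotation/translation."""
    v_ji = sub(pi, pj)
    v_jk = sub(pk, pj)
    v_jl = sub(pl, pj)
    d = norm(v_ji)
    theta = math.acos(clamp(dot(v_ji, v_jk) / (norm(v_ji) * norm(v_jk))))
    # torsion: angle between the planes spanned by (v_ji, v_jk) and (v_ji, v_jl)
    n1, n2 = cross(v_ji, v_jk), cross(v_ji, v_jl)
    phi = math.acos(clamp(dot(n1, n2) / (norm(n1) * norm(n2))))
    return d, theta, phi
```

Rotating every position by the same rotation leaves `(d, theta, phi)` unchanged, which is why messages built from these scalars are rotation invariant by construction.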
3D Infomax improves GNNs for Molecular Property Prediction
TLDR
This work uses existing 3D molecular datasets to pre-train a model to reason about the geometry of molecules given only their 2D molecular graphs, and maximizes the mutual information between learned 3D summary vectors and the representations of a graph neural network (GNN).
Computation and Machine Learning for Catalyst Discovery
Towards a sustainable energy future, it is essential to develop new catalysts with improved properties for key catalytic systems such as the Haber-Bosch process, water electrolysis, and fuel cells.
Pre-training Molecular Graph Representation with 3D Geometry
TLDR
The Graph Multi-View Pre-training (GraphMVP) framework, where self-supervised learning (SSL) is performed by leveraging the correspondence and consistency between 2D topological structures and 3D geometric views, effectively learns a 2D molecular graph encoder that is enhanced by richer and more discriminative 3D geometry.
Geometric and Physical Quantities improve E(3) Equivariant Message Passing
TLDR
Steerable E(3) Equivariant Graph Neural Networks (SEGNNs) that generalise equivariant graph networks, such that node and edge attributes are not restricted to invariant scalars but can contain covariant information, such as vectors or tensors.
Rotation Invariant Graph Neural Networks using Spin Convolutions
TLDR
A novel approach to modeling angular information between sets of neighboring atoms in a graph neural network is introduced and rotation invariance is achieved for the network’s edge messages through the use of a per-edge local coordinate frame and a novel spin convolution over the remaining degree of freedom.
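The per-edge local coordinate frame idea can be illustrated with a short sketch (hypothetical helper names, not the paper's code): expressing a neighbor's offset as a component along the edge axis plus a radial distance from that axis yields scalars unchanged by any global rotation, leaving only the spin about the axis as the remaining degree of freedom.

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))

def local_frame_coords(edge_vec, neighbor_vec):
    """Coordinates of a neighbor offset in a per-edge frame:
    the component along the edge axis and the radial distance from it.
    Both scalars are invariant under any global rotation of the system;
    the angle around the axis (the 'spin') is the only part left over."""
    axis = [x / norm(edge_vec) for x in edge_vec]
    along = dot(neighbor_vec, axis)
    radial_sq = dot(neighbor_vec, neighbor_vec) - along ** 2
    return along, math.sqrt(max(radial_sq, 0.0))
```

A spin convolution then operates over that remaining angular degree of freedom so the edge message as a whole stays rotation invariant.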
DIG: A Turnkey Library for Diving into Graph Deep Learning Research
TLDR
DIG: Dive into Graphs is a research-oriented library that integrates unified and extensible implementations of common graph deep learning algorithms for several advanced tasks, and provides unified implementations of data interfaces, common algorithms, and evaluation metrics.
3D Equivariant Molecular Graph Pretraining
TLDR
This work proposes to adopt an equivariant energy-based model as the backbone for pretraining, which enjoys the merit of fulfilling the symmetry of 3D space, and develops a node-level pretraining loss for force prediction, where the Riemann-Gaussian distribution is exploited to ensure the loss to be E(3)-invariant, enabling more robustness.
Spherical Channels for Modeling Atomic Interactions
TLDR
It is demonstrated that by rotating the embeddings based on the 3D edge orientation, more information may be utilized while maintaining the rotational equivariance of the messages, and that relaxing this constraint in both message passing and aggregation further improves the model.
Molecular Geometry Pretraining with SE(3)-Invariant Denoising Distance Matching
TLDR
A 3D coordinate denoising pretraining framework is proposed to model the energy landscape implied by the dynamic nature of 3D molecules, where the continuous motion of a molecule in 3D Euclidean space forms a smooth potential energy surface.
Approximate Equivariance SO(3) Needlet Convolution
This paper develops a rotation-invariant needlet convolution for rotation group SO(3) to distill multiscale information of spherical signals. The spherical needlet transform is generalized from S
...

References

Showing 1-10 of 49 references
Machine learning of accurate energy-conserving molecular force fields
TLDR
The GDML approach enables quantitative molecular dynamics simulations for molecules at a fraction of cost of explicit AIMD calculations, thereby allowing the construction of efficient force fields with the accuracy and transferability of high-level ab initio methods.
Density Functional Theory: A Practical Introduction
Chapter 1: What is Density Functional Theory? 1.1 How To Approach This Book. 1.2 Examples of DFT in Action. 1.3 The Schrödinger Equation. 1.4 Density Functional Theory - From Wavefunctions to …
The Open Catalyst 2020 (OC20) Dataset and Community Challenges
TLDR
The OC20 dataset is developed, consisting of 1,281,121 Density Functional Theory relaxations across a wide swath of materials, surfaces, and adsorbates, and three state-of-the-art graph neural network models were applied to each of these tasks as baseline demonstrations for the community to build on.
PhysNet: A Neural Network for Predicting Energies, Forces, Dipole Moments, and Partial Charges.
TLDR
PhysNet is introduced, a DNN architecture designed for predicting energies, forces, and dipole moments of chemical systems, and it is shown that explicitly including electrostatics in energy predictions is crucial for a qualitatively correct description of the asymptotic regions of a potential energy surface (PES).
Relational inductive biases, deep learning, and graph networks
TLDR
It is argued that combinatorial generalization must be a top priority for AI to achieve human-like abilities, and that structured representations and computations are key to realizing this objective.
Quantum chemistry structures and properties of 134 kilo molecules
TLDR
This data set provides quantum chemical properties for a relevant, consistent, and comprehensive chemical space of small organic molecules that may serve the benchmarking of existing methods, development of new methods, such as hybrid quantum mechanics/machine learning, and systematic identification of structure-property relationships.
SchNet: A continuous-filter convolutional neural network for modeling quantum interactions
TLDR
This work proposes to use continuous-filter convolutional layers to be able to model local correlations without requiring the data to lie on a grid, and obtains a joint model for the total energy and interatomic forces that follows fundamental quantum-chemical principles.
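The continuous-filter idea can be sketched in a few lines (illustrative only; `w_filter`, `centers`, and the single linear filter-generating layer are assumptions, not SchNet's actual architecture): because the filter weights are generated from interatomic distances, no grid over space is needed.

```python
import math

def rbf(d, centers, gamma=10.0):
    """Radial basis expansion of an interatomic distance (SchNet-style)."""
    return [math.exp(-gamma * (d - c) ** 2) for c in centers]

def continuous_filter_conv(features, positions, w_filter, centers):
    """One continuous-filter convolution step: each atom aggregates its
    neighbors' features, weighted elementwise by a filter generated from
    the pairwise distance. w_filter is an (n_rbf x n_feat) weight matrix
    standing in for the filter-generating network."""
    n_feat = len(features[0])
    out = []
    for i in range(len(features)):
        acc = [0.0] * n_feat
        for j in range(len(features)):
            if i == j:
                continue
            g = rbf(math.dist(positions[i], positions[j]), centers)
            # filter-generating network: a linear map from the RBF expansion
            w = [sum(g[k] * w_filter[k][f] for k in range(len(g)))
                 for f in range(n_feat)]
            acc = [a + features[j][f] * w[f] for f, a in enumerate(acc)]
        out.append(acc)
    return out
```

In the real model this layer is stacked with learned atom embeddings and nonlinearities; the sketch only shows how distance-generated filters replace grid-based convolution kernels.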
Neural Message Passing for Quantum Chemistry
TLDR
Using MPNNs, state-of-the-art results on an important molecular property prediction benchmark are demonstrated, and it is believed future work should focus on datasets with larger molecules or more accurate ground truth labels.
Introduction to Quantum Mechanics
The purpose of this contribution is to give a very brief introduction to Quantum Mechanics for an audience of mathematicians. I will follow Segal's approach to Quantum Mechanics paying special …
Fast and Uncertainty-Aware Directional Message Passing for Non-Equilibrium Molecules
TLDR
The DimeNet++ model is proposed, which is 8x faster and 10% more accurate than the original DimeNet on the QM9 benchmark of equilibrium molecules, and ensembling and mean-variance estimation for uncertainty quantification are investigated with the goal of accelerating the exploration of the vast space of non-equilibrium structures.
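The mean-variance ensembling investigated here can be sketched with the standard deep-ensembles combination rule (a generic formulation, not necessarily the paper's exact formula): the ensemble variance sums the average per-model variance (aleatoric part) with the spread of the per-model means (epistemic part).

```python
def ensemble_mean_variance(predictions):
    """Combine per-model (mean, variance) predictions into one ensemble
    mean and total variance. Each model outputs its own mean and variance;
    total variance = mean of per-model variances + variance of the means."""
    m = len(predictions)
    mu = sum(p[0] for p in predictions) / m
    var = (sum(p[1] for p in predictions) / m
           + sum((p[0] - mu) ** 2 for p in predictions) / m)
    return mu, var
```

A large second term signals that the models disagree, which is exactly the situation that flags unexplored non-equilibrium structures for further sampling.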
...