Corpus ID: 238531420

3D Infomax improves GNNs for Molecular Property Prediction

@article{Stark20213DII,
  title={3D Infomax improves GNNs for Molecular Property Prediction},
  author={Hannes St{\"a}rk and Dominique Beaini and Gabriele Corso and Prudencio Tossou and Christian Dallago and Stephan G{\"u}nnemann and Pietro Li{\`o}},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.04126}
}
Molecular property prediction is one of the fastest-growing applications of deep learning, with critical real-world impacts. Including 3D molecular structure as input to learned models improves their predictions for many molecular properties. However, this information is infeasible to compute at the scale required by most real-world applications. We propose pre-training a model to understand the geometry of molecules given only their 2D molecular graph. Using methods from self-supervised learning… 
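The pre-training idea summarized in the abstract, maximizing agreement between a molecule's 2D-graph embedding and its 3D-geometry embedding, is typically implemented with a contrastive NT-Xent-style objective. A minimal sketch under that assumption; the function names and the toy fixed vectors are illustrative, not the paper's actual networks:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def ntxent_loss(z2d, z3d, tau=0.1):
    """NT-Xent-style contrastive loss: each 2D embedding should match its
    own 3D partner against the other molecules in the batch."""
    n = len(z2d)
    loss = 0.0
    for i in range(n):
        sims = [math.exp(cosine(z2d[i], z3d[j]) / tau) for j in range(n)]
        loss += -math.log(sims[i] / sum(sims))
    return loss / n

# Toy batch of two molecules: matched 2D/3D pairs point the same way,
# so the loss is small; shuffling the 3D side would raise it.
z2d = [[1.0, 0.0], [0.0, 1.0]]
z3d = [[0.9, 0.1], [0.1, 0.9]]
loss = ntxent_loss(z2d, z3d)
```

In the actual setup, `z2d` would come from a GNN over the molecular graph and `z3d` from a network over conformer coordinates; only the 2D branch is kept for downstream property prediction.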
1 Citation
On Representation Knowledge Distillation for Graph Neural Networks
TLDR: Proposes Graph Contrastive Representation Distillation (G-CRD), which uses contrastive learning to align student node embeddings to those of the teacher in a shared representation space, outperforming the structure-preserving approaches LSP and GSP as well as baselines adapted from 2D computer vision.

References

Showing 1–10 of 66 references
MolCLR: Molecular Contrastive Learning of Representations via Graph Neural Networks
TLDR: This work presents MolCLR (Molecular Contrastive Learning of Representations via Graph Neural Networks), a self-supervised learning framework for large unlabeled molecule datasets, and proposes three novel molecule graph augmentations: atom masking, bond deletion, and subgraph removal.
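The three MolCLR augmentations named above can be sketched on a toy molecular graph held as an atom list plus a bond list; the function names and this adjacency representation are illustrative, not the paper's API:

```python
import random

def atom_mask(atoms, ratio=0.25, rng=random):
    """Replace a fraction of atom labels with a [MASK] token."""
    atoms = list(atoms)
    k = max(1, int(len(atoms) * ratio))
    for i in rng.sample(range(len(atoms)), k):
        atoms[i] = "[MASK]"
    return atoms

def bond_delete(bonds, ratio=0.25, rng=random):
    """Drop a random fraction of bonds (edges)."""
    bonds = list(bonds)
    k = max(1, int(len(bonds) * ratio))
    drop = set(rng.sample(range(len(bonds)), k))
    return [b for i, b in enumerate(bonds) if i not in drop]

def subgraph_remove(atoms, bonds, start=0, ratio=0.25):
    """Remove a connected subgraph grown by BFS from `start`."""
    target = max(1, int(len(atoms) * ratio))
    removed, frontier = set(), [start]
    while frontier and len(removed) < target:
        v = frontier.pop(0)
        if v in removed:
            continue
        removed.add(v)
        frontier += [b for a, b in bonds if a == v]
        frontier += [a for a, b in bonds if b == v]
    kept_atoms = [a for i, a in enumerate(atoms) if i not in removed]
    kept_bonds = [(a, b) for a, b in bonds if a not in removed and b not in removed]
    return kept_atoms, kept_bonds
```

In contrastive pre-training, two independently augmented views of the same molecule form a positive pair while views of other molecules serve as negatives.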
Analyzing Learned Molecular Representations for Property Prediction
TLDR: A graph convolutional model is introduced that consistently matches or outperforms models using fixed molecular descriptors, as well as previous graph neural architectures, on both public and proprietary data sets.
MoleculeNet: A Benchmark for Molecular Machine Learning
TLDR: MoleculeNet benchmarks demonstrate that learnable representations are powerful tools for molecular machine learning and broadly offer the best performance; however, this result comes with caveats.
Learning Gradient Fields for Molecular Conformation Generation
TLDR: A novel algorithm based on recent score-based generative models is developed to effectively estimate the gradient fields of the log density of atomic coordinates; it outperforms previous state-of-the-art baselines by a significant margin.
Learning Neural Generative Dynamics for Molecular Conformation Generation
We study how to generate molecule conformations (i.e., 3D structures) from a molecular graph. Traditional methods, such as molecular dynamics, sample conformations via computationally expensive…
Predicting molecular properties with covariant compositional networks.
TLDR: A neural-network-based machine learning algorithm which, assuming a sufficiently large training sample of actual DFT results, can instead learn to predict certain properties of molecules purely from their molecular graphs.
Neural Message Passing for Quantum Chemistry
TLDR: Using MPNNs, state-of-the-art results on an important molecular property prediction benchmark are demonstrated; the authors believe future work should focus on datasets with larger molecules or more accurate ground-truth labels.
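The message-passing scheme that the MPNN paper formalizes (aggregate messages from neighbours, then update each node state) can be sketched with scalar node features and sum aggregation; the dictionary representation and the `mpnn_step` name are illustrative simplifications:

```python
def mpnn_step(h, edges):
    """One message-passing step on an undirected graph: each node sums its
    neighbours' features (the message), then updates by adding the
    aggregated message to its own state."""
    msgs = {v: 0.0 for v in h}
    for u, v in edges:
        msgs[v] += h[u]
        msgs[u] += h[v]
    return {v: h[v] + msgs[v] for v in h}

# Path graph 0 - 1 - 2 with scalar node states.
h = {0: 1.0, 1: 2.0, 2: 3.0}
edges = [(0, 1), (1, 2)]
h1 = mpnn_step(h, edges)
```

Real MPNNs use learned message and update functions over feature vectors, but the aggregation pattern is the same.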
GeoMol: Torsional Geometric Generation of Molecular 3D Conformer Ensembles
TLDR: GeoMol, an end-to-end, non-autoregressive, and SE(3)-invariant machine learning approach to generating distributions of low-energy molecular 3D conformers, predominantly outperforms popular open-source, commercial, and state-of-the-art machine learning models while achieving significant speed-ups.
SchNet: A continuous-filter convolutional neural network for modeling quantum interactions
TLDR: This work proposes continuous-filter convolutional layers to model local correlations without requiring the data to lie on a grid, and obtains a joint model for the total energy and interatomic forces that follows fundamental quantum-chemical principles.
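SchNet-style continuous filters are generated from interatomic distances, which are first expanded on a set of Gaussian radial basis functions. A minimal sketch of that expansion; the `gamma` value and the centers are illustrative choices, not the paper's exact hyperparameters:

```python
import math

def rbf_expand(distance, centers, gamma=10.0):
    """Expand a scalar interatomic distance into a smooth feature vector of
    Gaussian radial basis functions, one per center. This continuous
    featurization is what lets filters vary smoothly with distance instead
    of requiring grid-aligned inputs."""
    return [math.exp(-gamma * (distance - c) ** 2) for c in centers]

# Distance of 1.0 Angstrom expanded on five evenly spaced centers.
centers = [0.0, 0.5, 1.0, 1.5, 2.0]
features = rbf_expand(1.0, centers)
```

The basis function centered nearest the true distance responds most strongly, giving a soft one-hot encoding that a filter-generating network can consume.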
A self-attention based message passing neural network for predicting molecular lipophilicity and aqueous solubility
TLDR: A graph-neural-network framework called the self-attention-based message-passing neural network (SAMPN) is built and applied to study the relationship between chemical properties and structures in an interpretable way; it can generate chemically visible and interpretable results, which can help researchers discover new pharmaceuticals and materials.