Corpus ID: 238259534

Motif-based Graph Self-Supervised Learning for Molecular Property Prediction

@article{Zhang2021MotifbasedGS,
  title={Motif-based Graph Self-Supervised Learning for Molecular Property Prediction},
  author={Zaixin Zhang and Qi Liu and Hao Wang and Chengqiang Lu and Chee-Kong Lee},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.00987}
}
Predicting molecular properties with data-driven methods has drawn much attention in recent years. Particularly, Graph Neural Networks (GNNs) have demonstrated remarkable success in various molecular generation and prediction tasks. In cases where labeled data is scarce, GNNs can be pre-trained on unlabeled molecular data to first learn the general semantic and structural information before being finetuned for specific tasks. However, most existing self-supervised pre-training frameworks for… 
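The pretrain-then-finetune recipe the abstract describes can be sketched in miniature. The toy below is illustrative only, not the paper's method: a linear autoencoder stands in for the GNN encoder, random vectors stand in for molecular graphs, and all function names and data are made up for this sketch. Stage 1 trains the encoder on unlabeled data with a self-supervised reconstruction objective; stage 2 fits a small head on a handful of labels while the encoder stays frozen.

```python
# Illustrative sketch of self-supervised pretraining + supervised fine-tuning.
# NOT the paper's model: a linear autoencoder replaces the GNN encoder, and
# the toy data replaces molecular graphs. All names here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def recon_loss(X, W):
    """Self-supervised objective: reconstruct X from its own encoding."""
    E = X @ W @ W.T - X
    return float((E * E).mean())

def pretrain_encoder(X, dim=2, epochs=300, lr=0.01):
    """Stage 1: learn encoder W on UNLABELED data by gradient descent
    on the reconstruction loss ||X W W^T - X||^2 (no labels needed)."""
    n, d = X.shape
    W = rng.normal(scale=0.1, size=(d, dim))
    for _ in range(epochs):
        E = X @ W @ W.T - X
        grad = (2.0 / (n * d)) * (X.T @ E @ W + E.T @ X @ W)
        W -= lr * grad
    return W

def finetune_head(Z, y):
    """Stage 2: fit a linear head on frozen embeddings using scarce labels."""
    w, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return w

# Unlabeled "molecules": 100 feature vectors with 2-dimensional structure.
latent = rng.normal(size=(100, 2))
basis = rng.normal(size=(2, 6))
X = latent @ basis

W = pretrain_encoder(X)          # stage 1 uses no labels at all
y = np.sign(latent[:8, 0])       # stage 2: only 8 labeled examples
head = finetune_head(X[:8] @ W, y)
```

The point of the split mirrors the abstract: the expensive representation learning consumes only unlabeled data, so the label budget is spent entirely on the lightweight head.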


RRLFSOR: An Efficient Self-Supervised Learning Strategy of Graph Convolutional Networks
TLDR: An efficient self-supervised learning strategy for GCNs, randomly removed links with a fixed step at one region (RRLFSOR), is proposed; it consistently outperforms the baseline models and can be regarded as a new data augmenter that alleviates over-smoothing.
ProtGNN: Towards Self-Explaining Graph Neural Networks
TLDR: ProtGNN is proposed, which combines prototype learning with GNNs and provides a new perspective on the explanations of GNNs: the explanations are naturally derived from the case-based reasoning process and are actually used during classification.
Graph Self-Supervised Learning for Optoelectronic Properties of Organic Semiconductors
The search for new high-performance organic semiconducting molecules is challenging due to the vastness of the chemical space; machine learning methods, particularly deep learning models like graph…

References

SHOWING 1-10 OF 55 REFERENCES
ASGN: An Active Semi-supervised Graph Neural Network for Molecular Property Prediction
TLDR: A novel framework called Active Semi-supervised Graph Neural Network (ASGN) is proposed that incorporates both labeled and unlabeled molecules and adopts a teacher-student framework to learn a general representation that jointly exploits information from molecular structure and molecular distribution.

InfoGraph: Unsupervised and Semi-supervised Graph-Level Representation Learning via Mutual Information Maximization
TLDR: Experimental results on graph classification and molecular property prediction show that InfoGraph is superior to state-of-the-art baselines and that InfoGraph* achieves performance competitive with state-of-the-art semi-supervised models.

GPT-GNN: Generative Pre-Training of Graph Neural Networks
TLDR: The GPT-GNN framework initializes GNNs by generative pre-training, introducing a self-supervised attributed graph generation task to pre-train a GNN so that it can capture the structural and semantic properties of the graph.

Analyzing Learned Molecular Representations for Property Prediction
TLDR: A graph convolutional model is introduced that consistently matches or outperforms models using fixed molecular descriptors, as well as previous graph neural architectures, on both public and proprietary data sets.

GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training
TLDR: Graph Contrastive Coding (GCC), a self-supervised graph neural network pre-training framework, is designed to capture universal network topological properties across multiple networks, leveraging contrastive learning to empower graph neural networks to learn intrinsic and transferable structural representations.

Molecular Property Prediction: A Multilevel Quantum Interactions Modeling Perspective
TLDR: A generalizable and transferable Multilevel Graph Convolutional neural Network (MGCN) for molecular property prediction, which represents each molecule as a graph to preserve its internal structure and directly extracts features from the conformation and spatial information, followed by the multilevel interactions.

Inductive Representation Learning on Large Graphs
TLDR: GraphSAGE is presented, a general inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data; it outperforms strong baselines on three inductive node-classification benchmarks.

Hierarchical Graph Representation Learning with Differentiable Pooling
TLDR: DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.

node2vec: Scalable Feature Learning for Networks
TLDR: In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed that efficiently explores diverse neighborhoods.

How Powerful are Graph Neural Networks?
TLDR: This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.