Large-scale graph representation learning with very deep GNNs and self-supervision
@article{Addanki2021LargescaleGR, title={Large-scale graph representation learning with very deep GNNs and self-supervision}, author={Ravichandra Addanki and Peter W. Battaglia and David Budden and Andreea Deac and Jonathan Godwin and Thomas Keck and Wai Lok Sibon Li and Alvaro Sanchez-Gonzalez and Jacklynn Stott and Shantanu Thakoor and Petar Veličković}, journal={ArXiv}, year={2021}, volume={abs/2107.09422} }
Effectively and efficiently deploying graph neural networks (GNNs) at scale remains one of the most challenging aspects of graph representation learning. Many powerful solutions have only ever been validated on comparatively small datasets, often with counter-intuitive outcomes—a barrier which has been broken by the Open Graph Benchmark Large-Scale Challenge (OGB-LSC). We entered the OGB-LSC with two large-scale GNNs: a deep transductive node classifier powered by bootstrapping, and a very deep…
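The two ingredients named in the title, depth and self-supervision, can be illustrated together. The sketch below is not the authors' implementation: it stacks residual message-passing blocks (the usual way very deep GNNs stay trainable) and adds a hypothetical auxiliary head that reconstructs clean node features from noised inputs as a denoising-style self-supervised signal; all names, the depth, and the noise scale are illustrative.

```python
import torch
import torch.nn as nn

class ResidualGNNBlock(nn.Module):
    """One residual message-passing block: h <- h + MLP(LayerNorm(A_hat @ h))."""
    def __init__(self, dim):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, h, a_hat):
        return h + self.mlp(self.norm(a_hat @ h))

class DeepGNNWithDenoising(nn.Module):
    """A deep residual GNN whose auxiliary head reconstructs clean inputs."""
    def __init__(self, in_dim, dim, depth=32):
        super().__init__()
        self.encode = nn.Linear(in_dim, dim)
        self.blocks = nn.ModuleList(ResidualGNNBlock(dim) for _ in range(depth))
        self.task_head = nn.Linear(dim, 1)          # main (regression) target
        self.denoise_head = nn.Linear(dim, in_dim)  # self-supervised target

    def forward(self, x, a_hat):
        h = self.encode(x)
        for block in self.blocks:
            h = block(h, a_hat)
        return self.task_head(h), self.denoise_head(h)

# toy usage: corrupt the inputs, train on task loss + denoising loss
n, f = 8, 16
x = torch.randn(n, f)
a = (torch.rand(n, n) < 0.3).float()
a_hat = a / a.sum(-1, keepdim=True).clamp(min=1)    # row-normalised adjacency
model = DeepGNNWithDenoising(f, 64, depth=8)
y_pred, x_rec = model(x + 0.05 * torch.randn_like(x), a_hat)
loss = y_pred.pow(2).mean() + nn.functional.mse_loss(x_rec, x)  # dummy task loss
```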
13 Citations
Large-Scale Representation Learning on Graphs via Bootstrapping
- Computer Science
- 2021
Bootstrapped Graph Latents (BGRL) is introduced: a graph representation learning method that learns by predicting alternative augmentations of the input and is thus scalable by design, and it can be scaled up to extremely large graphs with hundreds of millions of nodes in the semi-supervised regime.
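A minimal sketch of the bootstrapping idea, assuming a toy one-layer encoder and feature-masking augmentations (TinyGCN, bgrl_step, and the EMA rate tau are illustrative choices, not the paper's code): an online encoder is trained to predict a slowly moving target encoder's embedding of an alternative augmentation, so no negative samples are needed.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGCN(nn.Module):
    def __init__(self, in_dim, dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, dim)

    def forward(self, x, a_hat):
        return self.lin(a_hat @ x)   # one propagation step, for illustration

def bgrl_step(online, target, predictor, x1, x2, a_hat, opt, tau=0.99):
    """One bootstrapped update: predict the target's view-2 embedding from the
    online view-1 embedding (and vice versa); the target only moves by EMA."""
    h1, h2 = online(x1, a_hat), online(x2, a_hat)
    with torch.no_grad():
        t1, t2 = target(x1, a_hat), target(x2, a_hat)
    loss = (2 - F.cosine_similarity(predictor(h1), t2, dim=-1)
              - F.cosine_similarity(predictor(h2), t1, dim=-1)).mean()
    opt.zero_grad(); loss.backward(); opt.step()
    with torch.no_grad():  # exponential moving average of the online weights
        for p_t, p_o in zip(target.parameters(), online.parameters()):
            p_t.mul_(tau).add_((1 - tau) * p_o)
    return loss.item()

# usage: two views via random feature masking (a deliberately simple augmentation)
n, f, d = 6, 8, 16
x, a_hat = torch.randn(n, f), torch.eye(n)
online, predictor = TinyGCN(f, d), nn.Linear(d, d)
target = copy.deepcopy(online)
for p in target.parameters():
    p.requires_grad_(False)
opt = torch.optim.Adam(list(online.parameters()) + list(predictor.parameters()), lr=1e-3)
mask = lambda t: t * (torch.rand_like(t) > 0.2).float()
bgrl_step(online, target, predictor, mask(x), mask(x), a_hat, opt)
```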
On Representation Knowledge Distillation for Graph Neural Networks
- Computer Science, IEEE Transactions on Neural Networks and Learning Systems
- 2022
Experiments show that G-CRD consistently boosts the performance and robustness of lightweight GNNs, outperforming LSP (and a global structure preserving (GSP) variant of LSP) as well as baselines from 2-D computer vision.
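G-CRD itself preserves global topology contrastively; the sketch below shows only the generic shape of contrastive representation distillation on node embeddings (all names are hypothetical): each student embedding is pulled toward its own teacher embedding and pushed away from the other nodes' teacher embeddings.

```python
import torch
import torch.nn.functional as F

def contrastive_distill_loss(student_h, teacher_h, temperature=0.1):
    """InfoNCE over nodes: positives are matching (student, teacher) rows,
    negatives are all other teacher rows. The teacher is frozen."""
    s = F.normalize(student_h, dim=-1)
    t = F.normalize(teacher_h, dim=-1).detach()
    logits = s @ t.T / temperature          # [n, n] similarity matrix
    labels = torch.arange(s.size(0))        # positives sit on the diagonal
    return F.cross_entropy(logits, labels)
```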
OGB-LSC: A Large-Scale Challenge for Machine Learning on Graphs
- Computer Science, NeurIPS Datasets and Benchmarks
- 2021
It is shown that expressive models significantly outperform simple scalable baselines, indicating an opportunity for dedicated efforts to further improve graph ML at scale.
Ordered Subgraph Aggregation Networks
- Computer Science, ArXiv
- 2022
It is shown that increasing subgraph size always increases the expressive power, and a better understanding of their limitations is developed by relating them to the established k-WL hierarchy.
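A naive illustration of the subgraph-aggregation idea (assuming a dense adjacency and exhaustive enumeration; real methods sample or order the subgraphs precisely because enumerating all C(n, k) of them is intractable):

```python
import itertools
import torch

def subgraph_aggregate(adj, x, k):
    """Run a toy one-step propagation on every induced k-node subgraph,
    mean-pool each, then average into a graph-level representation."""
    n = adj.size(0)
    outs = []
    for subset in itertools.combinations(range(n), k):
        idx = torch.tensor(subset)
        sub_a, sub_x = adj[idx][:, idx], x[idx]
        outs.append((sub_a @ sub_x).mean(0))   # per-subgraph pooled embedding
    return torch.stack(outs).mean(0)
```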
Graph Neural Network Training with Data Tiering
- Computer Science, ArXiv
- 2021
The data-tiering method utilizes not only the structure of the input graph but also insights gained from the actual GNN training process to achieve higher prediction performance, alongside a new data placement and access strategy that further minimizes CPU-GPU communication overhead.
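A minimal sketch of the feature-tiering idea, with hotness approximated by node degree (the paper instead derives hotness from the sampler's observed access pattern; all function names and the degree heuristic here are assumptions):

```python
import torch

def build_feature_tiers(features_cpu, degrees, gpu_budget):
    """Pin the most frequently accessed nodes' features on the GPU;
    the cold tail stays in CPU memory."""
    hot = torch.argsort(degrees, descending=True)[:gpu_budget]
    is_hot = torch.zeros(features_cpu.size(0), dtype=torch.bool)
    is_hot[hot] = True
    slot = torch.full((features_cpu.size(0),), -1, dtype=torch.long)
    slot[hot] = torch.arange(hot.numel())
    device = "cuda" if torch.cuda.is_available() else "cpu"
    return features_cpu[hot].to(device), is_hot, slot

def gather_batch(node_ids, features_cpu, gpu_cache, is_hot, slot):
    """Serve a minibatch: hot rows from the GPU cache, cold rows over PCIe."""
    hot_mask = is_hot[node_ids]
    out = torch.empty(node_ids.numel(), features_cpu.size(1), device=gpu_cache.device)
    dev_mask = hot_mask.to(out.device)
    out[dev_mask] = gpu_cache[slot[node_ids[hot_mask]].to(out.device)]
    out[~dev_mask] = features_cpu[node_ids[~hot_mask]].to(out.device)
    return out
```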
Affinity-Aware Graph Networks
- Computer Science, ArXiv
- 2022
This paper explores the use of affinity measures as features in graph neural networks, in particular measures arising from random walks, including effective resistance, hitting and commute times, and proposes message passing networks based on these features.
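Effective resistance, one of the affinity measures mentioned, has a closed form via the Laplacian pseudoinverse. A small sketch (dense and O(n^3), so illustrative only; commute time is 2|E| times this quantity):

```python
import torch

def effective_resistance(adj):
    """R[u, v] = L+[u, u] + L+[v, v] - 2 * L+[u, v], where L+ is the
    Moore-Penrose pseudoinverse of the graph Laplacian."""
    lap = torch.diag(adj.sum(-1)) - adj
    lp = torch.linalg.pinv(lap)
    d = torch.diagonal(lp)
    return d[:, None] + d[None, :] - 2 * lp

# usage: attach resistance as a feature of each existing edge
adj = torch.tensor([[0., 1., 1.],
                    [1., 0., 0.],
                    [1., 0., 0.]])
edge_feat = effective_resistance(adj)[adj.bool()]
```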
Extreme Acceleration of Graph Neural Network-based Prediction Models for Quantum Chemistry
- Computer Science, ArXiv
- 2022
It is demonstrated that such a co-design approach can reduce the training time of such molecular property prediction models from days to less than two hours, opening new possibilities for AI-driven scientific discovery.
Unified 2D and 3D Pre-Training of Molecular Representations
- Computer Science, KDD
- 2022
This work proposes a new representation learning method based on a unified 2D and 3D pre-training that achieves state-of-the-art results on 10 tasks, and the average improvement on 2D-only tasks is 8.3%.
MDGNN: Metapath-based Decoupled Graph Neural Network for MAG240M-LSC
- Computer Science
- 2022
This work proposes a metapath-based decoupled graph neural network (MDGNN) that incorporates more effective features to obtain better results, ultimately achieving award-level performance on the MAG240M track of OGB-LSC 2022.
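A sketch of what "metapath-based" and "decoupled" plausibly mean in combination (the adjacency names and the paper-author-paper metapath are assumptions modelled on MAG240M): feature propagation along a metapath is a chain of relation products, precomputed once so that training reduces to an MLP over the results.

```python
import torch

def metapath_propagate(x_src, adj_chain):
    """Propagate source-node features along a metapath by applying the chain
    of relation adjacencies right-to-left, as a one-off preprocessing step."""
    h = x_src
    for a in reversed(adj_chain):
        h = a @ h
    return h

# hypothetical paper-author-paper metapath: A_pa @ (A_ap @ x)
papers, authors, f = 5, 3, 4
a_pa = torch.rand(papers, authors)   # toy paper->author incidence
a_ap = a_pa.T.clone()                # toy author->paper incidence
x = torch.randn(papers, f)
z_pap = metapath_propagate(x, [a_pa, a_ap])
```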
Molecule3D: A Benchmark for Predicting 3D Geometries from Molecular Graphs
- Computer Science, Chemistry, ArXiv
- 2021
This work proposes to predict ground-state 3D geometries, computed with density functional theory (DFT), from molecular graphs using machine learning methods, and implements two baseline methods that predict either the pairwise distances between atoms or the atom coordinates in 3D space.
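The first baseline family (pairwise distances) can be caricatured as scoring every atom pair from node embeddings; a hedged sketch (PairwiseDistanceHead is a made-up name, and the symmetrisation is one simple choice among several):

```python
import torch
import torch.nn as nn

class PairwiseDistanceHead(nn.Module):
    """Predict a symmetric distance matrix from per-atom embeddings;
    coordinates could then be recovered downstream (e.g. by MDS)."""
    def __init__(self, dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, h):                   # h: [num_atoms, dim]
        n = h.size(0)
        pairs = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                           h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        d = self.mlp(pairs).squeeze(-1)
        return 0.5 * (d + d.T)              # enforce d(i, j) = d(j, i)
```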
References
Showing 1-10 of 65 references
OGB-LSC: A Large-Scale Challenge for Machine Learning on Graphs
- Computer Science, NeurIPS Datasets and Benchmarks
- 2021
It is shown that expressive models significantly outperform simple scalable baselines, indicating an opportunity for dedicated efforts to further improve graph ML at scale.
Bag of Tricks for Node Classification with Graph Neural Networks
- Computer Science
- 2021
This paper proposes several new tricks-of-the-trade related to label usage, loss function formulation, and model design that can significantly improve various GNN architectures and empirically evaluates their impact on final node classification accuracy.
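One of the label-usage tricks in this family feeds a random subset of the training labels back in as input features; a minimal sketch (the function name and the 50% show rate are illustrative):

```python
import torch

def label_as_input(x, y_onehot, train_mask, show_rate=0.5):
    """Concatenate a random half of the training labels to the features;
    the hidden half keeps a zero label vector and supplies the supervision,
    so the model learns to propagate labels rather than copy them."""
    shown = train_mask & (torch.rand(x.size(0)) < show_rate)
    label_feat = torch.where(shown[:, None], y_onehot, torch.zeros_like(y_onehot))
    supervise_mask = train_mask & ~shown
    return torch.cat([x, label_feat], dim=-1), supervise_mask
```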
DropEdge: Towards Deep Graph Convolutional Networks on Node Classification
- Computer Science, ICLR
- 2020
DropEdge is a general technique that can be combined with many backbone models (e.g. GCN, ResGCN, GraphSAGE, and JKNet) for enhanced performance, and it consistently improves results across a variety of both shallow and deep GCNs.
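The mechanism itself fits in a few lines; a sketch assuming the common [2, num_edges] edge-index representation:

```python
import torch

def drop_edge(edge_index, p=0.2):
    """Independently drop a fraction p of edges before each training pass,
    acting as data augmentation and easing over-smoothing in deep GCNs."""
    keep = torch.rand(edge_index.size(1)) >= p
    return edge_index[:, keep]
```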
Pitfalls of Graph Neural Network Evaluation
- Computer Science, ArXiv
- 2018
This paper performs a thorough empirical evaluation of four prominent GNN models and suggests that simpler GNN architectures are able to outperform the more sophisticated ones if the hyperparameters and the training procedure are tuned fairly for all models.
FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling
- Computer Science, ICLR
- 2018
Enhanced with importance sampling, FastGCN not only is efficient for training but also generalizes well for inference, and is orders of magnitude more efficient while predictions remain comparably accurate.
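The core of the importance-sampling estimator, sketched with a dense normalised adjacency (the squared-column-norm proposal follows the paper's variance-reduction argument; the function names are illustrative):

```python
import torch

def fastgcn_sample(a_hat, num_samples):
    """Sample nodes for one layer with probability proportional to the squared
    column norms of A_hat, returning Monte Carlo reweighting coefficients."""
    q = a_hat.pow(2).sum(0)
    q = q / q.sum()
    idx = torch.multinomial(q, num_samples, replacement=True)
    weights = 1.0 / (num_samples * q[idx])
    return idx, weights

def sampled_propagate(a_hat, h, idx, weights):
    """Unbiased estimate of A_hat @ h using only the sampled columns/rows."""
    return (a_hat[:, idx] * weights) @ h[idx]
```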
SIGN: Scalable Inception Graph Neural Networks
- Computer Science, ArXiv
- 2020
This paper proposes a new, efficient and scalable graph deep learning architecture which sidesteps the need for graph sampling by using graph convolutional filters of different size that are amenable to efficient precomputation, allowing extremely fast training and inference.
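The precomputation trick in miniature (dense adjacency for clarity): all propagation happens once up front, so training is minibatch-friendly like any MLP.

```python
import torch
import torch.nn as nn

def sign_precompute(a_hat, x, r=3):
    """Stack [X, AX, A^2 X, ..., A^r X] column-wise as fixed input features."""
    feats, h = [x], x
    for _ in range(r):
        h = a_hat @ h
        feats.append(h)
    return torch.cat(feats, dim=-1)

# the trainable model is then just an MLP over the precomputed features
n, f, r = 10, 4, 3
z = sign_precompute(torch.eye(n), torch.randn(n, f), r)
mlp = nn.Sequential(nn.Linear(f * (r + 1), 64), nn.ReLU(), nn.Linear(64, 2))
logits = mlp(z)
```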
Graph Convolutional Neural Networks for Web-Scale Recommender Systems
- Computer Science, KDD
- 2018
A novel method based on highly efficient random walks to structure the convolutions and a novel training strategy that relies on harder-and-harder training examples to improve robustness and convergence of the model are developed.
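The random-walk neighbourhood construction can be sketched as visit counting (a toy dense-adjacency version; the production system runs walks against a distributed graph store):

```python
import torch

def walk_importance(adj, start, num_walks=200, walk_len=3):
    """Estimate neighbour importance for `start` by short-random-walk visit
    counts; the top-scoring nodes become its weighted convolution neighbourhood."""
    counts = torch.zeros(adj.size(0))
    for _ in range(num_walks):
        node = start
        for _ in range(walk_len):
            nbrs = adj[node].nonzero(as_tuple=True)[0]
            if nbrs.numel() == 0:
                break
            node = nbrs[torch.randint(nbrs.numel(), (1,))].item()
            counts[node] += 1
    return counts / counts.sum().clamp(min=1)
```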
Deep Graph Contrastive Representation Learning
- Computer Science, ArXiv
- 2020
This paper proposes a novel framework for unsupervised graph representation learning by leveraging a contrastive objective at the node level, and generates two graph views by corruption and learns node representations by maximizing the agreement of node representations in these two views.
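One direction of the node-level contrastive objective, sketched below (the symmetric term just swaps the two views; names are illustrative):

```python
import torch
import torch.nn.functional as F

def node_contrastive_loss(z1, z2, temperature=0.5):
    """Each node in view 1 should match itself in view 2, against all other
    nodes in both views as negatives (NT-Xent style, one direction)."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    n = z1.size(0)
    between = z1 @ z2.T / temperature          # cross-view similarities
    within = z1 @ z1.T / temperature           # intra-view similarities
    pos = between.diag()
    off_diag = ~torch.eye(n, dtype=torch.bool)
    denom = torch.cat([between, within[off_diag].view(n, n - 1)], dim=1)
    return -(pos - torch.logsumexp(denom, dim=1)).mean()
```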
Bootstrapped Representation Learning on Graphs
- Computer Science, ArXiv
- 2021
This work presents Bootstrapped Graph Latents (BGRL), a self-supervised graph representation method that outperforms or matches the previous unsupervised state-of-the-art results on several established benchmark datasets and enables the effective usage of graph attentional (GAT) encoders, allowing the state of the art to be improved further.
GraphSAINT: Graph Sampling Based Inductive Learning Method
- Computer Science, ICLR
- 2020
GraphSAINT is proposed: a graph-sampling-based inductive learning method that improves training efficiency in a fundamentally different way by decoupling the sampling process from the forward and backward propagation of training; it is further extended with other graph samplers and GCN variants.
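The decoupling is visible in a sketch of the node sampler (the degree-squared proposal and the omission of the bias-correction coefficients are simplifications):

```python
import torch

def saint_node_sample(adj, batch_size):
    """Sample a node-induced subgraph for one training step; a full GCN then
    runs on the subgraph, with normalisation terms correcting sampling bias."""
    deg = adj.sum(-1)
    p = deg.pow(2) / deg.pow(2).sum()
    nodes = torch.multinomial(p, batch_size, replacement=False)
    return nodes, adj[nodes][:, nodes]   # induced sub-adjacency
```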