Corpus ID: 232320529

DIG: A Turnkey Library for Diving into Graph Deep Learning Research

@article{Liu2021DIGAT,
  title={DIG: A Turnkey Library for Diving into Graph Deep Learning Research},
  author={Meng Liu and Youzhi Luo and Limei Wang and Yaochen Xie and Hao Yuan and Shurui Gui and Zhao Xu and Haiyang Yu and Jingtun Zhang and Yi Liu and Keqiang Yan and Bora Oztekin and Haoran Liu and Xuan Zhang and Cong Fu and Shuiwang Ji},
  journal={J. Mach. Learn. Res.},
  year={2021},
  volume={22},
  pages={240:1-240:9}
}
Although there exist several libraries for deep learning on graphs, they aim at implementing basic operations for graph deep learning. In the research community, implementing and benchmarking various advanced tasks is still painful and time-consuming with existing libraries. To facilitate graph deep learning research, we introduce DIG: Dive into Graphs, a turnkey library that provides a unified testbed for higher-level, research-oriented graph deep learning tasks. Currently, we consider…
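To make the gap concrete, the sketch below assembles a basic graph classifier from the low-level building blocks that existing libraries such as PyTorch Geometric provide (see the first reference below); the higher-level tasks DIG targets, such as graph generation, self-supervised learning, explainability, and 3D graphs, need considerably more scaffolding (task-specific data pipelines, evaluation metrics, and baselines) on top of this. It is a minimal sketch under assumed choices: the MUTAG benchmark from TUDataset and the hyperparameters are illustrative and not taken from the DIG paper.

import torch
import torch.nn.functional as F
from torch_geometric.datasets import TUDataset
from torch_geometric.loader import DataLoader
from torch_geometric.nn import GCNConv, global_mean_pool

# Assumed example dataset (not from the DIG paper): a small graph classification benchmark.
dataset = TUDataset(root="data/TUDataset", name="MUTAG")
loader = DataLoader(dataset, batch_size=32, shuffle=True)

class GCN(torch.nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.conv1 = GCNConv(dataset.num_node_features, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.lin = torch.nn.Linear(hidden, dataset.num_classes)

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))   # message passing, layer 1
        x = F.relu(self.conv2(x, edge_index))   # message passing, layer 2
        x = global_mean_pool(x, batch)          # graph-level readout
        return self.lin(x)                      # per-graph class logits

model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for data in loader:                             # one training pass over the batches
    optimizer.zero_grad()
    out = model(data.x, data.edge_index, data.batch)
    loss = F.cross_entropy(out, data.y)
    loss.backward()
    optimizer.step()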

Citations

PyTorch Geometric Signed Directed: A Survey and Software on Graph Neural Networks for Signed and Directed Graphs
TLDR
This paper presents PyTorch Geometric Signed Directed, a survey and software library on graph neural networks (GNNs) for signed and directed networks. The accompanying deep learning framework consists of easy-to-use GNN models, synthetic and real-world data, as well as task-specific evaluation metrics and loss functions for signed and directed networks.
Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations
TLDR
A novel score-based generative model for graphs with a continuous-time framework is proposed that is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule, demonstrating the effectiveness of the system of SDEs in modeling the node-edge relationships.
Explaining Graph-level Predictions with Communication Structure-Aware Cooperative Games
TLDR
This work revisits the appropriateness of the Shapley value for graph explanation and proposes a Graph Structure-aware eXplanation (GStarX) method, which produces qualitatively more intuitive explanations, and quantitatively improves over strong baselines on chemical graph property prediction and text graph sentiment classification.
ComENet: Towards Complete and Efficient Message Passing for 3D Molecular Graphs
Many real-world data can be modeled as 3D graphs, but learning representations that incorporate 3D information completely and efficiently is challenging. Existing methods either use partial 3D information…
GOOD: A Graph Out-of-Distribution Benchmark
Out-of-distribution (OOD) learning deals with scenarios in which training and test data follow different distributions. Although general OOD problems have been intensively studied in machine learning…
GraphFM: Improving Large-Scale GNN Training via Feature Momentum
TLDR
This work proposes a new technique, named feature momentum (FM), that uses a momentum step to incorporate historical embeddings when updating feature representations, and develops two specific algorithms, known as GraphFM-IB and GraphFM-OB, that consider in-batch and out-of-batch data, respectively.
DEGREE: Decomposition Based Explanation for Graph Neural Networks
TLDR
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction and designs a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
Faithful Explanations for Deep Graph Models
TLDR
This paper provides a new and general method for formally characterizing the faithfulness of explanations for GNNs and introduces k-hop Explanation with a Convolutional Core (KEC), a new explanation method that provably maximizes faithfulness to the original GNN by leveraging information about the graph structure in its adjacency matrix and its k-th power.
Generating 3D Molecules for Target Protein Binding
TLDR
This work proposes a novel and effective framework, known as GraphBP, to generate 3D molecules that bind to given proteins by placing atoms of specific types and locations to the given binding site one by one.
Task-Agnostic Graph Explanations
TLDR
A Task-Agnostic GNN Explainer (TAGE) trained under self-supervision with no knowledge of downstream tasks enables the explanation of GNN embedding models without downstream tasks and allows efficient explanation of multitask models.

References

Showing 1-10 of 59 references
Fast Graph Representation Learning with PyTorch Geometric
TLDR
PyTorch Geometric is introduced, a library for deep learning on irregularly structured input data such as graphs, point clouds and manifolds, built upon PyTorch, and a comprehensive comparative study of the implemented methods in homogeneous evaluation scenarios is performed.
Relational inductive biases, deep learning, and graph networks
TLDR
It is argued that combinatorial generalization must be a top priority for AI to achieve human-like abilities, and that structured representations and computations are key to realizing this objective.
Quantum chemistry structures and properties of 134 kilo molecules
TLDR
This data set provides quantum chemical properties for a relevant, consistent, and comprehensive chemical space of small organic molecules that may serve the benchmarking of existing methods, development of new methods, such as hybrid quantum mechanics/machine learning, and systematic identification of structure-property relationships.
Spherical Message Passing for 3D Graph Networks
TLDR
This work proposes spherical message passing (SMP) as a novel and specific scheme for realizing the 3DGN framework in the spherical coordinate system (SCS), derives physically-based representations of geometric information, and proposes SphereNet for learning representations of 3D graphs.
GraphEBM: Molecular Graph Generation with Energy-Based Models
TLDR
GraphEBM is proposed to generate molecular graphs using energy-based models; it parameterizes the energy function in a permutation-invariant manner, making GraphEBM permutation invariant, and generates graphs via Langevin dynamics.
CogDL: An Extensive Toolkit for Deep Learning on Graphs
TLDR
CogDL, an extensive research toolkit for deep learning on graphs that allows researchers and developers to easily conduct experiments and build applications, is introduced, and its effectiveness is demonstrated on real-world applications in AMiner, a large academic database and system.
Self-Supervised Learning of Graph Neural Networks: A Unified Review
TLDR
A unified review of different ways of training GNNs with SSL methods, categorized into contrastive and predictive models, is provided; it sheds light on the similarities and differences of various methods, setting the stage for developing new methods and algorithms.
GraphGallery: A Platform for Fast Benchmarking and Easy Development of Graph Neural Networks Based Intelligent Software
TLDR
GraphGallery is an easy-to-use platform that allows developers to automatically deploy GNNs even with less domain-specific knowledge and offers a set of implementations of common GNN models based on mainstream deep learning frameworks.
On Explainability of Graph Neural Networks via Subgraph Explorations
TLDR
This work represents the first attempt to explain GNNs by explicitly identifying subgraphs; it proposes to use Shapley values as a measure of subgraph importance and introduces techniques to make the tree search more effective and expedite computations.
GraphDF: A Discrete Flow Model for Molecular Graph Generation
TLDR
This work proposes GraphDF, a novel discrete latent variable model for molecular graph generation based on normalizing flow methods that uses invertible modulo shift transforms to map discrete latent variables to graph nodes and edges.