DiffMG: Differentiable Meta Graph Search for Heterogeneous Graph Neural Networks

  • Yuhui Ding, Quanming Yao, Huan Zhao, Tong Zhang
  • Published 7 October 2020
  • Computer Science
  • Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining
In this paper, we propose a novel framework to automatically utilize task-dependent semantic information which is encoded in heterogeneous information networks (HINs). Specifically, we search for a meta graph, which can capture more complex semantic relations than a meta path, to determine how graph neural networks (GNNs) propagate messages along different types of edges. We formalize the problem within the framework of neural architecture search (NAS) and then perform the search in a… 
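As a rough, made-up illustration of the idea (not the authors' implementation), the discrete choice among edge types can be relaxed into a softmax-weighted mixture of per-type propagations, so that the choice becomes differentiable; the toy graph, edge-type names, and parameter values below are all hypothetical:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy HIN with 4 nodes and two candidate edge types, each given as an
# adjacency matrix (the type names are illustrative only).
A = {
    "writes": np.array([[0, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 1], [0, 0, 0, 0]], float),
    "cites":  np.array([[0, 0, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0], [1, 0, 0, 0]], float),
}
X = np.eye(4)  # one-hot node features

# One architecture parameter per candidate edge type; the softmax weights
# mix the per-type message-passing results, relaxing the discrete choice.
alpha = np.array([0.5, -0.2])
w = softmax(alpha)
H = sum(wi * A[t] @ X for wi, t in zip(w, A))  # mixed propagation step

# After search, the edge type with the largest weight would be kept.
chosen = list(A)[int(np.argmax(w))]
```

In the actual method the architecture weights are trained jointly with the GNN parameters, and a discrete meta graph is derived from them once the search ends.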

Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph

Heterogeneous graph learning has drawn significant attention in recent years, due to the success of graph neural networks (GNNs) and the broad applications of heterogeneous information networks.


  • Tianyu Zhao, Cheng Yang, Chuan Shi
  • Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval
  • 2022

NAS-Bench-Graph: Benchmarking Graph Neural Architecture Search

This work constructs a unified, expressive yet compact search space covering 26,206 unique graph neural network (GNN) architectures, and proposes a principled evaluation protocol that enables fair, fully reproducible, and efficient comparisons for graph NAS.

Bridging the Gap of AutoGraph Between Academia and Industry: Analyzing AutoGraph Challenge at KDD Cup 2020

This work quantifies the gaps between academia and industry in modeling scope, effectiveness, and efficiency, showing that academic AutoML-for-graph solutions focus on GNN architecture search, while industrial solutions, especially the winning ones in the KDD Cup, tend to build an overall pipeline in which neural architecture search is only one component.

Automated Machine Learning for Deep Recommender Systems: A Survey

An overview of AutoML for DRS models and the related techniques is provided, including the state-of-the-art AutoML approaches that automate the feature selection, feature embeddings, feature interactions, and system design in DRS.

Space4HGNN: A Novel, Modularized and Reproducible Platform to Evaluate Heterogeneous Graph Neural Network

A unified framework covering most HGNNs is proposed, consisting of three components: a heterogeneous linear transformation, a heterogeneous graph transformation, and a heterogeneous message passing layer; on top of it, a platform, Space4HGNN, is built, which offers modularized components, reproducible implementations, and standardized evaluation for HGNNs.

Automated Graph Machine Learning: Approaches, Libraries and Directions

This paper extensively discusses automated graph machine learning approaches, covering hyper-parameter optimization (HPO) and neural architecture search (NAS) for graph machine learning, in the first systematic and comprehensive treatment of approaches, libraries, and directions for automated graph machine learning.

Designing the Topology of Graph Neural Networks: A Novel Feature Fusion Perspective

F2GNN improves model capacity while alleviating the deficiencies of existing GNN topology design practices by adaptively utilizing different levels of features; it develops a neural architecture search method with a set of selection and fusion operations in the search space and an improved differentiable search algorithm.

Profiling the Design Space for Graph Neural Networks based Collaborative Filtering

This work makes the first attempt to profile the design space of GNN-based CF methods to enrich the understanding of different design dimensions as well as provide a novel paradigm of model design.

Knowledge Graph Reasoning with Relational Digraph

This paper introduces a novel relational structure, i.e., relational directed graph (r-digraph), which is composed of overlapped relational paths, to capture the KG’s local evidence and demonstrates that RED-GNN is not only efficient but also can achieve significant performance gains in both inductive and transductive reasoning tasks over existing methods.

DARTS: Differentiable Architecture Search

The proposed algorithm excels in discovering high-performance convolutional architectures for image classification and recurrent architectures for language modeling, while being orders of magnitude faster than state-of-the-art non-differentiable techniques.
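The continuous relaxation at the heart of DARTS can be sketched in a few lines (a toy stand-in, not the paper's convolutional or recurrent cells): each edge of the cell holds one learnable parameter per candidate operation, and the edge output is the softmax-weighted mixture of all operations, which makes the architecture choice differentiable.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Candidate operations on a single edge (toy stand-ins for conv/pool/skip).
ops = [lambda x: x,                 # identity / skip connection
       lambda x: 2.0 * x,           # a toy "transform" op
       lambda x: np.zeros_like(x)]  # the zero op (no connection)

alpha = np.array([0.1, 1.5, -2.0])  # architecture parameters, learned by gradient descent
x = np.array([1.0, -1.0])           # input feature on this edge

w = softmax(alpha)
mixed = sum(wi * op(x) for wi, op in zip(w, ops))  # relaxed (mixed) edge output

# After search, only the operation with the largest weight is retained.
best = int(np.argmax(w))
```

The bilevel optimization in the paper alternates between updating the network weights on training data and updating `alpha` on validation data; this sketch only shows the mixed-operation forward pass.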

Design Space for Graph Neural Networks

This work defines and systematically studies the architectural design space for GNNs, which consists of 315,000 different designs over 32 different predictive tasks, and offers a principled and scalable approach to transition from studying individual GNN designs for specific tasks to systematically studying the GNN design space and the task space.

Genetic Meta-Structure Search for Recommendation on Heterogeneous Information Network

Genetic Meta-Structure Search (GEMS) is proposed to automatically optimize meta-structure designs for recommendation on HINs; an in-depth analysis of the identified meta-structures sheds light on HIN-based recommender system design.

Graph Neural Architecture Search

Experiments on real-world datasets demonstrate that GraphNAS can design novel network architectures that rival the best human-invented architectures in validation-set accuracy; in a transfer learning task, graph neural architectures designed by GraphNAS still gain improvements in prediction accuracy when transferred to new datasets.

Policy-GNN: Aggregation Optimization for Graph Neural Networks

Policy-GNN is proposed, a meta-policy framework that models the sampling procedure and message passing of GNNs as a combined learning process; it significantly outperforms state-of-the-art alternatives, showing the promise of aggregation optimization for GNNs.

Heterogeneous Graph Transformer

The proposed HGT model consistently outperforms all state-of-the-art GNN baselines by 9–21% on various downstream tasks; the paper also introduces a heterogeneous mini-batch graph sampling algorithm, HGSampling, for efficient and scalable training.

Heterogeneous Graph Neural Network

HetGNN, a heterogeneous graph neural network model, is proposed that outperforms state-of-the-art baselines in various graph mining tasks, i.e., link prediction, recommendation, node classification/clustering, and inductive node classification/clustering.

Heterogeneous Graph Attention Network

Extensive experimental results on three real-world heterogeneous graphs not only show the superior performance of the proposed model over the state-of-the-arts, but also demonstrate its potentially good interpretability for graph analysis.

Efficient Neural Architecture Search via Proximal Iterations

This work reformulates the search process as an optimization problem with a discrete constraint on architectures and a regularizer on model complexity, and proposes an efficient algorithm inspired by proximal iterations; it is not only much faster than existing differentiable search methods, but also finds better architectures and balances model complexity.
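A minimal sketch of that proximal step (hypothetical code; the actual algorithm alternates gradient updates on continuous architecture parameters with this projection onto the discrete one-hot constraint set):

```python
import numpy as np

def proximal_step(alpha):
    """Project continuous architecture parameters onto the discrete
    constraint set: exactly one candidate operation selected."""
    onehot = np.zeros_like(alpha)
    onehot[int(np.argmax(alpha))] = 1.0
    return onehot

alpha = np.array([0.2, 0.9, -0.1])  # continuous parameters after a gradient update
discrete = proximal_step(alpha)     # one-hot architecture for the next forward pass
```

Because the forward pass always uses a discrete architecture, only the selected operation needs to be evaluated, which is one source of the reported speedup.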

MAGNN: Metapath Aggregated Graph Neural Network for Heterogeneous Graph Embedding

This work proposes a new model named Metapath Aggregated Graph Neural Network (MAGNN), which achieves more accurate predictions than state-of-the-art baselines; it employs three major components: node content transformation to encapsulate input node attributes, intra-metapath aggregation to incorporate intermediate semantic nodes, and inter-metapath aggregation to combine messages from multiple metapaths.
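The intra-metapath aggregation step can be illustrated with a toy mean-pooling encoder (MAGNN itself uses a relational rotation encoder plus attention; the node features and metapath instances below are invented for illustration):

```python
import numpy as np

# Toy node features and two Author-Paper-Author metapath instances that
# both end at target node 2 (all values are illustrative).
features = {0: np.array([1.0, 0.0]), 1: np.array([0.0, 1.0]),
            2: np.array([1.0, 1.0]), 3: np.array([0.5, 0.5])}
instances = [(0, 1, 2), (0, 3, 2)]

# Intra-metapath aggregation: encode each instance (here, a simple mean of
# its node features), then pool the instance encodings for the target node.
encoded = [np.mean([features[n] for n in inst], axis=0) for inst in instances]
h_target = np.mean(encoded, axis=0)
```

The key point the summary makes is that the whole instance, including intermediate nodes (1 and 3 here), contributes to the target embedding, rather than only the metapath endpoints.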