Rethinking Graph Neural Architecture Search from Message-passing

Shaofei Cai, Liang Li, Jincan Deng, Beichen Zhang, Zhengjun Zha, Li Su, Qingming Huang

2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Graph neural networks (GNNs) have recently emerged as a standard toolkit for learning from data on graphs. Current GNN design efforts depend on immense human expertise to explore different message-passing mechanisms, and require manual enumeration to determine the proper message-passing depth. Inspired by the strong searching capability of neural architecture search (NAS) for CNNs, this paper proposes Graph Neural Architecture Search (GNAS) with a novel search space. GNAS can automatically…
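The message-passing mechanisms the abstract refers to all share one skeleton: each node aggregates its neighbors' features, then combines the result with its own state. A minimal sketch in plain Python (the `propagate` function, the mean/average update rule, and the toy path graph are illustrative, not the paper's method):

```python
def propagate(adj, feats):
    """One round of mean-aggregation message passing.

    adj:   dict mapping node -> list of neighbor nodes
    feats: dict mapping node -> scalar feature
    Each node's new feature is the average of its own feature and
    the mean of its neighbors' features.
    """
    new_feats = {}
    for node, nbrs in adj.items():
        if nbrs:
            nbr_mean = sum(feats[n] for n in nbrs) / len(nbrs)
        else:
            nbr_mean = feats[node]  # isolated node: no messages arrive
        new_feats[node] = 0.5 * (feats[node] + nbr_mean)
    return new_feats

# Path graph 0 - 1 - 2; one propagation step smooths the features.
adj = {0: [1], 1: [0, 2], 2: [1]}
feats = {0: 0.0, 1: 1.0, 2: 2.0}
feats = propagate(adj, feats)
print(feats)  # {0: 0.5, 1: 1.0, 2: 1.5}
```

The message-passing depth the paper searches over corresponds to how many times `propagate` is applied in sequence.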


GNN-EA: Graph Neural Network with Evolutionary Algorithm

This paper designs two novel crossover operators at different granularity levels, GNNCross and LayerCross, and presents an evolutionary graph neural network architecture search strategy, involving inheritance, crossover and mutation operators based on fine-grained atomic operations.
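A layer-level crossover over architecture encodings can be sketched as a single-point swap between two parents. The encoding (a list of layer types) and the operation names below are made up for illustration; they are not the paper's exact GNNCross/LayerCross operators:

```python
import random

def layer_crossover(parent_a, parent_b, rng):
    """Swap the layer suffixes of two architectures at a random cut point."""
    cut = rng.randint(1, min(len(parent_a), len(parent_b)) - 1)
    child_a = parent_a[:cut] + parent_b[cut:]
    child_b = parent_b[:cut] + parent_a[cut:]
    return child_a, child_b

rng = random.Random(42)
arch_a = ["gcn", "gat", "sage", "gin"]  # one layer type per depth (toy encoding)
arch_b = ["gin", "gin", "gat", "gcn"]
child_a, child_b = layer_crossover(arch_a, arch_b, rng)
print(child_a, child_b)
```

Whatever the cut point, the two children jointly preserve the parents' layer multiset, which is what makes crossover a recombination rather than a mutation.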

Automated Graph Machine Learning: Approaches, Libraries and Directions

This paper extensively discusses automated graph machine learning approaches, covering hyper-parameter optimization (HPO) and neural architecture search (NAS) for graph machine learning, in the first systematic and comprehensive discussion of approaches, libraries, and directions for automated graph machine learning.

Automatic Relation-aware Graph Network Proliferation

This work proposes Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs with a relation-guided message passing mechanism and designs a network proliferation search paradigm to progressively determine the GNN architectures by iteratively performing network division and differentiation.

GRATIS: Deep Learning Graph Representation with Task-specific Topology and Multi-dimensional Edge Features

To learn each edge's presence and multi-dimensional feature, the GRATIS framework takes both the corresponding vertex pair and their global contextual information into consideration, enabling the generated graph representation to have a globally optimal message-passing mechanism for different downstream tasks.

Efficient Automatic Machine Learning via Design Graphs

FALCON is proposed, an efficient sample-based method to search for the optimal model design over a design graph, where nodes represent design choices and edges denote design similarities.

Explore Contextual Information for 3D Scene Graph Generation

This paper proposes a framework fully exploring contextual information for the 3D SGG task, which attempts to satisfy the requirements of fine-grained entity class, multiple relation labels, and high accuracy simultaneously.

A Graph Architecture Search Method Based On Grouped Operations

This work proposes a graph architecture search method to decrease the instability that arises with a large number of candidate operations. Following SANE (Search to Aggregate NEighborhood), it focuses on searching to aggregate neighborhoods, but divides the candidate operations into groups.

Space Construction

Compared with existing state-of-the-art methods, the proposed HTAS and HGNAS models can discover more efficient neural architectures for different target hardware on a variety of datasets, which validates the effectiveness of the proposed methods.

Graph Neural Architecture Search Under Distribution Shifts

This work designs a self-supervised disentangled graph encoder to characterize invariant factors hidden in diverse graph structures and proposes a prototype based architecture self-customization strategy to generate the most suitable GNN architecture weights in a continuous space for each graph instance.



Graph Neural Architecture Search

Experiments on real-world datasets demonstrate that GraphNAS can design a novel network architecture that rivals the best human-invented architectures in terms of validation set accuracy. In a transfer learning task, graph neural architectures designed by GraphNAS, when transferred to new datasets, still gain improvements in terms of prediction accuracy.

Auto-GNN: Neural architecture search of graph neural networks

An automated graph neural network (AGNN) framework is proposed, which aims to find the optimal GNN architecture efficiently; it achieves the best performance and search efficiency compared with existing human-invented models and traditional search methods.

How Powerful are Graph Neural Networks?

This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
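The expressiveness gap this paper studies shows up already in a toy case: a mean aggregator maps two different neighbor multisets to the same value, while a sum aggregator (the injective choice the paper motivates) separates them. The feature values below are arbitrary:

```python
# Two nodes with different neighbor multisets (toy scalar features).
nbrs_a = [1.0, 1.0, 2.0, 2.0]
nbrs_b = [1.0, 2.0]

mean_a = sum(nbrs_a) / len(nbrs_a)
mean_b = sum(nbrs_b) / len(nbrs_b)
sum_a, sum_b = sum(nbrs_a), sum(nbrs_b)

print(mean_a == mean_b)  # True: mean collapses the two multisets
print(sum_a == sum_b)    # False: sum tells them apart
```

Mean and max aggregators discard multiplicity information about the neighbor multiset; sum aggregation preserves it, which is the intuition behind the provably most expressive architecture.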

Neural Architecture Search in Graph Neural Networks

Two NAS methods for optimizing GNN are compared: one based on reinforcement learning and a second based on evolutionary algorithms, showing that both methods obtain similar accuracies to a random search, raising the question of how many of the search space dimensions are actually relevant to the problem.

A Comprehensive Survey on Graph Neural Networks

This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides the state-of-the-art GNNs into four categories, namely recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.

Residual Gated Graph ConvNets

This work reviews existing graph RNN and ConvNet architectures, proposes natural extensions of LSTMs and ConvNets to graphs of arbitrary size, and designs a set of analytically controlled experiments on two basic graph problems to test the different architectures.

Benchmarking Graph Neural Networks

A reproducible GNN benchmarking framework is introduced, with the facility for researchers to add new models conveniently for arbitrary datasets, along with a principled investigation into the recent Weisfeiler-Lehman GNNs (WL-GNNs) compared to message-passing-based graph convolutional networks (GCNs).

Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
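The "masked" in masked self-attention means the softmax is restricted to each node's neighborhood. A toy sketch (the raw scores below are hard-coded; in an actual GAT they come from a learned function of the two nodes' features):

```python
import math

def masked_softmax(scores, neighbors):
    """Softmax over only the neighbor entries ('masked' attention)."""
    exps = {j: math.exp(scores[j]) for j in neighbors}
    total = sum(exps.values())
    return {j: e / total for j, e in exps.items()}

# Raw attention scores from one node to all four nodes in a toy graph.
scores = {0: 2.0, 1: 0.0, 2: 1.0, 3: 5.0}
alpha = masked_softmax(scores, neighbors=[0, 2])  # attend only to nodes 0 and 2
print(alpha)
```

Note that node 3 has the largest raw score but receives zero weight: the mask keeps attention local to the graph structure, which is what distinguishes GAT layers from ordinary self-attention.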

The Graph Neural Network Model

A new neural network model, called the graph neural network (GNN) model, extends existing neural network methods for processing data represented in graph domains, and implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space.

DropEdge: Towards Deep Graph Convolutional Networks on Node Classification

DropEdge is a general technique that can be combined with many other backbone models (e.g., GCN, ResGCN, GraphSAGE, and JKNet) for enhanced performance, and it consistently improves performance on a variety of both shallow and deep GCNs.
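The core of DropEdge is easy to state: before each training epoch, drop a random fraction of edges from the input graph, which acts like data augmentation and slows over-smoothing in deep GCNs. A minimal sketch (the toy graph, drop rate, and function name are illustrative):

```python
import random

def drop_edge(edges, drop_rate, rng):
    """Keep each edge independently with probability 1 - drop_rate."""
    return [e for e in edges if rng.random() >= drop_rate]

rng = random.Random(0)  # seeded for reproducibility of the example
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
kept = drop_edge(edges, drop_rate=0.4, rng=rng)
print(kept)
```

In practice the sampling is redone every epoch, so each forward pass sees a different sparsified adjacency; at evaluation time the full edge set is used.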