Automated Machine Learning on Graphs: A Survey

  Ziwei Zhang, Xin Wang, and Wenwu Zhu. International Joint Conference on Artificial Intelligence.
Machine learning on graphs has been extensively studied in both academia and industry. However, as the literature on graph learning booms with a vast number of emerging methods and techniques, it becomes increasingly difficult to manually design the optimal machine learning algorithm for different graph-related tasks. To address this critical challenge, automated machine learning (AutoML) on graphs, which combines the strengths of graph machine learning and AutoML, is gaining attention… 


Enhancing Intra-class Information Extraction for Heterophilous Graphs: One Neural Architecture Search Approach

A unified framework is proposed based on the literature, in which intra-class information from a node itself and its neighbors is extracted by seven carefully designed blocks, and neural architecture search provides an architecture predictor that designs a GNN for each node.

Search to Pass Messages for Temporal Knowledge Graph Completion

This work proposes to use neural architecture search (NAS) to design data-specific message-passing architectures for TKG completion and develops a generalized framework to explore topological and temporal information in TKGs.

Efficient Automatic Machine Learning via Design Graphs

FALCON is proposed, a sample-efficient method that searches for the optimal model design over a design graph, where nodes represent design choices and edges denote design similarities.

Neural Architecture Search for Transformers: A Survey

An in-depth literature review of approximately 50 state-of-the-art Neural Architecture Search methods is provided, targeting the Transformer model and its family of architectures such as Bidirectional Encoder Representations from Transformers (BERT) and Vision Transformers.

A Graph Architecture Search Method Based On Grouped Operations

Following SANE (Search to Aggregate NEighborhood), this work proposes a graph architecture search method that reduces the instability caused by a large number of candidate operations: like SANE, it searches for how to aggregate neighborhoods, but it divides the candidate operations into groups.

Space Construction

Compared with existing state-of-the-art literature, the proposed HTAS and HGNAS models can discover more efficient neural architectures for different target hardware on a variety of datasets, which validates the effectiveness of the proposed methods.

Graph Neural Networks with Node-wise Architecture

A framework wherein parametric controllers decide the GNN architecture for each node based on its local patterns; it significantly outperforms state-of-the-art methods on five of the ten real-world datasets and confirms that node-wise architectures can help GNNs become versatile models.

PSP: Progressive Space Pruning for Efficient Graph Neural Architecture Search

  • Guanghui Zhu, Wenjie Wang, Y. Huang
  • Computer Science
    2022 IEEE 38th International Conference on Data Engineering (ICDE)
  • 2022
This paper proposes a novel and effective graph neural architecture search method called PSP from the perspective of search space design, and experiments reveal that PSP outperforms state-of-the-art handcrafted architectures and existing NAS methods in terms of both effectiveness and efficiency.

Graph Neural Architecture Search Under Distribution Shifts

This work designs a self-supervised disentangled graph encoder to characterize invariant factors hidden in diverse graph structures and proposes a prototype based architecture self-customization strategy to generate the most suitable GNN architecture weights in a continuous space for each graph instance.

Autonomous Graph Mining Algorithm Search with Best Speed/Accuracy Trade-off

This work proposes AutoGM, an automated system for graph mining algorithm development that integrates various message-passing-based graph algorithms, ranging from conventional algorithms like PageRank to graph neural networks, in a unified framework called UnifiedGM.
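The unifying observation above, that algorithms as different as PageRank and GNNs fit one message-passing template, can be made concrete with a short sketch. This is an illustration of the general pattern, not AutoGM's API; the helper names are assumptions.

```python
# Generic propagation: h_v <- update(h_v, aggregate(messages from in-neighbors)).
# Graphs are plain adjacency dicts {node: [out-neighbors]}.

def message_passing(adj, h, num_iters, message, aggregate, update):
    """Run `num_iters` rounds of a generic message-passing scheme."""
    for _ in range(num_iters):
        new_h = {}
        for v in adj:
            msgs = [message(h[u], u, v) for u in adj if v in adj[u]]  # in-neighbors of v
            new_h[v] = update(h[v], aggregate(msgs))
        h = new_h
    return h

def pagerank(adj, num_iters=50, d=0.85):
    """PageRank expressed as one instance of the template above."""
    n = len(adj)
    h = {v: 1.0 / n for v in adj}
    return message_passing(
        adj, h, num_iters,
        message=lambda hu, u, v: hu / max(len(adj[u]), 1),  # spread rank over out-edges
        aggregate=sum,
        update=lambda hv, agg: (1 - d) / n + d * agg,
    )
```

Swapping `message`, `aggregate`, and `update` for learned functions turns the same loop into a GNN layer, which is exactly the design space such a unified framework can search over.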

Graph Neural Architecture Search

Experiments on real-world datasets demonstrate that GraphNAS can design novel network architectures that rival the best human-invented architectures in terms of validation-set accuracy; in a transfer-learning task, architectures designed by GraphNAS still gain improvements in prediction accuracy when transferred to new datasets.

AutoGL: A Library for Automated Graph Learning

This work proposes an automated machine learning pipeline for graph data containing four modules: auto feature engineering, model training, hyper-parameter optimization, and auto ensemble; it provides numerous state-of-the-art methods along with flexible base classes and APIs that allow easy customization.

AutoNE: Hyperparameter Optimization for Massive Network Embedding

This paper proposes a novel framework, named AutoNE, to automatically optimize the hyperparameters of an NE algorithm on massive networks: it employs a multi-start random walk strategy to sample several small sub-networks, performs each trial of configuration selection on a sampled sub-network, and designs a meta-learner to transfer knowledge about optimal hyperparameters from the sub-networks to the original massive network.
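The multi-start random-walk sampling step described above is simple enough to sketch directly. The code below is an illustration of that sampling idea on adjacency-dict graphs, not AutoNE's implementation; function and parameter names are assumptions.

```python
import random

# Sample small sub-networks via random walks from several start nodes,
# so hyperparameter trials can run cheaply on each induced subgraph.

def random_walk_subgraph(adj, start, walk_len, rng):
    """Induced subgraph on the nodes visited by one walk from `start`."""
    visited = {start}
    node = start
    for _ in range(walk_len):
        nbrs = adj.get(node, [])
        if not nbrs:
            break  # dead end: stop the walk
        node = rng.choice(nbrs)
        visited.add(node)
    # keep only edges whose both endpoints were visited
    return {v: [u for u in adj[v] if u in visited] for v in visited}

def multi_start_samples(adj, num_starts=3, walk_len=10, seed=0):
    """Several independent sub-network samples from random start nodes."""
    rng = random.Random(seed)
    starts = rng.sample(list(adj), k=min(num_starts, len(adj)))
    return [random_walk_subgraph(adj, s, walk_len, rng) for s in starts]
```

Each returned sub-network is self-contained, so a configuration trial on it never touches the full massive graph.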

Open Graph Benchmark: Datasets for Machine Learning on Graphs

The OGB datasets are large-scale, encompass multiple important graph ML tasks, and cover a diverse range of domains, ranging from social and information networks to biological networks, molecular graphs, source code ASTs, and knowledge graphs, indicating fruitful opportunities for future research.

Auto-GNN: Neural architecture search of graph neural networks

An automated graph neural network (AGNN) framework is proposed, which aims to find the optimal GNN architecture efficiently; it achieves the best performance and search efficiency compared with existing human-invented models and traditional search methods.

Fast Graph Representation Learning with PyTorch Geometric

PyTorch Geometric is introduced, a library for deep learning on irregularly structured input data such as graphs, point clouds and manifolds, built upon PyTorch, and a comprehensive comparative study of the implemented methods in homogeneous evaluation scenarios is performed.

Explainable Automated Graph Representation Learning with Hyperparameter Importance

An explainable AutoML approach for graph representation (e-AutoGR) is proposed, which utilizes explainable graph features during performance estimation and learns decorrelated importance weights for how different hyperparameters affect model performance through a non-linear decorrelated weighting regression.
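The core intuition above, regressing observed performance on hyperparameter settings and reading importances off the fitted weights, can be shown with a toy example. This sketch replaces the paper's non-linear decorrelated weighting regression with plain least squares, so it is an illustration of the idea only; all names are assumptions.

```python
# Toy hyperparameter-importance estimate: fit score ~ w1*x1 + w2*x2 + b
# by least squares and use |w1|, |w2| as importances.

def hyperparam_importance(configs, scores):
    """configs: list of (x1, x2) hyperparameter pairs; scores: performances."""
    n = len(configs)
    x1 = [c[0] for c in configs]
    x2 = [c[1] for c in configs]
    # center everything so the intercept drops out of the normal equations
    m1, m2, ms = sum(x1) / n, sum(x2) / n, sum(scores) / n
    a = [v - m1 for v in x1]
    b = [v - m2 for v in x2]
    y = [v - ms for v in scores]
    # normal equations for two features, solved by Cramer's rule
    s11 = sum(ai * ai for ai in a)
    s22 = sum(bi * bi for bi in b)
    s12 = sum(ai * bi for ai, bi in zip(a, b))
    t1 = sum(ai * yi for ai, yi in zip(a, y))
    t2 = sum(bi * yi for bi, yi in zip(b, y))
    det = s11 * s22 - s12 * s12
    w1 = (t1 * s22 - t2 * s12) / det
    w2 = (s11 * t2 - s12 * t1) / det
    return abs(w1), abs(w2)
```

If the first hyperparameter drives performance far more than the second, its fitted weight dominates, which is the kind of explanation such an approach surfaces.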

AutoAttend: Automated Attention Representation Search

This paper proposes AutoAttend, a model that automatically searches for powerful attention representations for downstream tasks by leveraging Neural Architecture Search (NAS), with a tailored search space for attention-representation automation that is flexible enough to produce effective attention designs.

One-shot Graph Neural Architecture Search with Dynamic Search Space

This work proposes a novel dynamic one-shot search space for multi-branch GNN architectures and demonstrates that the method outperforms current state-of-the-art manually designed architectures and reaches performance competitive with existing GNN NAS approaches with up to a 10x speedup.