Descent Steps of a Relation-Aware Energy Produce Heterogeneous Graph Neural Networks

@article{Ahn2022DescentSO,
  title={Descent Steps of a Relation-Aware Energy Produce Heterogeneous Graph Neural Networks},
  author={Hongjoon Ahn and Yongjun Yang and Quan Gan and David Paul Wipf and Taesup Moon},
  journal={ArXiv},
  year={2022},
  volume={abs/2206.11081}
}
Heterogeneous graph neural networks (GNNs) achieve strong performance on node classification tasks in a semi-supervised learning setting. However, as in the simpler homogeneous GNN case, message-passing-based heterogeneous GNNs may struggle to balance between resisting the oversmoothing that occurs in deep models and capturing long-range dependencies in graph-structured data. Moreover, the complexity of this trade-off is compounded in the heterogeneous graph case due to the disparate heterophily…
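
The title's premise builds on a line of work, represented by several of the references below, that derives GNN layers as descent steps on a graph-regularized energy. As a rough sketch of the homogeneous version of such an energy from that prior work (the paper's relation-aware energy is not reproduced in this excerpt):

```latex
% Illustrative homogeneous energy; Y = node embeddings, f(X;W) = feature
% transform, L = graph Laplacian. Not the paper's relation-aware form.
E(Y) = \|Y - f(X;W)\|_F^2 + \lambda\,\mathrm{tr}\!\left(Y^\top L Y\right)
% One gradient step on E plays the role of one GNN layer:
Y^{(k+1)} = Y^{(k)} - \alpha\left[2\big(Y^{(k)} - f(X;W)\big) + 2\lambda L Y^{(k)}\right]
```

Since every layer provably decreases a fixed energy, added depth deepens the descent rather than compounding smoothing; a relation-aware version would presumably attach a separate regularization term per edge type, but the exact form is in the paper, not this excerpt.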

References

Showing 1-10 of 38 references

Interpreting and Unifying Graph Neural Networks with An Optimization Framework

TLDR
A surprising connection is established between different propagation mechanisms and a unified optimization problem, showing that, despite the proliferation of various GNNs, their propagation mechanisms are in fact the optimal solutions of a feature-fitting function with a graph regularization term, over a wide class of graph kernels.
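
To make the claim concrete, here is the canonical instance of such an optimization problem and its closed-form minimizer (a standard special case, not the framework's full generality):

```latex
% Feature fitting plus graph regularization, with \tilde{L} = I - \tilde{A}:
\min_{Y}\ \|Y - X\|_F^2 + \lambda\,\mathrm{tr}\!\left(Y^\top \tilde{L} Y\right)
\quad\Longrightarrow\quad
Y^\star = \left(I + \lambda\tilde{L}\right)^{-1} X
% PPNP's propagation \alpha\,(I - (1-\alpha)\tilde{A})^{-1} X is exactly this
% solution with \lambda = (1-\alpha)/\alpha.
```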

Revisiting Graph Convolutional Network on Semi-Supervised Node Classification from an Optimization Perspective

TLDR
A universal theoretical framework for GCN is established from an optimization perspective, and a novel convolutional kernel named GCN+ is derived that has fewer parameters while inherently relieving over-smoothing.

Graph Neural Networks Exponentially Lose Expressive Power for Node Classification

TLDR
The theory relates the expressive power of GCNs to the topological information of the underlying graphs encoded in their spectra, and provides a principled guideline for weight normalization of graph NNs.
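
A tiny numpy experiment (my illustration, not the paper's construction) makes the exponential collapse visible: repeated propagation with the symmetrically normalized adjacency drives all node representations toward a common direction, at a rate set by the spectral gap.

```python
# Oversmoothing demo: as the propagation depth k grows, normalized node
# representations become nearly identical (spread -> 0).
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.maximum(A, A.T)                      # undirected graph
np.fill_diagonal(A, 1.0)                    # self-loops, as in GCN
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))         # D^{-1/2} (A + I) D^{-1/2}

X = rng.standard_normal((n, 8))
for k in (1, 4, 16, 64):
    H = np.linalg.matrix_power(A_hat, k) @ X
    H = H / np.linalg.norm(H, axis=1, keepdims=True)  # compare directions only
    spread = np.linalg.norm(H - H.mean(axis=0), axis=1).mean()
    print(f"k={k:2d}  mean spread of node directions = {spread:.4f}")
```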

A Unified View on Graph Neural Networks as Graph Signal Denoising

TLDR
It is established mathematically that the aggregation processes in a group of representative GNN models including GCN, GAT, PPNP, and APPNP can be regarded as solving a graph denoising problem with a smoothness assumption.
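
A one-line worked instance of that correspondence (a standard derivation in this line of work): initialize the denoising objective at the input features and take a single gradient step.

```latex
% E(Y) = \|Y - X\|_F^2 + \lambda\,\mathrm{tr}(Y^\top \tilde{L} Y), \quad \tilde{L} = I - \tilde{A}
\nabla E(Y)\big|_{Y = X} = 2\lambda\tilde{L}X
\;\Longrightarrow\;
Y = X - 2\alpha\lambda\tilde{L}X = (1 - 2\alpha\lambda)\,X + 2\alpha\lambda\,\tilde{A}X
% Step size \alpha = 1/(2\lambda) gives exactly \tilde{A}X, the GCN
% aggregation; smaller steps give APPNP-style residual propagation.
```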

GPT-GNN: Generative Pre-Training of Graph Neural Networks

TLDR
The GPT-GNN framework initializes GNNs by generative pre-training: it introduces a self-supervised attributed graph generation task so that the pre-trained GNN captures the structural and semantic properties of the graph.
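
A schematic reading of that pre-training task (my simplification with hypothetical helper names; the actual framework factorizes generation into attribute and edge steps more carefully than this):

```python
# Hypothetical sketch: mask node attributes, encode with any GNN, then
# (1) regenerate the masked attributes and (2) score held-out edges.
import torch
import torch.nn.functional as F

def gptgnn_style_loss(gnn, attr_decoder, graph, feats,
                      masked_nodes, pos_edges, neg_edges):
    x = feats.clone()
    x[masked_nodes] = 0.0                        # hide attributes to generate
    h = gnn(graph, x)                            # any GNN encoder
    # (1) attribute generation loss on the masked nodes
    attr_loss = F.mse_loss(attr_decoder(h[masked_nodes]), feats[masked_nodes])
    # (2) edge generation via dot-product scores vs. negative samples
    pos = (h[pos_edges[0]] * h[pos_edges[1]]).sum(-1)
    neg = (h[neg_edges[0]] * h[neg_edges[1]]).sum(-1)
    edge_loss = F.binary_cross_entropy_with_logits(
        torch.cat([pos, neg]),
        torch.cat([torch.ones_like(pos), torch.zeros_like(neg)]))
    return attr_loss + edge_loss
```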

MAGNN: Metapath Aggregated Graph Neural Network for Heterogeneous Graph Embedding

TLDR
This work proposes a new model named Metapath Aggregated Graph Neural Network (MAGNN), built from three major components: node content transformation to encapsulate input node attributes, intra-metapath aggregation to incorporate intermediate semantic nodes, and inter-metapath aggregation to combine messages from multiple metapaths; it achieves more accurate predictions than state-of-the-art baselines.
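
A toy numpy rendering of those three components for one target node (shapes, the mean instance encoder, and all names are my assumptions; MAGNN itself offers several instance encoders and learned attention):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d = 16, 8
W_content = 0.1 * rng.standard_normal((d_in, d))   # node content transformation

def intra_metapath(instances):
    """Encode each metapath instance (every node on the path, not just the
    endpoints), then average over instances of the same metapath."""
    return np.mean([inst.mean(axis=0) for inst in instances], axis=0)

def inter_metapath(summaries, q):
    """Attention over the per-metapath summaries of the target node."""
    scores = np.array([q @ s for s in summaries])
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return sum(wi * s for wi, s in zip(w, summaries))

# toy instances of two metapaths around one author node (A-P-A, A-P-V-P-A)
apa = [rng.standard_normal((3, d_in)) @ W_content for _ in range(4)]
apvpa = [rng.standard_normal((5, d_in)) @ W_content for _ in range(2)]
q = rng.standard_normal(d)                          # learned attention vector
h = inter_metapath([intra_metapath(apa), intra_metapath(apvpa)], q)
print(h.shape)                                      # (8,) final embedding
```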

Graph Neural Networks Inspired by Classical Iterative Algorithms

TLDR
A new family of GNN layers is designed to mimic and integrate the update rules of two classical iterative algorithms, namely proximal gradient descent and iterative reweighted least squares (IRLS), resulting in an extremely simple yet robust model.
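
A compact sketch of the proximal-gradient half of that recipe, on an illustrative energy of my choosing (not the paper's exact objective):

```python
# One "layer" = one proximal gradient step on
#   E(Y) = ||Y - X||_F^2 + lam * tr(Y^T L Y) + rho * ||Y||_1,
# where the prox of the l1 term (soft-thresholding) plays the role of the
# layer's nonlinearity; stacking layers just iterates the descent.
import numpy as np

def soft_threshold(Z, t):
    return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

def pgd_layer(Y, X, L, lam=1.0, rho=0.1, alpha=0.1):
    grad = 2.0 * (Y - X) + 2.0 * lam * (L @ Y)   # gradient of the smooth part
    return soft_threshold(Y - alpha * grad, alpha * rho)

L = np.array([[1., -1., 0.],                     # Laplacian of a 3-node path
              [-1., 2., -1.],
              [0., -1., 1.]])
X = np.random.default_rng(0).standard_normal((3, 4))
Y = X.copy()
for _ in range(16):                              # 16 "layers" of descent
    Y = pgd_layer(Y, X, L)
```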

Are we really making much progress?: Revisiting, benchmarking and refining heterogeneous graph neural networks

TLDR
It is found that simple homogeneous GNNs, e.g., GCN and GAT, are largely underestimated due to improper settings, and a simple but very strong baseline, Simple-HGN, which significantly outperforms all previous models on HGB, is introduced to accelerate the advancement of HGNNs.
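
For reference, the ingredient that makes this baseline heterogeneity-aware is, to my understanding, a learnable edge-type embedding injected into otherwise standard GAT attention:

```latex
% Notation mine: \psi(i,j) is the type of edge (i, j) and r_{\psi(i,j)} its
% learnable embedding; the rest is standard GAT attention.
\alpha_{ij} \propto \exp\!\Big(\mathrm{LeakyReLU}\big(
    a^\top \big[\, W h_i \,\Vert\, W h_j \,\Vert\, W_r\, r_{\psi(i,j)} \big]\big)\Big)
```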

metapath2vec: Scalable Representation Learning for Heterogeneous Networks

TLDR
Two scalable representation learning models, namely metapath2vec and metapath2vec++, are developed that are able to not only outperform state-of-the-art embedding models in various heterogeneous network mining tasks, but also discern the structural and semantic correlations between diverse network objects.
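
The core mechanism is easy to sketch: random walks constrained to follow a node-type pattern, whose outputs then feed a standard skip-gram model. A minimal walk generator (the toy graph, node names, and helper names are mine):

```python
import random

# heterogeneous adjacency: neighbors grouped by the neighbor's node type
# (A = author, P = paper, V = venue)
neighbors = {
    "a1": {"P": ["p1", "p2"]}, "a2": {"P": ["p2"]},
    "p1": {"A": ["a1"], "V": ["v1"]}, "p2": {"A": ["a1", "a2"], "V": ["v1"]},
    "v1": {"P": ["p1", "p2"]},
}

def metapath_walk(start, metapath, length):
    """Sample a walk whose node types follow the repeating pattern in
    `metapath`, e.g. ("A", "P", "V", "P") realizes A-P-V-P-A-P-V-P-..."""
    walk, node = [start], start
    while len(walk) < length:
        wanted = metapath[len(walk) % len(metapath)]
        options = neighbors[node].get(wanted, [])
        if not options:
            break
        node = random.choice(options)
        walk.append(node)
    return walk

print(metapath_walk("a1", ("A", "P", "V", "P"), 9))
```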

Deep Graph Library: A Graph-Centric, Highly-Performant Package for Graph Neural Networks

TLDR
DGL distills the computational patterns of GNNs into a few generalized sparse tensor operations suitable for extensive parallelization and allows users to easily port and leverage the existing components across multiple deep learning frameworks.
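
A small usage sketch of that pattern via DGL's message-passing API (to the best of my knowledge of it): a "copy" message function fused with a "sum" reducer, which DGL lowers to a single SpMM-like sparse operation.

```python
import dgl
import dgl.function as fn
import torch

src, dst = torch.tensor([0, 1, 2]), torch.tensor([1, 2, 0])
g = dgl.graph((src, dst))                     # 3-node directed cycle
g.ndata["h"] = torch.eye(3)
g.update_all(fn.copy_u("h", "m"),             # message: copy source feature
             fn.sum("m", "h_new"))            # reduce: sum incoming messages
print(g.ndata["h_new"])                       # sum of in-neighbor features
```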