Corpus ID: 237492193

Is Heterophily A Real Nightmare For Graph Neural Networks To Do Node Classification?

@article{Luan2021IsHA,
  title={Is Heterophily A Real Nightmare For Graph Neural Networks To Do Node Classification?},
  author={Sitao Luan and Chenqing Hua and Qincheng Lu and Jiaqi Zhu and Mingde Zhao and Shuyuan Zhang and Xiao-Wen Chang and Doina Precup},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.05641}
}
Graph Neural Networks (GNNs) extend basic Neural Networks (NNs) by exploiting graph structure through a relational inductive bias (the homophily assumption). Though GNNs are believed to outperform NNs in real-world tasks, the performance advantages of GNNs over graph-agnostic NNs are not generally satisfactory. Heterophily has been considered a main cause, and numerous works have been put forward to address it. In this paper, we first show that not all cases of heterophily are harmful for GNNs…
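The homophily assumption discussed in the abstract is commonly quantified by the edge homophily ratio: the fraction of edges whose endpoints share a label. A minimal sketch on a hypothetical toy graph (the edge list and labels below are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical toy graph: edge list and node labels (illustrative only).
edges = np.array([[0, 1], [1, 2], [2, 3], [3, 0], [0, 2]])
labels = np.array([0, 0, 1, 1])

# Edge homophily ratio: fraction of edges whose endpoints share a label.
same = labels[edges[:, 0]] == labels[edges[:, 1]]
h = same.mean()
print(h)  # 2 of 5 edges connect same-label nodes -> 0.4
```

A ratio near 1 indicates a homophilous graph; values near 0 indicate strong heterophily.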
Is Homophily a Necessity for Graph Neural Networks?
This work questions whether homophily is truly necessary for good GNN performance and empirically finds that standard graph convolutional networks (GCNs) can actually outperform carefully designed heterophily-specific methods on some commonly used heterophilous graphs.

References

Showing 1–10 of 45 references
Simple Truncated SVD based Model for Node Classification on Heterophilic Graphs
This work proposes a simple alternative method that exploits Truncated Singular Value Decomposition (TSVD) of the topological structure and node features, achieving up to ∼30% improvement in performance over state-of-the-art methods on heterophilic graphs.
Predict then Propagate: Graph Neural Networks meet Personalized PageRank
This paper uses the relationship between graph convolutional networks (GCNs) and PageRank to derive an improved propagation scheme based on personalized PageRank, and constructs a simple model, personalized propagation of neural predictions (PPNP), along with its fast approximation, APPNP.
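The propagation scheme summarized above iterates personalized PageRank over the model's predictions. A minimal dense-matrix NumPy sketch of the APPNP update Z_{k+1} = (1 − α) Â Z_k + α H (the function name and dense form are illustrative simplifications):

```python
import numpy as np

def appnp_propagate(A, H, alpha=0.1, K=10):
    """Approximate personalized-PageRank propagation:
    Z_{k+1} = (1 - alpha) * A_hat @ Z_k + alpha * H,
    where A_hat is the symmetrically normalized adjacency with
    self-loops and H holds the per-node (neural) predictions."""
    A_tilde = A + np.eye(A.shape[0])               # add self-loops
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt      # symmetric normalization
    Z = H.copy()
    for _ in range(K):
        Z = (1 - alpha) * A_hat @ Z + alpha * H    # teleport back to H
    return Z
```

With α = 1 no propagation occurs and Z = H; smaller α mixes in more multi-hop neighborhood information without adding trainable parameters.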
Non-Local Graph Neural Networks
This work proposes a simple yet effective non-local aggregation framework with efficient attention-guided sorting for GNNs, which significantly outperforms previous state-of-the-art methods on six benchmark datasets of disassortative graphs in terms of both model performance and efficiency.
Representation Learning on Graphs with Jumping Knowledge Networks
This work explores an architecture, jumping knowledge (JK) networks, that flexibly leverages a different neighborhood range for each node to enable better structure-aware representations in graphs.
Geom-GCN: Geometric Graph Convolutional Networks
The proposed aggregation scheme is permutation-invariant and consists of three modules (node embedding, structural neighborhood, and bi-level aggregation); an implementation of the scheme in graph convolutional networks, termed Geom-GCN, performs transductive learning on graphs.
Measuring and Improving the Use of Graph Information in Graph Neural Networks
A context-surrounding GNN framework is introduced, and a new, improved GNN model called CS-GNN is devised to improve the use of graph information based on the smoothness values of a graph.
Revisiting Graph Neural Networks: All We Have is Low-Pass Filters
The results indicate that graph neural networks only perform low-pass filtering on feature vectors and do not have a non-linear manifold learning property; insights on GCN-based graph neural network design are also proposed.
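The low-pass claim can be illustrated on a toy graph: the eigenvalues of the symmetrically normalized adjacency with self-loops lie in (−1, 1], so repeated application damps all but the lowest graph frequencies. The 4-cycle below is an illustrative example, not from the paper:

```python
import numpy as np

# Toy symmetric adjacency: a 4-cycle (illustrative only).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
A_tilde = A + np.eye(4)                        # add self-loops
d = A_tilde.sum(axis=1)
A_hat = np.diag(d ** -0.5) @ A_tilde @ np.diag(d ** -0.5)

# All eigenvalues lie in (-1, 1], so powers of A_hat attenuate every
# graph-frequency component except the constant (lowest-frequency) one.
eigvals = np.linalg.eigvalsh(A_hat)
print(np.round(eigvals, 3))
```

Here the spectrum is {−1/3, 1/3, 1/3, 1}: only the eigenvalue 1 survives repeated multiplication, which is exactly low-pass behavior.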
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…
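A minimal sketch of the masked self-attention described here, for a single head: e_ij = LeakyReLU(aᵀ[Wh_i ‖ Wh_j]), with a softmax restricted to each node's neighbors (function name and shapes are illustrative):

```python
import numpy as np

def gat_attention(h, W, a, neighbors):
    """One-head GAT attention: e_ij = LeakyReLU(a^T [W h_i || W h_j]),
    softmax-normalized over each node's neighborhood (the "mask")."""
    def leaky_relu(x, slope=0.2):
        return np.where(x > 0, x, slope * x)

    Wh = h @ W.T                              # projected node features
    alpha = {}
    for i, nbrs in enumerate(neighbors):
        e = np.array([leaky_relu(a @ np.concatenate([Wh[i], Wh[j]]))
                      for j in nbrs])
        e = np.exp(e - e.max())               # numerically stable softmax
        alpha[i] = e / e.sum()                # weights over i's neighbors
    return alpha
```

Each node then aggregates its neighbors' projected features weighted by these coefficients; multiple heads are concatenated or averaged.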
Beyond Low-frequency Information in Graph Convolutional Networks
An experimental investigation assessing the roles of low-frequency and high-frequency signals is presented, and a novel Frequency Adaptation Graph Convolutional Network (FAGCN) with a self-gating mechanism is proposed, which can adaptively integrate different signals during message passing.
Inductive Representation Learning on Large Graphs
GraphSAGE is presented: a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data, outperforming strong baselines on three inductive node-classification benchmarks.