• Corpus ID: 221995727

Graph Neural Networks with Heterophily

@inproceedings{Zhu2021GraphNN,
  title={Graph Neural Networks with Heterophily},
  author={Jiong Zhu and Ryan A. Rossi and Anup B. Rao and Tung Mai and Nedim Lipka and Nesreen Ahmed and Danai Koutra},
  booktitle={AAAI},
  year={2021}
}
Graph Neural Networks (GNNs) have proven to be useful for many different practical applications. However, most existing GNN models have an implicit assumption of homophily among the nodes connected in the graph, and therefore have largely overlooked the important setting of heterophily. In this work, we propose a novel framework called CPGNN that generalizes GNNs for graphs with either homophily or heterophily. The proposed framework incorporates an interpretable compatibility matrix for… 
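The abstract above describes propagation guided by an interpretable compatibility matrix. As a rough numpy sketch of that general idea (function and variable names are illustrative assumptions, not the paper's implementation), class beliefs can be propagated through a class-to-class matrix H whose off-diagonal mass models heterophilous linking:

```python
import numpy as np

def compatibility_propagate(A, B0, H, k=2):
    """Propagate prior class beliefs B0 (n x c) over the graph A (n x n)
    using a class compatibility matrix H (c x c). H[i, j] is large when
    class-i nodes tend to link to class-j nodes; off-diagonal mass
    captures heterophily. Illustrative sketch, not CPGNN itself."""
    B = B0.copy()
    for _ in range(k):
        # echo the prior, then mix neighbor beliefs through H
        B = B0 + A @ B @ H
    B = np.clip(B, 1e-12, None)
    # renormalize rows so beliefs stay on the probability simplex
    return B / B.sum(axis=1, keepdims=True)
```

With a perfectly heterophilous H (all mass off-diagonal), an unlabeled node connected to a class-0 node is pushed toward class 1, which a homophily-assuming average would get wrong.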
Generalizing Graph Neural Networks Beyond Homophily
TLDR
This work identifies a set of key designs -- ego- and neighbor-embedding separation, higher-order neighborhoods, and combination of intermediate representations -- that boost learning from the graph structure under heterophily, and combines them into a new graph convolutional neural network, H2GCN.
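The three designs named above can be shown in a toy numpy sketch (illustrative only, not the authors' H2GCN code): a self-loop-free neighbor aggregation kept separate from the ego embedding via concatenation, an explicit 2-hop operator alongside the 1-hop one, and a final concatenation of all intermediate representations:

```python
import numpy as np

def h2gcn_like_embed(A, X, rounds=2):
    """Toy sketch of the three designs: (1) ego/neighbor separation
    (self-loops removed, concatenation instead of averaging),
    (2) a 2-hop neighborhood operator, (3) concatenating all
    intermediate representations at the end."""
    n = A.shape[0]
    A1 = A * (1 - np.eye(n))                      # 1-hop, no self-loops
    A2 = ((A1 @ A1) > 0).astype(float) * (1 - np.eye(n))  # 2-hop reachability
    d1 = np.maximum(A1.sum(axis=1, keepdims=True), 1.0)
    d2 = np.maximum(A2.sum(axis=1, keepdims=True), 1.0)
    reps = [X]                                    # ego features kept separate
    h = X
    for _ in range(rounds):
        # design (1)+(2): concatenate 1-hop and 2-hop mean aggregations
        h = np.concatenate([(A1 / d1) @ h, (A2 / d2) @ h], axis=1)
        reps.append(h)
    # design (3): combine all intermediate representations
    return np.concatenate(reps, axis=1)
```

The output width grows as f + 2f + 4f for two rounds, since each round concatenates rather than averages.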
Improving Robustness of Graph Neural Networks with Heterophily-Inspired Designs
TLDR
This work theoretically shows that in the standard scenario in which node features exhibit homophily, impactful structural attacks always lead to increased levels of heterophily, and presents two designs that can significantly improve the robustness of GNNs: (i) separate aggregators for ego- and neighbor-embeddings, and (ii) a reduced scope of aggregation.
Simple Truncated SVD based Model for Node Classification on Heterophilic Graphs
TLDR
This work proposes a simple alternative method that exploits Truncated Singular Value Decomposition (TSVD) of the topological structure and node features, and achieves up to ∼30% improvement in performance over state-of-the-art methods on heterophilic graphs.
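The core operation here, truncated SVD as a node representation, can be sketched in a few lines of numpy (a generic sketch of the technique, not the paper's exact model; names are illustrative):

```python
import numpy as np

def tsvd_embed(M, rank):
    """Rank-r truncated SVD of a matrix M (e.g. the adjacency matrix or
    the node-feature matrix). The left singular vectors scaled by their
    singular values serve as low-rank node representations, which can
    then be concatenated and fed to a simple classifier."""
    U, s, _ = np.linalg.svd(M, full_matrices=False)
    return U[:, :rank] * s[:rank]
```

Because the embedding keeps only the top singular directions, it discards high-frequency noise in the graph structure, which is one intuition for why such a simple model can work on heterophilic graphs.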
Graph Neural Networks Inspired by Classical Iterative Algorithms
TLDR
A new family of GNN layers designed to mimic and integrate the update rules of two classical iterative algorithms, namely, proximal gradient descent and iterative reweighted least squares (IRLS), resulting in an extremely simple yet robust model.
On the Relationship between Heterophily and Robustness of Graph Neural Networks
TLDR
It is deduced that a design principle identified to significantly improve predictive performance under heterophily (separate aggregators for ego- and neighbor-embeddings) can also inherently offer increased robustness to GNNs, and that models with this design can be readily combined with explicit defense mechanisms to yield improved robustness.
Powerful Graph Convolutional Networks with Adaptive Propagation Mechanism for Homophily and Heterophily
  • Tao Wang, Rui Wang, Di Jin, Dongxiao He, Yuxiao Huang
  • Computer Science
    ArXiv
  • 2021
TLDR
A novel propagation mechanism is designed, which can automatically change the propagation and aggregation process according to homophily or heterophily between node pairs, and it is theoretically proved that the model can constrain the similarity of representations between nodes according to their homophily degree.
Block Modeling-Guided Graph Convolutional Neural Networks
TLDR
By incorporating block modeling into the aggregation process, GCN is able to aggregate information from homophilic and heterophilic neighbors discriminately according to their homophily degree; the model is compared with state-of-the-art methods that deal with the heterophily problem.
Graph Neural Networks with Feature and Structure Aware Random Walk
TLDR
This paper generalizes the graph Laplacian to digraphs based on the proposed Feature-Aware PageRank algorithm, which simultaneously considers the graph directionality and long-distance feature similarity between nodes, and develops a model that adaptively learns the directionality of the graph and exploits the underlying long-distance correlations between nodes.
Is Homophily a Necessity for Graph Neural Networks?
TLDR
This work empirically finds that standard graph convolutional networks (GCNs) can actually achieve better performance than such carefully designed methods on some commonly used heterophilous graphs, and considers whether homophily is truly necessary for good GNN performance.
Label-informed Graph Structure Learning for Node Classification
TLDR
This paper proposes a novel label-informed graph structure learning framework which incorporates label information explicitly through a class transition matrix and shows that this method outperforms or matches the state-of-the-art baselines.

References

SHOWING 1-10 OF 53 REFERENCES
Diffusion Improves Graph Learning
TLDR
This work removes the restriction of using only the direct neighbors by introducing a powerful, yet spatially localized graph convolution: Graph diffusion convolution (GDC), which leverages generalized graph diffusion and alleviates the problem of noisy and often arbitrarily defined edges in real graphs.
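The diffusion idea can be sketched densely with a personalized PageRank kernel, one of the generalized diffusions GDC supports (a minimal sketch assuming row-normalized transitions; GDC itself also uses symmetric normalization, heat kernels, and sparse approximations):

```python
import numpy as np

def ppr_diffusion(A, alpha=0.15, eps=1e-4):
    """Dense personalized-PageRank diffusion matrix
    S = alpha * (I - (1 - alpha) * T)^-1, with T the row-normalized
    adjacency. Small entries are truncated, mirroring GDC's
    sparsification step. Illustrative sketch only."""
    n = A.shape[0]
    d = np.maximum(A.sum(axis=1), 1.0)
    T = A / d[:, None]                      # row-stochastic transitions
    S = alpha * np.linalg.inv(np.eye(n) - (1.0 - alpha) * T)
    S[S < eps] = 0.0                        # sparsify tiny weights
    return S
```

Replacing A with S in a GCN layer lets every node aggregate from a smoothed multi-hop neighborhood instead of only its direct neighbors.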
Learning Role-based Graph Embeddings
TLDR
The Role2Vec framework is introduced, which uses the flexible notion of attributed random walks, and serves as a basis for generalizing existing methods such as DeepWalk, node2vec, and many others that leverage random walks.
Geom-GCN: Geometric Graph Convolutional Networks
TLDR
The proposed aggregation scheme is permutation-invariant and consists of three modules, node embedding, structural neighborhood, and bi-level aggregation, and an implementation of the scheme in graph convolutional networks, termed Geom-GCN, to perform transductive learning on graphs.
Attention-based Graph Neural Network for Semi-supervised Learning
TLDR
A novel graph neural network is proposed that removes all the intermediate fully-connected layers and replaces the propagation layers with attention mechanisms that respect the structure of the graph, and it is demonstrated that this approach outperforms competing methods on benchmark citation network datasets.
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior
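A single attention head of the kind GAT describes can be sketched in dense numpy (illustrative only; real implementations are sparse, multi-headed, and learned end-to-end):

```python
import numpy as np

def gat_head(A, H, W, a_src, a_dst):
    """One dense graph-attention head: score each edge with a LeakyReLU
    of (W h_i) . a_src + (W h_j) . a_dst, softmax the scores over each
    node's neighborhood (plus a self-loop), then aggregate. Sketch of
    the mechanism, not the reference implementation."""
    n = A.shape[0]
    Z = H @ W
    e = (Z @ a_src)[:, None] + (Z @ a_dst)[None, :]   # pairwise scores
    e = np.where(e > 0, e, 0.2 * e)                    # LeakyReLU(0.2)
    mask = (A + np.eye(n)) > 0                         # neighbors + self
    e = np.where(mask, e, -np.inf)                     # mask non-edges
    e = e - e.max(axis=1, keepdims=True)               # stable softmax
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)
    return att @ Z
```

The masked softmax is what makes the layer respect graph structure: a node only attends over its own neighborhood.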
Inductive Representation Learning on Large Graphs
TLDR
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
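The mean-aggregator variant described here can be sketched as one dense layer (a simplification: GraphSAGE also samples fixed-size neighborhoods and supports LSTM and pooling aggregators):

```python
import numpy as np

def sage_mean_layer(A, H, W_self, W_neigh):
    """One GraphSAGE-style layer with the mean aggregator: combine each
    node's own embedding with the mean of its neighbors' embeddings
    through separate weight matrices, then apply a ReLU. Because the
    weights act on features rather than node identities, the layer
    applies inductively to unseen nodes."""
    d = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    neigh = (A / d) @ H                 # mean over neighbors
    return np.maximum(H @ W_self + neigh @ W_neigh, 0.0)  # ReLU
```

Note the separate `W_self` and `W_neigh`: this is the same ego/neighbor separation that later heterophily work (e.g. the H2GCN designs above) identifies as important.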
Graph Convolutional Neural Networks for Web-Scale Recommender Systems
TLDR
A novel method based on highly efficient random walks to structure the convolutions and a novel training strategy that relies on harder-and-harder training examples to improve robustness and convergence of the model are developed.
Graph Agreement Models for Semi-Supervised Learning
TLDR
This work proposes Graph Agreement Models (GAM), which introduces an auxiliary model that predicts the probability of two nodes sharing the same label as a learned function of their features, and achieves state-of-the-art results on semi-supervised learning datasets.
Relational Similarity Machines (RSM): A Similarity-based Learning Framework for Graphs
TLDR
A similarity-based relational learning framework called Relational Similarity Machines (RSM) for networks with arbitrary relational autocorrelation is presented, designed to be fast, accurate, and flexible for learning on a wide variety of networks.
The Graph Neural Network Model
TLDR
A new neural network model, called the graph neural network (GNN) model, that extends existing neural network methods for processing data represented in graph domains, and implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space.