Graph Neural Networks with Learnable Structural and Positional Representations
@article{Dwivedi2021GraphNN,
  title   = {Graph Neural Networks with Learnable Structural and Positional Representations},
  author  = {Vijay Prakash Dwivedi and Anh Tuan Luu and Thomas Laurent and Yoshua Bengio and Xavier Bresson},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2110.07875}
}
Graph neural networks (GNNs) have become the standard learning architectures for graphs. GNNs have been applied to numerous domains, ranging from quantum chemistry and recommender systems to knowledge graphs and natural language processing. A major issue with arbitrary graphs is the absence of canonical positional information for nodes, which limits the ability of GNNs to distinguish, e.g., isomorphic nodes and other graph symmetries. An approach to tackle this issue is to introduce…
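The truncated sentence above points at positional encodings (PE) for nodes. One concrete PE studied in this setting is the random-walk PE, where node i's position is its vector of k-step return probabilities. A minimal sketch, assuming an unweighted adjacency matrix with no isolated nodes (the function name is ours, not the paper's):

```python
import numpy as np

def random_walk_pe(A: np.ndarray, k: int) -> np.ndarray:
    """k-dim random-walk PE: column j holds each node's probability of
    returning to itself after j+1 steps of a simple random walk."""
    deg = A.sum(axis=1)
    RW = A / deg[:, None]                 # random-walk matrix D^{-1} A
    P = np.zeros((len(A), k))
    M = np.eye(len(A))
    for j in range(k):
        M = M @ RW                        # RW^{j+1}
        P[:, j] = np.diag(M)              # return probabilities
    return P

# 4-cycle: all nodes get identical PEs, reflecting the graph's symmetry.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
print(random_walk_pe(A, k=3))
```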
62 Citations
Generalized Laplacian Positional Encoding for Graph Representation Learning
- Computer Science · ArXiv
- 2022
This paper draws inspiration from the recent success of Laplacian-based positional encoding and develops a novel family of positional encoding schemes for graphs by generalizing the optimization problem that defines the Laplace embedding to more general dissimilarity functions, rather than the 2-norm used in the original formulation.
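For context, the original 2-norm formulation being generalized is the Laplace embedding: minimize the sum over edges of A_ij * ||y_i - y_j||^2 under orthonormality constraints, which is solved by the low-frequency eigenvectors of L = D - A. A minimal sketch, assuming a connected graph (eigenvector sign ambiguity is ignored here):

```python
import numpy as np

def laplacian_pe(A: np.ndarray, k: int) -> np.ndarray:
    """Classical 2-norm Laplace embedding: the k nontrivial eigenvectors
    of L = D - A with smallest eigenvalues minimize
    sum_ij A_ij * ||y_i - y_j||^2 under orthonormality constraints."""
    D = np.diag(A.sum(axis=1))
    L = D - A
    evals, evecs = np.linalg.eigh(L)      # eigenvalues in ascending order
    return evecs[:, 1:k + 1]              # drop the constant eigenvector

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(laplacian_pe(A, k=2))
```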
Geodesic Graph Neural Network for Efficient Graph Representation Learning
- Computer Science · ArXiv
- 2022
An efficient GNN framework called Geodesic GNN (GDGNN) is proposed that requires only one GNN run and injects conditional relationships between nodes into the model without labeling; GDGNN is theoretically proven to be more powerful than plain GNNs.
Affinity-Aware Graph Networks
- Computer Science · ArXiv
- 2022
This paper explores the use of affinity measures as features in graph neural networks, in particular measures arising from random walks, including effective resistance, hitting and commute times, and proposes message passing networks based on these features.
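Effective resistance, one of the affinity measures mentioned, can be computed from the pseudoinverse of the graph Laplacian, and commute time follows as vol(G) * R(u, v). A small sketch of these standard identities (not the paper's code):

```python
import numpy as np

def effective_resistance(A: np.ndarray) -> np.ndarray:
    """All-pairs effective resistance from the Laplacian pseudoinverse:
    R[u, v] = L+[u, u] + L+[v, v] - 2 * L+[u, v]."""
    L = np.diag(A.sum(axis=1)) - A
    Lp = np.linalg.pinv(L)
    d = np.diag(Lp)
    return d[:, None] + d[None, :] - 2 * Lp

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path 0-1-2
R = effective_resistance(A)
print(R[0, 2])             # 2.0: two unit resistors in series
print(A.sum() * R[0, 2])   # commute time = vol(G) * R(u, v) = 4 * 2 = 8
```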
Structure-Aware Transformer for Graph Representation Learning
- Computer Science · ICML
- 2022
This work proposes the Structure-Aware Transformer, a class of simple and accessible graph Transformers built upon a new self-attention mechanism that systematically improves performance relative to the base GNN model, successfully combining the advantages of GNNs and Transformers.
Attending to Graph Transformers
- Computer Science · ArXiv
- 2023
A taxonomy of graph transformer architectures is derived, bringing some order to this emerging field by probing how well graph transformers can recover various graph properties, how well they can deal with heterophilic graphs, and to what extent they prevent over-squashing.
A Generalization of ViT/MLP-Mixer to Graphs
- Computer Science · ArXiv
- 2022
This work introduces a new class of GNNs, called Graph MLP-Mixer, with three key properties: they capture long-range dependencies and mitigate the issue of over-squashing, offer better speed and memory efficiency with complexity linear in the number of nodes and edges, and show high expressivity in terms of graph isomorphism.
Long Range Graph Benchmark
- Computer Science · ArXiv
- 2022
The Long Range Graph Benchmark (LRGB) is presented with 5 graph learning datasets that arguably require long-range interaction (LRI) reasoning to achieve strong performance on a given task; it is suitable for benchmarking and exploring MP-GNN and Graph Transformer architectures intended to capture LRI.
Transformers over Directed Acyclic Graphs
- Computer Science
- 2022
This paper studies transformers over directed acyclic graphs (DAGs) and proposes architecture adaptations tailored to DAGs: an attention mechanism that avoids the regular quadratic complexity of transformers while faithfully capturing the DAG structure.
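One way such an adaptation could look, sketched under our own assumptions rather than the paper's exact mechanism, is ordinary dot-product attention masked by DAG reachability so that each node attends only to its ancestors and itself:

```python
import numpy as np

def reachability(adj: np.ndarray) -> np.ndarray:
    """Transitive closure of a DAG (including self-reachability)."""
    n = len(adj)
    R = (adj + np.eye(n)) > 0
    for _ in range(n):                        # iterate to a fixed point
        R = (R.astype(int) @ R.astype(int)) > 0
    return R

def dag_attention(X: np.ndarray, adj: np.ndarray) -> np.ndarray:
    """Dot-product attention where node i attends only to nodes that can
    reach it in the DAG (its ancestors and itself)."""
    mask = reachability(adj).T                # mask[i, j]: j reaches i
    scores = X @ X.T / np.sqrt(X.shape[1])    # no projections, for brevity
    scores = np.where(mask, scores, -np.inf)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ X

# Chain DAG 0 -> 1 -> 2: node 2 attends to {0, 1, 2}, node 0 only to itself.
adj = np.array([[0, 1, 0], [0, 0, 1], [0, 0, 0]], dtype=float)
X = np.arange(6, dtype=float).reshape(3, 2)
print(dag_attention(X, adj))
```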
Understanding and Extending Subgraph GNNs by Rethinking Their Symmetries
- Computer Science · ArXiv
- 2022
The most prominent form of subgraph methods, which employs node-based subgraph selection policies such as ego-networks or node marking and deletion, is studied and a novel Subgraph GNN dubbed SUN is designed, which theoretically unifies previous architectures while providing better empirical performance on multiple benchmarks.
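The node-based selection policy mentioned above, extracting one ego-network per node, can be sketched as a breadth-first search up to a fixed radius (the helper name and dict-of-lists graph format are ours):

```python
from collections import deque

def ego_network(adj: dict, v, radius: int = 1) -> set:
    """Nodes within `radius` hops of v: the node-based subgraph
    selection policy used by ego-network Subgraph GNNs."""
    seen, frontier = {v}, deque([(v, 0)])
    while frontier:
        u, d = frontier.popleft()
        if d == radius:
            continue
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                frontier.append((w, d + 1))
    return seen

adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
print(sorted(ego_network(adj, 0, radius=1)))   # [0, 1, 2]
```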
Neural Graph Databases
- Computer Science · LoG
- 2022
LPG2vec enables combining the predictive power of the most powerful GNNs with the full scope of information encoded in the LPG model, paving the way for neural graph databases, a class of systems where the vast complexity of maintained data will benefit from modern and future graph machine learning methods.
References
Showing 1-10 of 76 references
How Powerful are Graph Neural Networks?
- Computer Science · ICLR
- 2019
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
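The provably most expressive architecture referenced here is the Graph Isomorphism Network (GIN), whose layer applies an MLP to the sum of a node's own (scaled) features and its neighbors' features. A minimal numpy sketch (variable names are ours):

```python
import numpy as np

def gin_layer(A, H, W1, b1, W2, b2, eps=0.0):
    """One GIN layer: H' = MLP((1 + eps) * H + A @ H).
    Sum aggregation plus an MLP makes the update injective
    on multisets of neighbor features."""
    Z = (1.0 + eps) * H + A @ H
    return np.maximum(Z @ W1 + b1, 0.0) @ W2 + b2   # 2-layer ReLU MLP

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)  # 3-node star
H = rng.normal(size=(3, 4))
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 4)), np.zeros(4)
print(gin_layer(A, H, W1, b1, W2, b2).shape)        # (3, 4)
```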
Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting
- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2023
The expressive power of the proposed Graph Substructure Networks (GSN), a topologically-aware message passing scheme based on substructure encoding, is theoretically analysed: GSN is shown to be strictly more expressive than the WL test, and sufficient conditions for universality are provided.
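A toy instance of substructure encoding, not GSN's full scheme, is appending per-node triangle counts, which can be read off the diagonal of the cubed adjacency matrix:

```python
import numpy as np

def triangle_counts(A: np.ndarray) -> np.ndarray:
    """Number of triangles through each node: diag(A^3) / 2
    (each triangle at v yields 2 closed walks of length 3)."""
    return np.diag(A @ A @ A) / 2

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(triangle_counts(A))   # [1. 1. 1. 0.]: nodes 0-2 form one triangle
```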
Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning
- Computer Science · NeurIPS
- 2020
A general class of structure-related features, termed Distance Encoding (DE), is proposed that assists GNNs in representing any set of nodes, providing strictly more expressive power than the 1-WL test and significantly outperforming other state-of-the-art methods especially designed for these tasks.
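The simplest DE instance is a one-hot encoding of each node's shortest-path distance to a target node set; a sketch using scipy (the helper name and distance-clipping scheme are ours):

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def distance_encoding(A, targets, max_dist):
    """One-hot features of each node's shortest-path distance to the
    target node set (min over targets), clipped at max_dist."""
    D = shortest_path(A, unweighted=True)      # all-pairs distances
    d = D[:, targets].min(axis=1)
    d = np.minimum(d, max_dist).astype(int)    # bucket far/unreachable nodes
    return np.eye(max_dist + 1)[d]

A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)      # path graph 0-1-2-3
print(distance_encoding(A, targets=[0], max_dist=2))
```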
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks
- Computer Science · AAAI
- 2019
It is shown that GNNs have the same expressiveness as the Weisfeiler-Leman graph isomorphism heuristic in terms of distinguishing non-isomorphic (sub-)graphs, and a generalization of GNNs, the so-called $k$-dimensional GNNs ($k$-GNNs), is proposed, which can take higher-order graph structures at multiple scales into account.
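For reference, the 1-WL heuristic iteratively recolors each node from its own color and the multiset of its neighbors' colors; two graphs whose stable color histograms differ are certainly non-isomorphic. A compact sketch:

```python
def wl_refinement(adj: dict, iterations: int = 3) -> dict:
    """1-dimensional Weisfeiler-Leman color refinement.
    adj maps each node to its list of neighbors."""
    colors = {v: 0 for v in adj}               # uniform initial coloring
    for _ in range(iterations):
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        palette = {s: i for i, s in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return colors

# Path 0-1-2: endpoints share one color, the middle node gets another.
print(wl_refinement({0: [1], 1: [0, 2], 2: [1]}))   # {0: 0, 1: 1, 2: 0}
```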
A Generalization of Transformer Networks to Graphs
- Computer Science · ArXiv
- 2020
A graph transformer with four new properties compared to the standard model is proposed, closing the gap between the original transformer, designed for the limited case of line graphs, and graph neural networks that can work with arbitrary graphs.
Expressive Power of Invariant and Equivariant Graph Neural Networks
- Computer Science · ICLR
- 2021
The first approximation guarantees for practical GNNs are proved, paving the way for a better understanding of their generalization.
GraphiT: Encoding Graph Structure in Transformers
- Computer Science · ArXiv
- 2021
The GraphiT model encodes structural and positional information by leveraging relative positional encoding strategies in self-attention scores, based on positive definite kernels on graphs, and by enumerating and encoding local sub-structures such as short paths.
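One of GraphiT's kernel choices is the diffusion (heat) kernel K = exp(-beta * L); below is a simplified sketch of attention scores modulated entrywise by such a kernel, our condensation of the idea rather than the full model:

```python
import numpy as np
from scipy.linalg import expm

def kernel_attention(X: np.ndarray, A: np.ndarray, beta: float = 1.0):
    """Self-attention with scores modulated entrywise by the heat kernel
    exp(-beta * L), a positive definite kernel on the graph."""
    L = np.diag(A.sum(axis=1)) - A
    K = expm(-beta * L)                                  # diffusion kernel
    scores = np.exp(X @ X.T / np.sqrt(X.shape[1])) * K   # kernel-weighted
    return (scores / scores.sum(axis=1, keepdims=True)) @ X

rng = np.random.default_rng(1)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
print(kernel_attention(rng.normal(size=(3, 4)), A).shape)   # (3, 4)
```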
Parameterized Hypercomplex Graph Neural Networks for Graph Classification
- Computer Science · ICANN
- 2021
This work develops graph neural networks that leverage the properties of hypercomplex feature transformation and presents empirical evidence that the proposed model incorporates a regularization effect, alleviating the risk of overfitting.
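The parameter-sharing trick behind quaternion-style hypercomplex transforms assembles a full weight matrix from four component blocks via the Hamilton product rule, cutting parameters roughly fourfold. A sketch of the block construction (our simplification of the general parameterized version):

```python
import numpy as np

def quaternion_weight(Wr, Wi, Wj, Wk):
    """Assemble a full weight matrix from four component matrices via the
    Hamilton product rule, sharing parameters across the 4x4 block grid."""
    return np.block([
        [Wr, -Wi, -Wj, -Wk],
        [Wi,  Wr, -Wk,  Wj],
        [Wj,  Wk,  Wr, -Wi],
        [Wk, -Wj,  Wi,  Wr],
    ])

rng = np.random.default_rng(2)
parts = [rng.normal(size=(2, 2)) for _ in range(4)]
W = quaternion_weight(*parts)   # 8x8 matrix from only 4 * (2x2) parameters
x = rng.normal(size=8)
print((W @ x).shape)            # transforms an 8-dim feature vector
```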
Residual Gated Graph ConvNets
- Computer Science · ArXiv
- 2017
This work reviews existing graph RNN and ConvNet architectures, proposes natural extensions of LSTMs and ConvNets to graphs of arbitrary size, and designs a set of analytically controlled experiments on two basic graph problems to test the different architectures.
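The residual gated update proposed in this paper computes edge gates eta_ij = sigmoid(A h_i + B h_j) and adds gated neighbor messages: h_i' = h_i + ReLU(U h_i + sum_j eta_ij * V h_j). A dense numpy sketch of this update, with batch normalization omitted and matrix names following the paper's equations loosely:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_graph_conv(Adj, H, U, V, A, B):
    """Residual gated graph conv layer:
    h_i' = h_i + ReLU(U h_i + sum_j eta_ij * V h_j),
    with edge gates eta_ij = sigmoid(A h_i + B h_j)."""
    out = np.zeros_like(H)
    for i in range(len(H)):
        msg = np.zeros(H.shape[1])
        for j in np.nonzero(Adj[i])[0]:
            eta = sigmoid(H[i] @ A + H[j] @ B)    # edge gate in (0, 1)^d
            msg += eta * (H[j] @ V)               # gated neighbor message
        out[i] = H[i] + np.maximum(H[i] @ U + msg, 0.0)  # residual + ReLU
    return out

rng = np.random.default_rng(3)
Adj = np.array([[0, 1], [1, 0]], dtype=float)
H = rng.normal(size=(2, 4))
U, V, A, B = (rng.normal(size=(4, 4)) for _ in range(4))
print(gated_graph_conv(Adj, H, U, V, A, B).shape)   # (2, 4)
```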