# Should Graph Neural Networks Use Features, Edges, Or Both?

@article{Faber2021ShouldGN, title={Should Graph Neural Networks Use Features, Edges, Or Both?}, author={Lukas Faber and Yifan Lu and Roger Wattenhofer}, journal={ArXiv}, year={2021}, volume={abs/2103.06857} }

Graph Neural Networks (GNNs) are the first choice for learning algorithms on graph data. GNNs promise to integrate (i) node features as well as (ii) edge information in an end-to-end learning algorithm. How does this promise work out practically? In this paper, we study to what extent GNNs are necessary to solve prominent graph classification problems. We find that for graph classification, a GNN is not more than the sum of its parts. We also find that, unlike features, predictions with an edge…
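The abstract's premise — that a GNN layer combines (i) a node's own features with (ii) information from its edges — can be illustrated with a minimal sketch. The layer below is a generic mean-aggregation message-passing layer alongside a features-only baseline, not the specific architectures studied in the paper; the function and weight names (`gnn_layer`, `W_self`, `W_neigh`) are hypothetical.

```python
import numpy as np

def gnn_layer(X, A, W_self, W_neigh):
    """One message-passing layer: each node combines its own features (i)
    with the mean of its neighbors' features, read off the edges (ii),
    then applies a ReLU nonlinearity.

    X: (n, d) node feature matrix, A: (n, n) adjacency matrix.
    """
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1  # avoid division by zero for isolated nodes
    neigh_mean = (A @ X) / deg
    return np.maximum(0, X @ W_self + neigh_mean @ W_neigh)

def features_only_layer(X, W_self):
    """Baseline that ignores edges entirely (a per-node MLP layer)."""
    return np.maximum(0, X @ W_self)
```

Comparing the two on the same data is one way to probe the paper's question: if `features_only_layer` matches `gnn_layer` on a task, the edge information contributed little.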

#### One Citation

Enhancing Graph Neural Networks with Boosting

- 2021

In practice, Graph Neural Networks can often only benefit to a small extent from the informative advantage they have compared to their individual components, edges and features. For instance, in some…

#### References

Showing 1–10 of 31 references

How Powerful are Graph Neural Networks?

- Computer Science, Mathematics
- ICLR
- 2019

This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.

Pitfalls of Graph Neural Network Evaluation

- Computer Science, Mathematics
- ArXiv
- 2018

This paper performs a thorough empirical evaluation of four prominent GNN models and suggests that simpler GNN architectures are able to outperform the more sophisticated ones if the hyperparameters and the training procedure are tuned fairly for all models.

Hierarchical Graph Representation Learning with Differentiable Pooling

- Computer Science, Mathematics
- NeurIPS
- 2018

DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.

Learning Convolutional Neural Networks for Graphs

- Computer Science, Mathematics
- ICML
- 2016

This work proposes a framework for learning convolutional neural networks for arbitrary graphs that operate on locally connected regions of the input, and demonstrates that the learned feature representations are competitive with state-of-the-art graph kernels and that their computation is highly efficient.

Generalization and Representational Limits of Graph Neural Networks

- Computer Science, Mathematics
- ICML
- 2020

This work proves that several important graph properties cannot be computed by GNNs that rely entirely on local information, and provides the first data-dependent generalization bounds for message-passing GNNs.

Combining Label Propagation and Simple Models Out-performs Graph Neural Networks

- Computer Science
- ArXiv
- 2020

This work shows that for many standard transductive node classification benchmarks, it can exceed or match the performance of state-of-the-art GNNs by combining shallow models that ignore the graph structure with two simple post-processing steps that exploit correlation in the label structure.

Representation Learning on Graphs with Jumping Knowledge Networks

- Computer Science, Mathematics
- ICML
- 2018

This work explores an architecture -- jumping knowledge (JK) networks -- that flexibly leverages, for each node, different neighborhood ranges to enable better structure-aware representation in graphs.

Semi-Supervised Classification with Graph Convolutional Networks

- Computer Science, Mathematics
- ICLR
- 2017

A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operate directly on graphs, which outperforms related methods by a significant margin.
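For context, the layer-wise propagation rule of this GCN (Kipf & Welling, 2017) is standard and can be sketched as:

```latex
H^{(l+1)} = \sigma\!\left(\tilde{D}^{-\frac{1}{2}} \tilde{A}\, \tilde{D}^{-\frac{1}{2}} H^{(l)} W^{(l)}\right),
\qquad \tilde{A} = A + I_N,
```

where $\tilde{D}$ is the diagonal degree matrix of $\tilde{A}$, $H^{(l)}$ the node representations at layer $l$, and $W^{(l)}$ a learned weight matrix. Note how each layer mixes node features ($H^{(l)}$) with edge structure ($\tilde{A}$) — exactly the combination this paper's title asks about.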

Graph Attention Networks

- Mathematics, Computer Science
- ICLR
- 2018

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…

A Comprehensive Survey on Graph Neural Networks

- Computer Science, Mathematics
- IEEE Transactions on Neural Networks and Learning Systems
- 2019

This article provides a comprehensive overview of graph neural networks (GNNs) in data mining and machine learning fields and proposes a new taxonomy to divide the state-of-the-art GNNs into four categories, namely, recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial–temporal GNNs.