Corpus ID: 227342964

Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results

@article{Tahmasebi2020CountingSW,
  title={Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results},
  author={Behrooz Tahmasebi and Stefanie Jegelka},
  journal={ArXiv},
  year={2020},
  volume={abs/2012.03174}
}
While message-passing-based Graph Neural Networks (GNNs) have become increasingly popular architectures for learning with graphs, recent works have revealed important shortcomings in their expressive power. In response, several higher-order GNNs have been proposed, which substantially increase the expressive power, but at a large computational cost. Motivated by this gap, we introduce and analyze a new recursive pooling technique of local neighborhoods that allows different tradeoffs of… 
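The recursive pooling idea can be sketched concretely: extract a node's local neighborhood, recurse into smaller neighborhoods with a shrinking radius schedule, then pool the recursive results. The sketch below is a minimal illustration under assumed conventions (an adjacency dict of sets, a hypothetical `rnp_embed` helper, and a hash standing in for the learned injective aggregation of the actual model):

```python
def neighborhood(adj, root, radius):
    """Nodes within `radius` hops of `root` (BFS over an adjacency dict of sets)."""
    seen, frontier = {root}, {root}
    for _ in range(radius):
        frontier = {w for v in frontier for w in adj[v]} - seen
        seen |= frontier
    return seen

def rnp_embed(adj, root, radii):
    """Recursively pool representations of ever-smaller neighborhoods."""
    if not radii:
        return 1  # base case; a learned node embedding in the real model
    nodes = neighborhood(adj, root, radii[0])
    sub = {v: adj[v] & nodes for v in nodes}  # induced subgraph
    children = sorted(rnp_embed(sub, v, radii[1:]) for v in nodes)
    return hash((radii[0], tuple(children)))  # stand-in for injective pooling

# toy graph: a 4-cycle with chord 0-2
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2}}
print(rnp_embed(adj, 0, [2, 1]))
```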

Citations

Graph Neural Networks with Local Graph Parameters

This work describes local graph parameter enabled GNNs as a framework for studying GNNs and their higher-order counterparts, and precisely characterizes their distinguishing power in terms of a variant of the WL test and of the graph structural properties they can take into account.
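One instance of a local graph parameter is the number of triangles through each node, computable from the diagonal of A^3. A minimal sketch, assuming a dense numpy adjacency matrix (the paper uses homomorphism counts of several small patterns, not just triangles):

```python
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])          # triangle 0-1-2 plus pendant node 3
triangles_per_node = np.diag(A @ A @ A) // 2  # closed 3-walks / 2
X = np.ones((4, 1))                    # plain initial node features
X_aug = np.hstack([X, triangles_per_node[:, None]])
print(X_aug)  # nodes 0, 1, 2 carry count 1; node 3 carries 0
```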

Equivariant Subgraph Aggregation Networks

A novel framework represents each graph as a set of subgraphs derived by some predefined policy and processes it using a suitable equivariant architecture; this approach is proved to increase the expressive power of both MPNNs and more expressive architectures.
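A minimal sketch of the bag-of-subgraphs idea with the node-deleted policy, where `encode_graph` (here a toy eigenvalue spectrum) is an assumed stand-in for the shared equivariant encoder:

```python
import numpy as np

def encode_graph(A):
    # placeholder encoder: sorted eigenvalue spectrum of the subgraph
    return np.sort(np.linalg.eigvalsh(A))

def bag_of_subgraphs_embedding(A):
    n = A.shape[0]
    deck = [np.delete(np.delete(A, v, 0), v, 1) for v in range(n)]
    # DeepSets-style aggregation over the bag of subgraph encodings
    return sum(encode_graph(S) for S in deck) / n

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])  # the 4-cycle
print(bag_of_subgraphs_embedding(A))
```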

From Stars to Subgraphs: Uplifting Any GNN with Local Structure Awareness

This work introduces a general framework to uplift any MPNN to be more expressive, with limited scalability overhead and greatly improved practical performance; the framework is called GNN-AK (GNN As Kernel), as it resembles a convolutional neural network with the kernel replaced by a GNN.
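A minimal sketch of the GNN-as-kernel pattern: run an encoder on each node's 1-hop ego-net and use the result as that node's new feature. The toy `ego_encode` summary below is an assumed stand-in for the inner GNN:

```python
import numpy as np

def ego_encode(A_sub):
    # toy subgraph summary: (num nodes, num edges) of the ego-net
    return np.array([A_sub.shape[0], A_sub.sum() / 2])

def gnn_ak_layer(A):
    n = A.shape[0]
    feats = []
    for v in range(n):
        ego = [v] + [u for u in range(n) if A[v, u]]
        feats.append(ego_encode(A[np.ix_(ego, ego)]))
    return np.stack(feats)  # one "subgraph-aware" feature row per node

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]])  # a path graph
print(gnn_ak_layer(A))
```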

Going Deeper into Permutation-Sensitive Graph Neural Networks

This work devises an efficient permutation-sensitive aggregation mechanism via permutation groups, capturing pairwise correlations between neighboring nodes, and proves that this approach is strictly more powerful than the 2-dimensional Weisfeiler-Lehman (2-WL) graph isomorphism test and no less powerful than the 3-WL test.
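A hedged sketch of a permutation-sensitive aggregator over a sampled circular ordering of the neighbors; the pair function `g` and the single sampled permutation are illustrative assumptions, not the paper's exact group construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x, y):
    return np.tanh(x + 2 * y)   # order-sensitive pairwise interaction

def aggregate(neighbor_feats):
    order = rng.permutation(len(neighbor_feats))   # sampled permutation
    ring = [neighbor_feats[i] for i in order]
    # combine consecutive pairs around the circular ordering
    return sum(g(ring[i], ring[(i + 1) % len(ring)])
               for i in range(len(ring)))

print(aggregate([0.1, 0.5, 0.9]))
```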

Sign and Basis Invariant Networks for Spectral Graph Representation Learning

SignNet and BasisNet are introduced, new neural architectures that are invariant to all requisite symmetries and hence process collections of eigenspaces in a principled manner; they can approximate any continuous function of eigenvectors with the proper invariances.
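The core sign-invariance trick is compact enough to sketch: process each eigenvector v as phi(v) + phi(-v), so the output cannot depend on the arbitrary sign of v. Here `phi` is an assumed stand-in for SignNet's learned network:

```python
import numpy as np

def phi(v):
    return np.tanh(3 * v + 1)       # any fixed nonlinear map

def sign_invariant(v):
    # flipping the sign of v permutes the two summands, leaving the sum fixed
    return phi(v) + phi(-v)

v = np.array([0.5, -0.5, 0.7])
assert np.allclose(sign_invariant(v), sign_invariant(-v))
```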

LMKG: Learned Models for Cardinality Estimation in Knowledge Graphs

This paper develops a framework, termed LMKG, that adopts deep learning approaches for effectively estimating the cardinality of queries over RDF graphs and employs both supervised and unsupervised approaches that adapt to the subgraph patterns and produce more accurate cardinality estimates.
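A minimal sketch of the supervised recipe, with an assumed toy featurization of query patterns and a linear probe standing in for LMKG's actual deep models:

```python
import numpy as np

def featurize(num_triples, num_vars, num_bound):
    # assumed toy encoding of a query subgraph pattern
    return np.array([num_triples, num_vars, num_bound, 1.0])

# toy training data: (pattern features, log10 of the true cardinality)
X = np.stack([featurize(1, 1, 2), featurize(2, 2, 2), featurize(3, 2, 4)])
y = np.array([4.1, 2.7, 1.3])

w, *_ = np.linalg.lstsq(X, y, rcond=None)    # fit the probe
print(10 ** (featurize(2, 1, 3) @ w))        # estimated cardinality
```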

Theory of Graph Neural Networks: Representation and Learning

A selection of the emerging theoretical results on the approximation and learning properties of widely used message-passing GNNs and higher-order GNNs is summarized, focusing on representation, generalization, and extrapolation.

Boosting the Cycle Counting Power of Graph Neural Networks with I2-GNNs

It is proved that subgraph MPNNs fail to count more-than-4-cycles at the node level, implying that node representations cannot correctly encode surrounding substructures like ring systems with more than four atoms; I2-GNNs are proven capable of counting all 3-, 4-, 5-, and 6-cycles, covering common substructures like benzene rings in organic chemistry, while still keeping linear complexity.
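The counting claims can be sanity-checked against classical closed-form identities: triangles from trace(A^3)/6 and 4-cycles from trace(A^4) after subtracting the degenerate closed walks. A worked numpy example (these identities only verify small cases; I2-GNNs learn such counts):

```python
import numpy as np

def count_cycles(A):
    d = A.sum(1)                                   # degree sequence
    m = A.sum() / 2                                # number of edges
    c3 = np.trace(np.linalg.matrix_power(A, 3)) / 6
    # closed 4-walks minus edge back-and-forths and length-2-path walks
    c4 = (np.trace(np.linalg.matrix_power(A, 4))
          - 2 * m - 2 * (d * (d - 1)).sum()) / 8
    return c3, c4

A = np.ones((4, 4), int) - np.eye(4, dtype=int)    # complete graph K4
print(count_cycles(A))   # (4.0, 3.0): four triangles, three 4-cycles
```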


References

Showing 1-10 of 57 references

The Surprising Power of Graph Neural Networks with Random Node Initialization

This paper proves that GNNs with RNI are universal, a first such result for GNNs not relying on computationally demanding higher-order properties, and empirically analyzes the effect of RNI on GNNs, finding that GNNs with RNI outperform standard GNNs.
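RNI itself is a one-line idea, sketched below with illustrative shapes: append random channels to the node features so that otherwise indistinguishable nodes almost surely differ:

```python
import numpy as np

rng = np.random.default_rng(0)

def with_rni(X, extra_dims=4):
    # draw fresh random features per node at every forward pass
    noise = rng.standard_normal((X.shape[0], extra_dims))
    return np.hstack([X, noise])

X = np.ones((5, 3))          # 5 nodes with identical base features
print(with_rni(X).shape)     # (5, 7): rows are now almost surely distinct
```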

Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting

The expressive power of the proposed Graph Substructure Networks (GSN), a topologically-aware message-passing scheme based on substructure encoding, is theoretically analysed, showing that GSN is strictly more expressive than the WL test, and sufficient conditions for universality are provided.

Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks

It is shown that GNNs have the same expressiveness as the Weisfeiler-Leman graph isomorphism heuristic in terms of distinguishing non-isomorphic (sub-)graphs, and a generalization of GNNs is proposed, so-called k-dimensional GNNs (k-GNNs), which can take higher-order graph structures at multiple scales into account.
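The Weisfeiler-Leman yardstick is easy to sketch: iteratively recolor each node by its current color together with the multiset of neighbor colors. The demo below shows the classic failure case motivating higher-order variants: 1-WL cannot separate the 6-cycle from two disjoint triangles, since both are 2-regular:

```python
def wl_colors(adj, rounds=3):
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        # signature = own color + sorted multiset of neighbor colors
        sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
               for v in adj}
        relabel = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        colors = {v: relabel[sig[v]] for v in adj}
    return sorted(colors.values())   # the final color multiset

c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_colors(c6) == wl_colors(two_triangles))  # True: 1-WL cannot tell
```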

Can graph neural networks count substructures?

A local relational pooling approach, inspired by Murphy et al. (2019), is proposed and demonstrated to be not only effective for substructure counting but also able to achieve competitive performance on real-world tasks.
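A minimal sketch of (local) relational pooling: make a permutation-sensitive function invariant again by averaging it over orderings of the neighborhood. Exact averaging has factorial cost, which is why practical variants subsample; the function `f` here is an illustrative assumption:

```python
from itertools import permutations

def f(ordered_feats):
    # deliberately order-sensitive: position-weighted sum
    return sum((i + 1) * x for i, x in enumerate(ordered_feats))

def local_relational_pooling(neighbor_feats):
    perms = list(permutations(neighbor_feats))
    return sum(f(p) for p in perms) / len(perms)

print(local_relational_pooling([1.0, 2.0, 5.0]))  # 16.0
print(local_relational_pooling([5.0, 1.0, 2.0]))  # 16.0: order no longer matters
```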

Provably Powerful Graph Networks

This paper proposes a simple model that interleaves standard multilayer perceptrons (MLPs) applied to the feature dimension with matrix multiplication, and shows that a reduced 2-order network containing just a scaled identity operator, augmented with a single quadratic operation (matrix multiplication), has provable 3-WL expressive power.
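A hedged sketch of such a block, assuming illustrative widths: two MLPs act along the feature dimension of an n x n x d pairwise tensor, and matrix multiplication over the node dimensions supplies the single quadratic operation:

```python
import torch
import torch.nn as nn

class PPGNBlock(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.mlp1 = nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU())
        self.mlp2 = nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU())

    def forward(self, X):              # X: (n, n, d_in) pairwise tensor
        A = self.mlp1(X)               # feature-wise MLPs
        B = self.mlp2(X)
        # matrix multiplication over node indices, per feature channel
        return torch.einsum('ikd,kjd->ijd', A, B)

X = torch.randn(5, 5, 8)               # toy pairwise representation
print(PPGNBlock(8, 16)(X).shape)        # torch.Size([5, 5, 16])
```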

Reconstruction for Powerful Graph Representations

This work shows the extent to which graph reconstruction, reconstructing a graph from its subgraphs, can mitigate the theoretical and practical problems currently faced by GRL architectures, and demonstrates how it boosts state-of-the-art GNNs' performance across nine real-world benchmark datasets.

On the equivalence between graph isomorphism testing and function approximation with GNNs

It is proved that order-2 graph G-invariant networks fail to distinguish non-isomorphic regular graphs with the same degree; this analysis motivates a new architecture, Ring-GNN, which succeeds in distinguishing these graphs and provides improvements on real-world social network datasets.

Characterizing the Expressive Power of Invariant and Equivariant Graph Neural Networks

The first approximation guarantees for practical GNNs are proved, paving the way for a better understanding of their generalization.

How Powerful are Graph Neural Networks?

This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
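The provably most expressive architecture referenced here is GIN; its update rule is short enough to sketch (widths and the two-layer MLP are illustrative choices):

```python
import torch
import torch.nn as nn

class GINLayer(nn.Module):
    def __init__(self, d):
        super().__init__()
        self.eps = nn.Parameter(torch.zeros(1))
        self.mlp = nn.Sequential(nn.Linear(d, d), nn.ReLU(), nn.Linear(d, d))

    def forward(self, A, H):
        # h_v <- MLP((1 + eps) * h_v + sum of neighbor h_u),
        # with sum aggregation keeping the multiset map injective
        return self.mlp((1 + self.eps) * H + A @ H)

A = torch.tensor([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]])  # path graph
H = torch.randn(3, 4)
print(GINLayer(4)(A, H).shape)         # torch.Size([3, 4])
```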

Subgraph Pattern Neural Networks for High-Order Graph Evolution Prediction

This work generalizes traditional node/link prediction tasks in dynamic heterogeneous networks to joint prediction over larger k-node induced subgraphs, and it significantly outperforms other state-of-the-art methods designed for static and/or single node/link prediction tasks.
...