Corpus ID: 225067240

Towards Scale-Invariant Graph-related Problem Solving by Iterative Homogeneous Graph Neural Networks

@article{Tang2020TowardsSG,
  title={Towards Scale-Invariant Graph-related Problem Solving by Iterative Homogeneous Graph Neural Networks},
  author={Hao Tang and Zhiao Huang and Jia-Yuan Gu and Bao-Liang Lu and Hao Su},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.13547}
}
Current graph neural networks (GNNs) lack generalizability with respect to scale (graph size, graph diameter, edge weights, etc.) when solving many graph analysis problems. Taking the perspective of synthesizing graph theory programs, we propose several extensions to address the issue. First, inspired by the dependency of the iteration number of common graph theory algorithms on graph size, we learn to terminate the message passing process in GNNs adaptively according to the computation…
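
The core mechanism, message passing whose iteration count adapts to the input, can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the update is a hand-picked max-aggregation, and `terminate_score` is a hypothetical stand-in for the learned termination head.

```python
import numpy as np

def message_passing_step(h, adj):
    # One homogeneous update: each node merges its own state with its
    # neighbors' states via elementwise max (an illustrative choice).
    new_h = h.copy()
    n = h.shape[0]
    for v in range(n):
        for u in range(n):
            if adj[u, v]:
                new_h[v] = np.maximum(new_h[v], h[u])
    return new_h

def terminate_score(h, prev_h):
    # Hypothetical stand-in for a learned termination head; here it
    # simply reports whether the node states have stopped changing.
    return 1.0 if np.allclose(h, prev_h) else 0.0

def iterative_gnn(h, adj, max_iters=100, threshold=0.5):
    # The number of message-passing rounds is decided at run time, so
    # it can scale with graph size/diameter instead of being fixed.
    for _ in range(max_iters):
        new_h = message_passing_step(h, adj)
        done = terminate_score(new_h, h) > threshold
        h = new_h
        if done:
            break
    return h

# toy usage: propagate a source indicator until it stops changing
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
h = np.array([[1.0], [0.0], [0.0]])
print(iterative_gnn(h, adj).ravel())  # [1. 1. 1.]
```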

From Local Structures to Size Generalization in Graph Neural Networks

This work identifies an important type of data for which generalization from small to large graphs is challenging: graph distributions in which the local structure depends on the graph size. It proves that when the local structures of small and large graphs differ, GNNs are not guaranteed to generalize across sizes.

On Size Generalization in Graph Neural Networks

It is shown that even for very simple tasks, GNNs do not naturally generalize to graphs of larger size, and that their generalization performance is closely related to the distribution of connectivity and feature patterns, and to how that distribution changes from small to large graphs.

Relational Attention: Generalizing Transformers for Graph-Structured Tasks

This paper generalizes transformer attention to consider and update edge vectors in each transformer layer, and demonstrates that this relational transformer dramatically outperforms state-of-the-art graph neural networks expressly designed to reason over graph-structured data.
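
As a rough sketch of the mechanism this summary describes (single head, hypothetical weight matrices, no layer norms or MLPs), attention keys and values are conditioned on per-edge vectors, and the edge vectors themselves are updated from their endpoints each layer:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def relational_attention_layer(H, E, Wq, Wk, Wv, We, Wu):
    # H: (n, d) node vectors; E: (n, n, d) edge vectors.
    # Keys and values mix the neighbor state with the edge vector, and
    # every edge vector is updated from its endpoints, so relational
    # information flows through the layer alongside node information.
    n, d = H.shape
    H_new = np.zeros_like(H)
    E_new = np.zeros_like(E)
    for i in range(n):
        q = H[i] @ Wq
        keys = np.stack([H[j] @ Wk + E[i, j] @ We for j in range(n)])
        vals = np.stack([H[j] @ Wv + E[i, j] @ We for j in range(n)])
        attn = softmax(keys @ q / np.sqrt(d))
        H_new[i] = attn @ vals
        for j in range(n):
            E_new[i, j] = np.tanh((H[i] + H[j]) @ Wu + E[i, j])
    return H_new, E_new

n, d = 3, 4
rng = np.random.default_rng(0)
H, E = rng.normal(size=(n, d)), rng.normal(size=(n, n, d))
Ws = [rng.normal(size=(d, d)) * 0.1 for _ in range(5)]
H2, E2 = relational_attention_layer(H, E, *Ws)
```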

Towards Better Out-of-Distribution Generalization of Neural Algorithmic Reasoning Tasks

An attention-based two-dimensional Weisfeiler-Leman (2-WL) graph neural network processor is proposed that complements message-passing GNNs; their combination outperforms the state-of-the-art model by a 3% margin averaged over all algorithms.

Pathfinding Neural Cellular Automata

It is shown that adversarially evolving mazes leads to increased generalization on out-of-distribution examples, while at the same time generating datasets with significantly more complex solutions for reasoning tasks.
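
For intuition about what a pathfinding cellular automaton has to learn, a hand-coded analogue is sketched below (not the paper's model): every cell repeatedly applies the same local rule, and the resulting field converges to shortest-path distances from the goal.

```python
def ca_distance_field(walls, goal, steps):
    # Each step applies one uniform local rule per cell: take the
    # minimum neighbor distance plus one. Iterating this rule solves
    # the maze, which is the behavior a pathfinding NCA must learn.
    h, w = len(walls), len(walls[0])
    INF = float("inf")
    dist = [[INF] * w for _ in range(h)]
    gy, gx = goal
    dist[gy][gx] = 0.0
    for _ in range(steps):
        new = [row[:] for row in dist]
        for y in range(h):
            for x in range(w):
                if walls[y][x]:
                    continue
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        new[y][x] = min(new[y][x], dist[ny][nx] + 1)
        dist = new
    return dist

maze = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(ca_distance_field(maze, goal=(0, 0), steps=6)[2][0])  # 6.0
```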

Reasoning-Modulated Representations

A common setting is studied in which the task is not purely opaque: very often one has access to side information about the underlying system (e.g. that observations must obey certain laws of physics), which can be used to modulate learned representations.

Persistent Message Passing

This paper shows on dynamic range querying that the method provides significant gains over overwriting-based GNNs, both in- and out-of-distribution, a step towards general-purpose algorithms that can be neurally executed.

Revisiting Transformation Invariant Geometric Deep Learning: Are Initial Representations All You Need?

This work revisits why existing neural networks cannot maintain transformation invariance when handling geometric data, and proposes Transformation Invariant Neural Networks (TinvNN), a straightforward and general framework for geometric data.
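
The premise is easy to check concretely. The sketch below is illustrative only (the paper's TinvNN pipeline additionally feeds distance-based coordinates into a backbone network); it verifies that pairwise distances, a natural transformation-invariant initial representation, are unaffected by rotation and translation:

```python
import numpy as np

def invariant_init(points):
    # Pairwise Euclidean distances are unchanged by rotation,
    # reflection, and translation of the point cloud.
    diff = points[:, None, :] - points[None, :, :]
    return np.linalg.norm(diff, axis=-1)

pts = np.random.randn(5, 2)
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
moved = pts @ R.T + np.array([3.0, -1.0])   # rotate + translate
assert np.allclose(invariant_init(pts), invariant_init(moved))
```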

References


How Powerful are Graph Neural Networks?

This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
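
One way to see the expressiveness gap the paper formalizes: mean aggregation cannot distinguish neighbor multisets that sum aggregation can, which is why the proposed architecture (GIN) aggregates by summation. A tiny illustration:

```python
import numpy as np

# Two different neighbor-feature multisets that mean aggregation
# collapses but sum aggregation keeps apart:
a = np.array([1.0, 1.0, 2.0, 2.0])   # multiset {1, 1, 2, 2}
b = np.array([1.0, 2.0])             # multiset {1, 2}

print(a.mean(), b.mean())  # 1.5 1.5 -> indistinguishable
print(a.sum(),  b.sum())   # 6.0 3.0 -> distinguishable
```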

Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks

It is shown that GNNs have the same expressiveness as the Weisfeiler-Leman graph isomorphism heuristic in terms of distinguishing non-isomorphic (sub-)graphs, and a generalization of GNNs is proposed, so-called $k$-dimensional GNNs ($k$-GNNs), which can take higher-order graph structures at multiple scales into account.
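
For reference, one-dimensional Weisfeiler-Leman color refinement, the ceiling on standard message-passing GNN expressiveness, fits in a few lines; the classic failure case below (a 6-cycle versus two disjoint triangles, both 2-regular) is the kind of example that motivates the higher-order $k$-GNNs:

```python
from collections import Counter

def wl_refinement(adj, rounds=3):
    # 1-WL: repeatedly rehash each node's color together with the
    # multiset of its neighbors' colors, then compact the labels.
    n = len(adj)
    colors = [0] * n
    for _ in range(rounds):
        signatures = [
            (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in range(n)
        ]
        palette = {s: i for i, s in enumerate(sorted(set(signatures)))}
        colors = [palette[s] for s in signatures]
    return Counter(colors)

cycle6 = {0: [1, 5], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4, 0]}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_refinement(cycle6) == wl_refinement(triangles))  # True: not distinguished
```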

Pointer Graph Networks

Pointer Graph Networks (PGNs) are introduced which augment sets or graphs with additional inferred edges for improved model expressivity and can learn parallelisable variants of pointer-based data structures, namely disjoint set unions and link/cut trees.
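
For context, the disjoint set union structure that PGNs learn to emulate is itself tiny; a minimal classical implementation (the data structure, not the PGN model):

```python
class DisjointSetUnion:
    # Pointer-based data structure: each node stores a pointer to its
    # parent, and operations rewire those pointers.
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        # path compression: point nodes toward the root as we walk up
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

dsu = DisjointSetUnion(5)
dsu.union(0, 1); dsu.union(3, 4)
print(dsu.find(0) == dsu.find(1))  # True
print(dsu.find(0) == dsu.find(3))  # False
```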

EdgeNets: Edge Varying Graph Neural Networks

A general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNets is put forth, and it is shown that GATs are GCNNs operating on a graph learned from the features, which opens the door to developing alternative attention mechanisms with improved discriminatory power.
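
A much-simplified scalar-feature sketch of the edge-varying filter idea (the shapes and tap composition here are assumptions for illustration): each tap $k$ applies its own matrix $\Phi_k$ supported on the graph's edges, rather than a scalar coefficient times a shared shift operator:

```python
import numpy as np

def edge_varying_filter(x, phis):
    # x: (n,) signal; phis: list of K (n, n) matrices, each supported
    # only on the graph's edges (plus diagonal). Tap k applies the
    # composition Phi_k ... Phi_1, and all tap outputs are summed.
    # With every Phi_k a scalar multiple of one fixed shift operator,
    # this collapses to an ordinary polynomial graph filter.
    y = x.copy()
    state = x.copy()
    for Phi in phis:
        state = Phi @ state
        y = y + state
    return y
```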

Towards Sparse Hierarchical Graph Classifiers

This work combines several recent advances in graph neural network design to demonstrate that competitive hierarchical graph classification results are possible without sacrificing sparsity.
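
A miniature version of the sparse pooling step this line of work relies on (in the paper the scores come from a learned projection; here they are given): keep the k highest-scoring nodes and the subgraph they induce, with no dense cluster-assignment matrix ever materialized:

```python
import numpy as np

def topk_pool(H, adj, scores, k):
    # Select the k top-scoring nodes, gate their features by the score
    # (so the selection receives gradients when scores are learned),
    # and restrict the adjacency matrix to the induced subgraph.
    idx = np.argsort(scores)[-k:]
    return H[idx] * np.tanh(scores[idx])[:, None], adj[np.ix_(idx, idx)]

H = np.random.randn(5, 8)
adj = (np.random.rand(5, 5) < 0.4).astype(float)
H_pooled, adj_pooled = topk_pool(H, adj, scores=np.random.randn(5), k=3)
print(H_pooled.shape, adj_pooled.shape)  # (3, 8) (3, 3)
```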

Neural Execution of Graph Algorithms

It is demonstrated how learning in the space of algorithms can yield new opportunities for positive transfer between tasks, showing how learning a shortest-path algorithm can be substantially improved when simultaneously learning a reachability algorithm.
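
The positive-transfer observation has a clean classical analogue, hand-coded below: Bellman-Ford relaxation and reachability are the same message-passing recurrence over different semirings (min-plus versus Boolean), so supervising one task provides structure for the other.

```python
import math

def bellman_ford_messages(n, edges, source):
    # Shortest paths as message passing: each round, every node takes
    # the min over distance messages from its in-neighbors.
    # Reachability is the identical recurrence with (or, and) in place
    # of (min, +).
    dist = [math.inf] * n
    reach = [False] * n
    dist[source], reach[source] = 0.0, True
    for _ in range(n - 1):
        for u, v, w in edges:
            dist[v] = min(dist[v], dist[u] + w)
            reach[v] = reach[v] or reach[u]
    return dist, reach

edges = [(0, 1, 2.0), (1, 2, 1.0), (0, 2, 5.0)]
print(bellman_ford_messages(3, edges, 0))
# ([0.0, 2.0, 3.0], [True, True, True])
```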

Relational Pooling for Graph Representations

This work generalizes graph neural networks (GNNs) beyond those based on the Weisfeiler-Lehman (WL) algorithm, graph Laplacians, and diffusions to provide a framework with maximal representation power for graphs.
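
The framework's exact (and exponentially expensive) core construction can be written directly: average an arbitrary, possibly permutation-sensitive function over all node orderings, yielding a permutation-invariant representation. The function `f` below is a hypothetical example chosen to be order-sensitive:

```python
import itertools
import numpy as np

def relational_pooling(adj, f):
    # Average f over every relabeling of the graph; the result is
    # invariant to how the input nodes happen to be ordered.
    # Tractable only for tiny graphs.
    n = adj.shape[0]
    perms = list(itertools.permutations(range(n)))
    total = 0.0
    for p in perms:
        P = np.eye(n)[list(p)]
        total += f(P @ adj @ P.T)
    return total / len(perms)

# f reads the upper-triangular entries in a fixed order with fixed
# weights, which is NOT permutation-invariant on its own.
f = lambda A: float((A[np.triu_indices_from(A, k=1)] * [1, 2, 3]).sum())
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
P = np.eye(3)[[2, 0, 1]]
print(relational_pooling(A, f), relational_pooling(P @ A @ P.T, f))  # equal
```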

Gated Graph Sequence Neural Networks

This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.

Representation Learning on Graphs with Jumping Knowledge Networks

This work explores an architecture -- jumping knowledge (JK) networks -- that flexibly leverages, for each node, different neighborhood ranges to enable better structure-aware representation in graphs.
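
The simplest combiner in that design, concatenating each node's representation from every layer so that downstream layers can select a neighborhood range per node, is a one-liner (the paper also studies max-pooling and LSTM-attention combiners):

```python
import numpy as np

def jk_concat(layer_states):
    # Jumping Knowledge aggregation: keep the node embeddings from
    # every layer, not just the last, and concatenate them so each
    # node can draw on whichever neighborhood range suits it.
    return np.concatenate(layer_states, axis=-1)

h1 = np.random.randn(4, 8)   # after 1 hop of message passing
h2 = np.random.randn(4, 8)   # after 2 hops
h3 = np.random.randn(4, 8)   # after 3 hops
print(jk_concat([h1, h2, h3]).shape)  # (4, 24)
```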
...