Permute Me Softly: Learning Soft Permutations for Graph Representations

@article{Nikolentzos2021PermuteMS,
  title={Permute Me Softly: Learning Soft Permutations for Graph Representations},
  author={Giannis Nikolentzos and George Dasoulas and Michalis Vazirgiannis},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2021},
  volume={PP}
}
Graph neural networks (GNNs) have recently emerged as a dominant paradigm for machine learning with graphs. Research on GNNs has mainly focused on the family of message passing neural networks (MPNNs). Similar to the Weisfeiler-Leman (WL) test of isomorphism, these models follow an iterative neighborhood aggregation procedure to update vertex representations, and then compute graph representations by aggregating the representations of the vertices. Although very successful, MPNNs have been…
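
To make the aggregation scheme concrete, here is a minimal NumPy sketch of one message passing round followed by a sum readout; the linear-plus-ReLU update and the specific shapes are illustrative assumptions, not the exact MPNN the paper analyzes.

import numpy as np

def mpnn_layer(A, H, W):
    # Neighborhood aggregation: each vertex sums its neighbors' features
    # (A @ H), combines them with its own, and applies a learned map.
    return np.maximum((H + A @ H) @ W, 0.0)

def readout(H):
    # Graph representation: aggregate the vertex representations.
    return H.sum(axis=0)

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # 3-vertex path
H = rng.normal(size=(3, 4))   # initial vertex features
W = rng.normal(size=(4, 4))
for _ in range(2):            # iterative updates, mirroring WL rounds
    H = mpnn_layer(A, H, W)
graph_repr = readout(H)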

Citations

Weisfeiler and Leman go Hyperbolic: Learning Distance Preserving Node Representations

This paper proposes a distance function between nodes based on the hierarchy produced by the WL algorithm, together with a model that learns node representations preserving those distances; the model achieves performance competitive with state-of-the-art models.
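
As a rough illustration, the sketch below instantiates one plausible WL-based distance: two nodes are closer the longer they keep identical colors during refinement, i.e. the deeper their common ancestor sits in the refinement hierarchy. This particular definition is an assumption for illustration, not necessarily the paper's.

def wl_color_history(adj, rounds=4):
    # Run 1-WL refinement and record every vertex's color at each round.
    colors = [0] * len(adj)
    history = [list(colors)]
    for _ in range(rounds):
        colors = [hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
                  for v in range(len(adj))]
        history.append(list(colors))
    return history

def wl_distance(adj, u, v, rounds=4):
    # Hypothetical distance: the longer u and v share a WL color, the
    # smaller the distance between them.
    shared = sum(1 for c in wl_color_history(adj, rounds) if c[u] == c[v])
    return 1.0 / shared   # shared >= 1, since all colors start equal

path = [[1], [0, 2], [1, 3], [2]]   # path graph 0-1-2-3
print(wl_distance(path, 0, 3))      # symmetric endpoints: small distance
print(wl_distance(path, 0, 1))      # colors diverge at round 1: distance 1.0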

A graph neural network framework for mapping histological topology in oral mucosal tissue

A two-stage machine learning pipeline for generating a cell-graph from a digital H&E-stained tissue image is shown to predict both low- and high-level histological features in oral mucosal tissue with good accuracy.

References

Showing 1-10 of 63 references

Random Walk Graph Neural Networks

This paper proposes a more intuitive and transparent architecture for graph-structured data, the Random Walk Graph Neural Network (RWNN), which consists of a number of trainable “hidden graphs” that are compared against the input graphs using a random walk kernel to produce graph representations.
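
A minimal sketch of the comparison step, assuming a P-step random walk kernel computed on the direct product graph; in RWNN the hidden graphs' adjacency entries are learned end-to-end, which the random initialization below merely stands in for.

import numpy as np

def random_walk_kernel(A1, A2, steps=3):
    # Counts common walks of length 1..steps: walks in the direct product
    # graph (Kronecker product) are simultaneous walks in both graphs.
    Ax = np.kron(A1, A2)
    total, power = 0.0, np.eye(Ax.shape[0])
    for _ in range(steps):
        power = power @ Ax
        total += power.sum()
    return total

rng = np.random.default_rng(1)
A_input = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)  # triangle
hidden = [(G + G.T) / 2 for G in (rng.random((4, 4)) for _ in range(2))]
# one kernel value per hidden graph forms the graph representation
graph_repr = np.array([random_walk_kernel(A_input, Ah) for Ah in hidden])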

Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks

It is shown that GNNs have the same expressiveness as the Weisfeiler-Leman graph isomorphism heuristic in terms of distinguishing non-isomorphic (sub-)graphs, and a generalization of GNNs is proposed, the so-called k-dimensional GNNs (k-GNNs), which can take higher-order graph structures at multiple scales into account.
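
The equivalence with 1-WL is visible in the heuristic itself; the sketch below shows the classic failure case (a 6-cycle versus two triangles) that 1-WL, and hence any standard MPNN, cannot distinguish, which is what motivates the higher-order k-GNNs.

from collections import Counter

def wl_histogram(adj, rounds=3):
    # 1-WL color refinement: repeatedly hash each vertex's color together
    # with the sorted multiset of its neighbors' colors.
    colors = [0] * len(adj)
    for _ in range(rounds):
        colors = [hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
                  for v in range(len(adj))]
    return Counter(colors)

cycle6 = [[1, 5], [0, 2], [1, 3], [2, 4], [3, 5], [4, 0]]
two_triangles = [[1, 2], [0, 2], [0, 1], [4, 5], [3, 5], [3, 4]]
# Identical color histograms: the two non-isomorphic graphs look the same.
assert wl_histogram(cycle6) == wl_histogram(two_triangles)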

Parameterized Hypercomplex Graph Neural Networks for Graph Classification

This work develops graph neural networks that leverage the properties of hypercomplex feature transformation and presents empirical evidence that the proposed model incorporates a regularization effect, alleviating the risk of overfitting.
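
As a hedged sketch of the underlying idea: a quaternion-style hypercomplex linear map ties four weight blocks together through the Hamilton product, so it needs a quarter of the free parameters of a dense real map of the same size. The paper's parameterized layers go further and learn the multiplication rule itself, which is omitted here.

import numpy as np

def quaternion_linear(x, r, i, j, k):
    # The Hamilton product fixes how the four weight blocks (r, i, j, k)
    # are shared across the 4x4 block matrix: 4x fewer free parameters.
    W = np.block([[ r, -i, -j, -k],
                  [ i,  r, -k,  j],
                  [ j,  k,  r, -i],
                  [ k, -j,  i,  r]])
    return x @ W.T

rng = np.random.default_rng(2)
d = 2                                  # width per quaternion component
r, i, j, k = (rng.normal(size=(d, d)) for _ in range(4))
x = rng.normal(size=(5, 4 * d))        # 5 node features, 4 stacked components
h = quaternion_linear(x, r, i, j, k)   # shape (5, 4 * d)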

How Powerful are Graph Neural Networks?

This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive in the class of GNNs.
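
That architecture is GIN; a minimal NumPy sketch of its update follows (random weights stand in for trained ones). Injective sum aggregation over the neighbor multiset is what achieves 1-WL power.

import numpy as np

def gin_layer(A, H, eps, mlp):
    # GIN update: h_v <- MLP((1 + eps) * h_v + sum of neighbor features).
    # Sum (not mean/max) aggregation keeps the neighbor multiset injective.
    return mlp((1.0 + eps) * H + A @ H)

rng = np.random.default_rng(3)
W1, W2 = rng.normal(size=(4, 8)), rng.normal(size=(8, 4))
mlp = lambda Z: np.maximum(Z @ W1, 0.0) @ W2
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)  # star on 3 nodes
H = rng.normal(size=(3, 4))
H = gin_layer(A, H, eps=0.1, mlp=mlp)
graph_repr = H.sum(axis=0)   # sum readout over vertices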

Provably Powerful Graph Networks

This paper proposes a simple model that interleaves applications of a standard multilayer perceptron (MLP) along the feature dimension with matrix multiplication, and shows that a reduced second-order network containing just the scaled identity operator, augmented with a single quadratic operation (matrix multiplication), has provable 3-WL expressive power.
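
A minimal sketch of one such block, assuming single-layer ReLU maps for the feature-dimension MLPs and a plain skip concatenation; the per-channel matrix multiplication is the quadratic operation that lifts the network toward 3-WL power.

import numpy as np

def ppgn_block(X, W1, W2):
    # X is an (n, n, d) tensor (e.g. adjacency plus node features on the
    # diagonal). The MLPs act along the feature dimension only; the matrix
    # multiplication mixes information along the node dimensions.
    M1 = np.maximum(X @ W1, 0.0)              # (n, n, d')
    M2 = np.maximum(X @ W2, 0.0)              # (n, n, d')
    Y = np.einsum('ikc,kjc->ijc', M1, M2)     # channel-wise matmul
    return np.concatenate([Y, X], axis=-1)    # skip connection

rng = np.random.default_rng(4)
n, d, dp = 5, 3, 4
X = rng.normal(size=(n, n, d))
Y = ppgn_block(X, rng.normal(size=(d, dp)), rng.normal(size=(d, dp)))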

Hierarchical Graph Representation Learning with Differentiable Pooling

DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.
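
The pooling step itself is two matrix products, X' = S^T Z and A' = S^T A S; the sketch below uses random assignment logits where DiffPool would use a second GNN to produce them.

import numpy as np

def diffpool(A, Z, S_logits):
    # Soft cluster assignment S (row softmax) pools features and adjacency,
    # yielding a smaller graph that stays differentiable end-to-end.
    S = np.exp(S_logits)
    S /= S.sum(axis=1, keepdims=True)
    return S.T @ A @ S, S.T @ Z

rng = np.random.default_rng(5)
n, d, c = 6, 4, 2                        # pool 6 nodes into 2 clusters
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T           # random symmetric adjacency
Z = rng.normal(size=(n, d))              # node embeddings from any GNN
A_coarse, X_coarse = diffpool(A, Z, rng.normal(size=(n, c)))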

Relational Pooling for Graph Representations

This work generalizes graph neural networks (GNNs) beyond those based on the Weisfeiler-Lehman (WL) algorithm, graph Laplacians, and diffusions to provide a framework with maximal representation power for graphs.
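
The core construction can be sketched directly: average an arbitrary, even permutation-sensitive, function over all vertex orderings, which restores invariance while retaining maximal expressive power. Exhaustive enumeration is only feasible for tiny graphs; the paper resorts to sampling-based approximations.

import numpy as np
from itertools import permutations

def relational_pool(A, X, f):
    # Average f over every simultaneous permutation of rows/columns of A
    # and rows of X; the mean is permutation-invariant even if f is not.
    outs = [f(A[np.ix_(p, p)], X[p])
            for p in map(list, permutations(range(A.shape[0])))]
    return np.mean(outs, axis=0)

rng = np.random.default_rng(6)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X = rng.normal(size=(3, 2))
w = rng.normal(size=A.size + X.size)
f = lambda A, X: np.concatenate([A.ravel(), X.ravel()]) @ w  # order-sensitive
graph_repr = relational_pool(A, X, f)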

Weisfeiler and Leman go sparse: Towards scalable higher-order graph embeddings

The experimental study confirms that the local algorithms, in both their kernel and neural instantiations, lead to vastly reduced computation times and prevent overfitting; the kernel version establishes a new state of the art for graph classification on a wide range of benchmark datasets.

An End-to-End Deep Learning Architecture for Graph Classification

This paper designs a localized graph convolution model, shows its connection with two graph kernels, and proposes a novel SortPooling layer that sorts graph vertices in a consistent order so that traditional neural networks can be trained on graphs.
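
A sketch of the layer in isolation: vertices are sorted by their last feature channel (a WL-color-like score produced by the preceding graph convolutions), then truncated or zero-padded to a fixed k so conventional layers can follow.

import numpy as np

def sort_pooling(H, k):
    # Sort vertices in a consistent order (descending by the last channel),
    # then truncate or zero-pad to exactly k rows.
    H = H[np.argsort(-H[:, -1])][:k]
    if H.shape[0] < k:
        H = np.vstack([H, np.zeros((k - H.shape[0], H.shape[1]))])
    return H

rng = np.random.default_rng(7)
H = rng.normal(size=(5, 3))    # vertex features after graph convolutions
out = sort_pooling(H, k=8)     # always (8, 3), regardless of graph size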

Directional Graph Networks

This paper proposes the first method that exploits vector flows over graphs to develop globally consistent directional and asymmetric aggregation functions, and shows that the resulting directional graph networks (DGNs) generalize convolutional neural networks (CNNs) when applied on a grid.
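
A rough sketch of the directional idea, assuming the Fiedler eigenvector of the graph Laplacian as the underlying vector field; the normalization and the single derivative-like aggregator below are simplifications of the DGN aggregators.

import numpy as np

def directional_aggregation(A, X, phi):
    # Weight each edge by the signed gradient of phi along it, giving an
    # asymmetric, globally consistent direction for aggregation.
    F = A * (phi[None, :] - phi[:, None])
    F /= np.abs(F).sum(axis=1, keepdims=True) + 1e-9
    return F @ X

A = np.array([[0, 1, 0, 0], [1, 0, 1, 0],
              [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)   # path graph
L = np.diag(A.sum(axis=1)) - A
phi = np.linalg.eigh(L)[1][:, 1]    # Fiedler vector: a global direction
X = np.random.default_rng(8).normal(size=(4, 2))
H_dir = directional_aggregation(A, X, phi)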
...