# Mincut pooling in Graph Neural Networks

```bibtex
@article{Bianchi2019MincutPI,
  title   = {Mincut pooling in Graph Neural Networks},
  author  = {Filippo Maria Bianchi and Daniele Grattarola and Cesare Alippi},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1907.00481}
}
```

The advance of node pooling operations in Graph Neural Networks (GNNs) has lagged behind the feverish design of new message-passing techniques, and pooling remains an important and challenging endeavor for the design of deep architectures. In this paper, we propose a pooling operation for GNNs that leverages a differentiable unsupervised loss based on the mincut optimization objective. For each node, our method learns a soft cluster assignment vector that depends on the node features, the…
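
The two-term unsupervised loss the abstract alludes to can be sketched in a few lines. The following is a minimal numpy illustration of the relaxed normalized-mincut term and the orthogonality regularizer described in the paper, not the authors' implementation; the function name and the dense-matrix setup are assumptions made for clarity.

```python
import numpy as np

def mincut_pool_losses(A, S):
    """Unsupervised losses used by mincut pooling (sketch).

    A: (N, N) symmetric adjacency matrix.
    S: (N, K) soft cluster assignment matrix (rows sum to 1).
    Returns (cut_loss, ortho_loss).
    """
    D = np.diag(A.sum(axis=1))  # degree matrix
    K = S.shape[1]
    # Relaxed normalized mincut: minimized (approaching -1) when clusters
    # are densely connected internally and weakly connected to each other.
    cut_loss = -np.trace(S.T @ A @ S) / np.trace(S.T @ D @ S)
    # Orthogonality term: pushes the assignments toward balanced,
    # near-orthogonal (i.e. non-degenerate) clusters.
    StS = S.T @ S
    ortho_loss = np.linalg.norm(
        StS / np.linalg.norm(StS, "fro") - np.eye(K) / np.sqrt(K), "fro"
    )
    return cut_loss, ortho_loss
```

On a graph made of two disconnected triangles with a perfect hard assignment, the cut loss reaches -1 and the orthogonality loss vanishes, which is the regime both terms reward.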


#### 15 Citations

Deep Graph Mapper: Seeing Graphs Through the Neural Lens

- Computer Science, Medicine
- Frontiers in Big Data
- 2021

This work establishes Mapper as a topological framework for graph pooling by proving that Mapper generalizes pooling methods based on soft cluster assignments, and shows how easily novel pooling algorithms can be designed that obtain competitive results against other state-of-the-art methods.

Graph Pooling via Coarsened Graph Infomax

- Computer Science
- SIGIR
- 2021

Coarsened Graph Infomax Pooling (CGIPool) is proposed, which maximizes the mutual information between the input and the coarsened graph of each pooling layer to preserve graph-level dependencies, and applies contrastive learning and a self-attention-based algorithm for learning positive and negative samples.

Memory-Based Graph Networks

- Computer Science, Mathematics
- ICLR
- 2020

An efficient memory layer for GNNs is introduced that can learn to jointly perform graph representation learning and graph pooling. Two new networks based on this memory layer, the Memory-Based Graph Neural Network (MemGNN) and the Graph Memory Network (GMN), learn hierarchical graph representations by coarsening the graph throughout the layers of memory.

Memory-Based Graph Networks (preprint version)

- 2019

Graph Neural Networks (GNNs) are deep models that operate on data with arbitrary topology represented as graphs. We introduce an efficient memory layer for GNNs that can jointly learn node…

What Graph Neural Networks Cannot Learn: Depth vs Width

- 2019

This paper studies the expressive power of graph neural networks falling within the message-passing framework (GNNmp). Two results are presented. First, GNNmp are shown to be Turing universal under…

What graph neural networks cannot learn: depth vs width

- Computer Science, Mathematics
- ICLR
- 2020

GNNmp are shown to be Turing universal under sufficient conditions on their depth, width, node attributes, and layer expressiveness, and it is discovered that GNNmp can lose a significant portion of their power when their depth and width are restricted.

Multi-hop Graph Convolutional Network with High-order Chebyshev Approximation for Text Reasoning

- Computer Science
- ACL/IJCNLP
- 2021

The spectral graph convolutional network with high-order dynamic Chebyshev approximation (HDGCN) is defined, which augments multi-hop graph reasoning by fusing messages aggregated from direct and long-term dependencies into one convolutional layer.

Learning Feature Aggregation for Deep 3D Morphable Models

- Computer Science
- CVPR
- 2021

This work focuses on deep 3D morphable models that directly apply deep learning to 3D mesh data with a hierarchical structure to capture information at multiple scales, and proposes an attention-based module to learn mapping matrices for better feature aggregation across hierarchical levels.

Clustered Dynamic Graph CNN for Biometric 3D Hand Shape Recognition

- Computer Science
- 2020 IEEE International Joint Conference on Biometrics (IJCB)
- 2020

This work proposes a novel approach to 3D hand shape recognition from RGB-D data based on geometric deep learning techniques, and shows encouraging performance compared to diverse baselines on the new data, as well as on the current benchmark dataset HKPolyU.

Fine-Grained Urban Flow Prediction

- Computer Science
- WWW
- 2021

This work presents a Spatio-Temporal Relation Network (STRN) comprising a Global Relation Module (GloNet), which captures global spatial dependencies far more efficiently than existing methods, and a Meta Learner, which takes external factors and land functions as inputs to produce meta knowledge and boost model performance.

#### References

Showing 1–10 of 50 references

Hierarchical Graph Representation Learning with Differentiable Pooling

- Computer Science, Mathematics
- NeurIPS
- 2018

DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.
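
The coarsening step DiffPool describes reduces both node features and adjacency through a learned soft assignment matrix. Below is a minimal numpy sketch of that step only; the function name and dense matrices are assumptions, and in the actual module the assignment S is produced by a GNN trained end-to-end.

```python
import numpy as np

def diffpool_coarsen(A, X, S):
    """One DiffPool-style coarsening step (sketch).

    A: (N, N) adjacency matrix.
    X: (N, F) node feature matrix.
    S: (N, K) soft cluster assignment (rows sum to 1).
    Returns coarsened features (K, F) and coarsened adjacency (K, K).
    """
    X_coarse = S.T @ X      # aggregate node features into cluster features
    A_coarse = S.T @ A @ S  # aggregate edge weights into cluster connectivity
    return X_coarse, A_coarse
```

With a hard assignment, each coarse feature is the sum of its cluster's node features and each coarse edge weight is the total edge weight between the two clusters, which is the intuition the soft version relaxes.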

Clique pooling for graph classification

- Computer Science, Mathematics
- ArXiv
- 2019

A novel graph pooling operation that uses cliques as the pooling unit is proposed; it is more readily interpretable, a better analogue to image coarsening than filtering or pruning techniques, and entirely nonparametric.

Self-Attention Graph Pooling

- Computer Science, Mathematics
- ICML
- 2019

This paper proposes a graph pooling method based on self-attention using graph convolution, which achieves superior graph classification performance on the benchmark datasets using a reasonable number of parameters.

Fast and Deep Graph Neural Networks

- Computer Science, Mathematics
- AAAI
- 2020

It is shown that, even without training the recurrent connections, a small deep GNN architecture is surprisingly able to achieve or improve state-of-the-art performance on a significant set of graph classification tasks.

Graph Attention Networks

- Mathematics, Computer Science
- ICLR
- 2018

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…

Towards Sparse Hierarchical Graph Classifiers

- Mathematics, Computer Science
- ArXiv
- 2018

This work combines several recent advances in graph neural network design to demonstrate that competitive hierarchical graph classification results are possible without sacrificing sparsity.

Dynamic Edge-Conditioned Filters in Convolutional Neural Networks on Graphs

- Computer Science
- 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2017

This work generalizes the convolution operator from regular grids to arbitrary graphs while avoiding the spectral domain, which allows us to handle graphs of varying size and connectivity.

Convolutional Neural Network Architectures for Signals Supported on Graphs

- Computer Science, Engineering
- IEEE Transactions on Signal Processing
- 2019

Two architectures that generalize convolutional neural networks (CNNs) to the processing of signals supported on graphs are introduced, and multinode aggregation GNNs are shown to be consistently the best-performing GNN architecture on large-scale graphs.

Graph Neural Networks with convolutional ARMA filters

- Computer Science, Mathematics
- IEEE transactions on pattern analysis and machine intelligence
- 2021

A novel graph convolutional layer inspired by the auto-regressive moving average (ARMA) filter is proposed that provides a more flexible frequency response, is more robust to noise, and better captures the global graph structure.

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

- Computer Science, Mathematics
- NIPS
- 2016

This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
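
The "fast localized" filters this reference introduces are Chebyshev polynomial filters evaluated with a three-term recurrence, so no eigendecomposition of the Laplacian is needed. A minimal numpy sketch, assuming a Laplacian already rescaled to have spectrum in [-1, 1] (the function name is an assumption, not the paper's API):

```python
import numpy as np

def cheb_filter(L_norm, x, theta):
    """Apply an order-(K-1) Chebyshev spectral filter (sketch).

    L_norm: (N, N) rescaled graph Laplacian, eigenvalues in [-1, 1]
            (i.e. 2 L / lambda_max - I).
    x: (N,) graph signal.
    theta: length-K sequence of filter coefficients.
    Uses the recurrence T_0(L)x = x, T_1(L)x = Lx,
    T_k(L)x = 2 L T_{k-1}(L)x - T_{k-2}(L)x,
    so each extra order costs only one matrix-vector product.
    """
    T_prev, T_curr = x, L_norm @ x
    out = theta[0] * T_prev
    if len(theta) > 1:
        out = out + theta[1] * T_curr
    for k in range(2, len(theta)):
        T_prev, T_curr = T_curr, 2 * (L_norm @ T_curr) - T_prev
        out = out + theta[k] * T_curr
    return out
```

Because an order-(K-1) polynomial of the Laplacian only mixes nodes up to K-1 hops apart, the filter is localized by construction.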