Corpus ID: 195766901

Mincut pooling in Graph Neural Networks

@article{Bianchi2019MincutPI,
  title={Mincut pooling in Graph Neural Networks},
  author={Filippo Maria Bianchi and Daniele Grattarola and Cesare Alippi},
  journal={ArXiv},
  year={2019},
  volume={abs/1907.00481}
}
The advance of node pooling operations in Graph Neural Networks (GNNs) has lagged behind the feverish design of new message-passing techniques, and pooling remains an important and challenging endeavor for the design of deep architectures. In this paper, we propose a pooling operation for GNNs that leverages a differentiable unsupervised loss based on the mincut optimization objective. For each node, our method learns a soft cluster assignment vector that depends on the node features, the…
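To make the mechanism concrete, the following is a minimal PyTorch sketch of pooling with a learned soft cluster assignment and a relaxed mincut loss, assuming a dense adjacency matrix; the orthogonality regularizer and all names (mincut_pool, s_logits) are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import torch

def mincut_pool(x, adj, s_logits):
    # Illustrative sketch of mincut-style pooling; not necessarily the
    # paper's exact formulation.
    # x: [n, d] node features; adj: [n, n] dense adjacency;
    # s_logits: [n, k] unnormalized assignment scores (e.g. from an MLP on x).
    s = torch.softmax(s_logits, dim=-1)      # soft cluster assignments [n, k]
    x_pool = s.t() @ x                       # pooled node features [k, d]
    adj_pool = s.t() @ adj @ s               # coarsened adjacency [k, k]
    # Relaxed mincut term: within-cluster edges relative to cluster degrees
    deg = torch.diag(adj.sum(dim=-1))
    cut_loss = -torch.trace(s.t() @ adj @ s) / torch.trace(s.t() @ deg @ s)
    # Orthogonality term (assumed): push assignments toward balanced clusters
    ss = s.t() @ s
    k = s.size(-1)
    ortho_loss = torch.norm(ss / torch.norm(ss) - torch.eye(k) / k ** 0.5)
    return x_pool, adj_pool, cut_loss + ortho_loss
```

Because both loss terms are differentiable in s_logits, they can be minimized jointly with any supervised objective by standard backpropagation.

Citations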
Deep Graph Mapper: Seeing Graphs Through the Neural Lens
TLDR
This work demonstrates the suitability of Mapper as a topological framework for graph pooling by proving that Mapper is a generalization of pooling methods based on soft cluster assignments, and shows how easily novel pooling algorithms can be designed that obtain competitive results with other state-of-the-art methods.
Graph Pooling via Coarsened Graph Infomax
TLDR
Coarsened Graph Infomax Pooling (CGIPool) is proposed, which maximizes the mutual information between the input and the coarsened graph of each pooling layer to preserve graph-level dependencies, and applies contrastive learning with a self-attention-based algorithm for learning positive and negative samples.
Memory-Based Graph Networks
TLDR
An efficient memory layer for GNNs is introduced that can learn to jointly perform graph representation learning and graph pooling, along with two new networks based on this layer: the Memory-Based Graph Neural Network (MemGNN) and the Graph Memory Network (GMN), which can learn hierarchical graph representations by coarsening the graph throughout the layers of memory.
What graph neural networks cannot learn: depth vs width
TLDR
GNNmp are shown to be Turing universal under sufficient conditions on their depth, width, node attributes, and layer expressiveness, and it is discovered that GNNmp can lose a significant portion of their power when their depth and width are restricted.
Multi-hop Graph Convolutional Network with High-order Chebyshev Approximation for Text Reasoning
TLDR
The spectral graph convolutional network with a high-order dynamic Chebyshev approximation (HDGCN) is defined, which augments multi-hop graph reasoning by fusing messages aggregated from direct and long-term dependencies into one convolutional layer.
Learning Feature Aggregation for Deep 3D Morphable Models
TLDR
This work focuses on deep 3D morphable models that directly apply deep learning to 3D mesh data with a hierarchical structure to capture information at multiple scales, and proposes an attention-based module to learn mapping matrices for better feature aggregation across hierarchical levels.
Clustered Dynamic Graph CNN for Biometric 3D Hand Shape Recognition
TLDR
This work proposes a novel approach to 3D hand shape recognition from RGB-D data based on geometric deep learning techniques, and shows encouraging performance compared to diverse baselines on the new data, as well as on the current benchmark dataset HKPolyU.
Fine-Grained Urban Flow Prediction
TLDR
This work presents a Spatio-Temporal Relation Network (STRN) comprising a Global Relation Module (GloNet), which captures global spatial dependencies much more efficiently than existing methods, and a Meta Learner that takes external factors and land functions as inputs to produce meta knowledge and boost model performance.

References

Showing 1-10 of 50 references
Hierarchical Graph Representation Learning with Differentiable Pooling
TLDR
DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.
Clique pooling for graph classification
TLDR
A novel graph pooling operation using cliques as the unit pool is proposed; it is more readily interpretable, a better analogue to image coarsening than filtering or pruning techniques, and entirely nonparametric.
Self-Attention Graph Pooling
TLDR
This paper proposes a graph pooling method based on self-attention using graph convolution, which achieves superior graph classification performance on the benchmark datasets using a reasonable number of parameters.
Fast and Deep Graph Neural Networks
TLDR
It is shown that, even without training of the recurrent connections, the architecture of small deep GNNs is surprisingly able to achieve or improve the state-of-the-art performance on a significant set of tasks in the field of graph classification.
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…
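As a concrete companion to this summary, here is a minimal single-head sketch of masked self-attention on a graph, assuming a dense binary adjacency with self-loops; the function name and tensor layout are illustrative, not the reference implementation.

```python
import torch
import torch.nn.functional as F

def gat_attention(h, adj, W, a):
    # Minimal single-head sketch of masked graph attention (illustrative names).
    # h: [n, d_in] node features; adj: [n, n] adjacency with self-loops;
    # W: [d_in, d_out] shared linear map; a: [2 * d_out] attention vector.
    z = h @ W                                              # [n, d_out]
    d_out = z.size(1)
    # e_ij = LeakyReLU(a^T [z_i || z_j]), split as a_1^T z_i + a_2^T z_j
    e = F.leaky_relu(
        (z @ a[:d_out]).unsqueeze(1) + (z @ a[d_out:]).unsqueeze(0),
        negative_slope=0.2,
    )                                                      # [n, n]
    # Masking: coefficients are only kept over each node's neighborhood
    e = e.masked_fill(adj == 0, float("-inf"))
    alpha = torch.softmax(e, dim=-1)                       # per-node weights
    return alpha @ z                                       # weighted aggregation
```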
Towards Sparse Hierarchical Graph Classifiers
TLDR
This work combines several recent advances in graph neural network design to demonstrate that competitive hierarchical graph classification results are possible without sacrificing sparsity.
Dynamic Edge-Conditioned Filters in Convolutional Neural Networks on Graphs
TLDR
This work generalizes the convolution operator from regular grids to arbitrary graphs while avoiding the spectral domain, which allows us to handle graphs of varying size and connectivity.
Convolutional Neural Network Architectures for Signals Supported on Graphs
TLDR
Two architectures that generalize convolutional neural networks (CNNs) to the processing of signals supported on graphs are introduced, and multinode aggregation GNNs are found to be consistently the best-performing GNN architecture for operation on large-scale graphs.
Graph Neural Networks with Convolutional ARMA Filters
TLDR
A novel graph convolutional layer inspired by the auto-regressive moving average (ARMA) filter is proposed that provides a more flexible frequency response, is more robust to noise, and better captures the global graph structure.
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
TLDR
This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
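The localization comes from a Chebyshev polynomial expansion of the filter, evaluated by a three-term recursion that avoids eigendecomposition of the Laplacian. A minimal sketch, assuming a precomputed rescaled Laplacian L_hat = 2L/lambda_max - I and at least two filter coefficients (names are illustrative):

```python
import torch

def cheb_filter(x, L_hat, theta):
    # Illustrative sketch of a K-tap Chebyshev graph filter.
    # x: [n, d] graph signal; L_hat: [n, n] rescaled Laplacian;
    # theta: list of K >= 2 scalar filter coefficients.
    # Recursion: T_0(L)x = x, T_1(L)x = Lx, T_k(L)x = 2L T_{k-1}x - T_{k-2}x
    t_prev, t_curr = x, L_hat @ x
    out = theta[0] * t_prev + theta[1] * t_curr
    for k in range(2, len(theta)):
        t_prev, t_curr = t_curr, 2 * (L_hat @ t_curr) - t_prev
        out = out + theta[k] * t_curr
    return out
```

A K-term expansion only mixes information within K hops of each node, which is what makes the resulting filters localized.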