Corpus ID: 131777802

PAN: Path Integral Based Convolution for Deep Graph Neural Networks

Authors: Zheng Ma, Ming Li, Yuguang Wang
Convolution operations designed for graph-structured data usually utilize the graph Laplacian, which can be seen as message passing between adjacent neighbors through a generic random walk. In this paper, we propose PAN, a new graph convolution framework that involves every path linking the message sender and receiver, with learnable weights depending on the path length, which corresponds to the maximal entropy random walk. PAN generalizes the graph Laplacian to a new transition matrix we…
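The abstract's idea of weighting every path by its length can be sketched numerically: sum powers of the adjacency matrix (paths of length 0, 1, 2, …), scale each power by a per-length weight, and row-normalize the result into a transition matrix. This is a minimal illustration of the construction described above, not the paper's exact formulation; the function name and the fixed example weights are hypothetical (in PAN the weights are learned).

```python
import numpy as np

def path_integral_transition(A, weights):
    """Sketch of a path-integral-style transition matrix.

    Paths of length n between nodes are counted by A**n, so the
    weighted sum over path lengths is sum_n w_n * A**n. Row
    normalization turns it into a stochastic (transition) matrix,
    generalizing the random-walk matrix D^{-1} A.
    """
    n_nodes = A.shape[0]
    M = np.zeros_like(A, dtype=float)
    A_power = np.eye(n_nodes)          # A**0: paths of length 0
    for w in weights:                  # one weight per path length
        M += w * A_power
        A_power = A_power @ A          # next power: one hop longer
    D = M.sum(axis=1)                  # generalized degree
    return M / D[:, None]              # row-normalize

# Toy graph: a triangle (nodes 0,1,2) with a pendant node 3.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Illustrative decaying weights for path lengths 0, 1, 2.
M = path_integral_transition(A, weights=[1.0, 0.5, 0.25])
```

With all weight on length 1 this reduces to the ordinary random-walk matrix; spreading weight over longer paths lets distant nodes exchange messages in a single convolution step.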


Path integral based convolution and pooling for graph neural networks

This work proposes path integral-based GNNs (PAN), a versatile framework that can be tailored for different graph data with varying sizes and structures, and achieves state-of-the-art performance on various graph classification/regression tasks.

Graph Neural Networks with Haar Transform-Based Convolution and Pooling: A Complete Guide

This work proposes a novel graph neural network, which it calls HaarNet, to predict graph labels with interrelated convolution and pooling strategies, which outperforms various existing GNN models, especially on big data sets.

Haar Transforms for Graph Neural Networks

The Haar basis is introduced, a sparse and localized orthonormal system for graphs, constructed from a coarse-grained chain on the graph, which enables fast Haar transforms on graphs and thereby a fast evaluation of the Haar convolution between graph signals and filters.

Fast Haar Transforms for Graph Neural Networks

Diffusion Improves Graph Learning

This work removes the restriction of using only the direct neighbors by introducing a powerful, yet spatially localized graph convolution: Graph diffusion convolution (GDC), which leverages generalized graph diffusion and alleviates the problem of noisy and often arbitrarily defined edges in real graphs.

Haar Graph Pooling

A new graph pooling operation based on compressive Haar transforms -- HaarPooling is proposed, which synthesizes the features of any given input graph into a feature vector of uniform size.

Adaptive Graph Diffusion Networks with Hop-wise Attention

This work proposes Adaptive Graph Diffusion Networks with Hop-wise Attention (AGDNs-HA), which stack multi-hop neighborhood aggregations of different orders into a single layer using hop-wise attention that is learnable and adaptive for each node.


HaarPooling: Graph Pooling with Compressive Haar Basis

  • Computer Science
  • 2019

A new graph pooling operation based on compressive Haar transforms, called HaarPooling, is proposed, which achieves state-of-the-art performance on diverse graph classification problems.



Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.

Dynamic Filters in Graph Convolutional Networks

This work proposes a novel graph-convolutional network architecture that builds on a generic formulation that relaxes the 1-to-1 correspondence between filter weights and data elements around the center of the convolution.

LanczosNet: Multi-Scale Deep Graph Convolutional Networks

The Lanczos network (LanczosNet) is proposed, which uses the Lanczos algorithm to construct low rank approximations of the graph Laplacian for graph convolution and facilitates both graph kernel learning as well as learning node embeddings.

Robust Spatial Filtering With Graph Convolutional Neural Networks

A novel neural learning framework, termed Graph-CNNs, is proposed that is capable of handling both homogeneous and heterogeneous graph data while retaining the benefits of traditional CNN successes.

Diffusion-Convolutional Neural Networks

Through the introduction of a diffusion-convolution operation, it is shown how diffusion-based representations can be learned from graph-structured data and used as an effective basis for node classification.

Attention-based Graph Neural Network for Semi-supervised Learning

A novel graph neural network is proposed that removes all the intermediate fully-connected layers, and replaces the propagation layers with attention mechanisms that respect the structure of the graph, and demonstrates that this approach outperforms competing methods on benchmark citation networks datasets.

Rethinking Knowledge Graph Propagation for Zero-Shot Learning

This work proposes a Dense Graph Propagation module with carefully designed direct links among distant nodes to exploit the hierarchical graph structure of the knowledge graph through additional connections and outperforms state-of-the-art zero-shot learning approaches.

Gated Graph Sequence Neural Networks

This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.

Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.

Spectral Networks and Locally Connected Networks on Graphs

This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.