Label Propagation across Graphs: Node Classification using Graph Neural Tangent Kernels

Artun Bayer, Arindam Chowdhury, Santiago Segarra
Graph neural networks (GNNs) have achieved superior performance on node classification tasks in recent years. Commonly, this is framed as a transductive semi-supervised learning setup, wherein the entire graph (including the target nodes to be labeled) is available for training. Driven in part by scalability, recent works have focused on the inductive case, where only the labeled portion of a graph is available for training. In this context, our current work considers a challenging…
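As background for the title's "label propagation," the classical within-graph label propagation algorithm of Zhou et al. (2004) can be sketched in a few lines. This is a minimal NumPy sketch of that standard baseline, not the paper's GNTK-based method; the damping constant `alpha` and iteration count are illustrative choices.

```python
import numpy as np

def label_propagation(A, Y0, alpha=0.9, iters=50):
    """Classical label propagation (Zhou et al., 2004), as a baseline sketch.

    A:  symmetric adjacency matrix of the graph.
    Y0: one-hot label matrix with zero rows for unlabeled nodes.
    Iterates Y <- alpha * S @ Y + (1 - alpha) * Y0, where
    S = D^{-1/2} A D^{-1/2} is the symmetrically normalized adjacency.
    """
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.clip(d, 1e-12, None))
    S = A * np.outer(d_inv_sqrt, d_inv_sqrt)
    Y = Y0.copy()
    for _ in range(iters):
        Y = alpha * (S @ Y) + (1 - alpha) * Y0  # diffuse, then re-clamp labels
    return Y.argmax(axis=1)                     # predicted class per node
```

On a graph with two connected components, labeling one node in each is enough for the diffusion to classify every node in that component.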

Graph Neural Networks Exponentially Lose Expressive Power for Node Classification
The theory enables us to relate the expressive power of GCNs to the topological information of the underlying graphs inherent in the graph spectra, and provides a principled guideline for weight normalization of GNNs.
Are Powerful Graph Neural Nets Necessary? A Dissection on Graph Classification
This work proposes Graph Feature Network (GFN), a simple lightweight neural net defined on a set of graph augmented features, proves that GFN can be derived by linearizing the graph filtering part of GNNs, and leverages it to test the importance of the two parts separately.
Hierarchical Graph Convolutional Networks for Semi-supervised Node Classification
A novel deep Hierarchical Graph Convolutional Network (H-GCN) for semi-supervised node classification, which first repeatedly aggregates structurally similar nodes to hyper-nodes and then refines the coarsened graph to the original to restore the representation for each node.
Inductive Representation Learning on Large Graphs
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
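The mean-aggregator variant of GraphSAGE described above can be sketched compactly; because each node's embedding depends only on its own features and its neighbors', the layer applies to nodes unseen at training time. A minimal NumPy sketch (the layer sizes, ReLU nonlinearity, and weight names `W_self`/`W_neigh` are illustrative assumptions):

```python
import numpy as np

def sage_mean_layer(adj_list, H, W_self, W_neigh):
    """One GraphSAGE layer with the mean aggregator (sketch).

    adj_list: dict mapping each node index to a list of neighbor indices.
    H:        current node embeddings, one row per node.
    Combines each node's own embedding with the mean of its neighbors',
    applies ReLU, then L2-normalizes, as in the GraphSAGE paper.
    """
    out = np.empty((H.shape[0], W_self.shape[1]))
    for v, neigh in adj_list.items():
        h_n = H[neigh].mean(axis=0) if neigh else np.zeros(H.shape[1])
        out[v] = np.maximum(0.0, H[v] @ W_self + h_n @ W_neigh)  # ReLU
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.clip(norms, 1e-12, None)  # L2-normalize embeddings
```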
Graph Neural Tangent Kernel: Fusing Graph Neural Networks with Graph Kernels
A new class of graph kernels, Graph Neural Tangent Kernels (GNTKs), is presented; these correspond to infinitely wide multi-layer GNNs trained by gradient descent, enjoy the full expressive power of GNNs, and inherit the advantages of graph kernels (GKs).
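At its core, the GNTK computation alternates neighborhood aggregation with the infinite-width NTK recursion for a ReLU layer (the arc-cosine kernel update). The following is a rough NumPy sketch of that node-level recursion under simplifying assumptions (no readout, uniform aggregation weights, one ReLU layer per aggregation step); consult the GNTK paper for the exact scaling factors.

```python
import numpy as np

def gntk(A, X, num_layers=2):
    """Sketch of a node-level Graph Neural Tangent Kernel.

    A: adjacency matrix with self-loops; X: node feature matrix.
    Alternates (1) neighborhood aggregation of the covariance and NTK
    matrices with (2) the infinite-width ReLU NTK update.
    """
    S = X @ X.T        # input covariance between nodes
    Theta = S.copy()   # NTK accumulator
    for _ in range(num_layers):
        # (1) aggregate over neighborhoods
        S = A @ S @ A.T
        Theta = A @ Theta @ A.T
        # (2) infinite-width ReLU layer (arc-cosine kernel update)
        d = np.sqrt(np.clip(np.diag(S), 1e-12, None))
        cos = np.clip(S / np.outer(d, d), -1.0, 1.0)
        theta = np.arccos(cos)
        S_dot = (np.pi - theta) / (2 * np.pi)          # derivative kernel
        S = np.outer(d, d) * (np.sin(theta) + (np.pi - theta) * cos) / (2 * np.pi)
        Theta = Theta * S_dot + S
    return Theta
```

The resulting kernel matrix can be plugged into any kernel method (e.g., kernel regression) to classify nodes without training a finite-width GNN.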
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
A Comprehensive Survey on Graph Neural Networks
This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy to divide the state-of-the-art GNNs into four categories, namely, recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.
FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling
Enhanced with importance sampling, FastGCN not only is efficient for training but also generalizes well for inference, and is orders of magnitude more efficient while predictions remain comparably accurate.
Semi-Supervised Classification with Graph Convolutional Networks
A scalable approach for semi-supervised learning on graph-structured data that is based on an efficient variant of convolutional neural networks operating directly on graphs; it outperforms related methods by a significant margin.
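The "efficient variant of convolutional neural networks" referenced above is the well-known GCN propagation rule H' = sigma(D^{-1/2}(A+I)D^{-1/2} H W) of Kipf and Welling. A minimal NumPy sketch of one such layer (the ReLU nonlinearity and dense matrices are illustrative; practical implementations use sparse operations):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(0.0, A_norm @ H @ W)    # aggregate, transform, ReLU
```

Stacking two such layers with a softmax output recovers the semi-supervised node classifier of the paper.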
Deep Learning on Graphs: A Survey
This survey comprehensively reviews the different types of deep learning methods on graphs, dividing the existing methods into five categories based on their model architectures and training strategies: graph recurrent neural networks, graph convolutional networks, graph autoencoders, graph reinforcement learning, and graph adversarial methods.