Corpus ID: 47018956

Representation Learning on Graphs with Jumping Knowledge Networks

@article{Xu2018RepresentationLO,
  title={Representation Learning on Graphs with Jumping Knowledge Networks},
  author={Keyulu Xu and Chengtao Li and Yonglong Tian and Tomohiro Sonobe and Ken-ichi Kawarabayashi and Stefanie Jegelka},
  journal={ArXiv},
  year={2018},
  volume={abs/1806.03536}
}
Recent deep learning approaches for representation learning on graphs follow a neighborhood aggregation procedure. [...] Key Method: To adapt to local neighborhood properties and tasks, we explore an architecture -- jumping knowledge (JK) networks -- that flexibly leverages, for each node, different neighborhood ranges to enable better structure-aware representation. In a number of experiments on social, bioinformatics and citation networks, we demonstrate that our model achieves state-of-the-art performance…
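The "jumping knowledge" idea described above can be sketched in a few lines: each node keeps its hidden state from every GNN layer (each layer corresponding to a wider neighborhood range), and a final aggregator combines them. This is a minimal illustrative sketch, not the authors' code; the function name `jk_aggregate` and the two aggregation modes shown (concatenation and element-wise max, both discussed in the paper) are assumptions for illustration.

```python
# Minimal sketch of jumping-knowledge aggregation (not the authors' code):
# combine per-layer node representations either by concatenation or by
# element-wise max, so each node can draw on a different neighborhood range.
import numpy as np

def jk_aggregate(layer_states, mode="concat"):
    """layer_states: list of (num_nodes, dim) arrays, one per GNN layer."""
    if mode == "concat":   # keeps every layer's information side by side
        return np.concatenate(layer_states, axis=1)
    if mode == "max":      # per feature, picks the most informative layer
        return np.maximum.reduce(layer_states)
    raise ValueError(f"unknown mode: {mode}")

# Toy usage: 3 layers of hidden states for 4 nodes with 2 features each.
states = [np.random.rand(4, 2) for _ in range(3)]
print(jk_aggregate(states, "concat").shape)  # (4, 6)
print(jk_aggregate(states, "max").shape)     # (4, 2)
```

Concatenation preserves all layers but grows the feature dimension; max-pooling keeps the dimension fixed and lets each node select its effective range per feature.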

Papers citing this work

Graphs, Entities, and Step Mixture
TLDR
A new graph neural network that considers both edge-based neighborhood relationships and node-based entity features, i.e. Graph Entities with Step Mixture via random walk (GESM), which achieves state-of-the-art or comparable performances on eight benchmark graph datasets comprising transductive and inductive learning tasks.
Graph Partner Neural Networks for Semi-Supervised Learning on Graphs
TLDR
This work proposes the Graph Partner Neural Network (GPNN) which incorporates a de-parameterized GCN and a parameter-sharing MLP and provides empirical and theoretical evidence to demonstrate the effectiveness of the proposed MLP partner on tackling over-smoothing while benefiting from appropriate smoothness.
Graph Neural Networks with Feature and Structure Aware Random Walk
TLDR
This paper generalizes the graph Laplacian to digraphs based on the proposed Feature-Aware PageRank algorithm, which simultaneously considers graph directionality and long-distance feature similarity between nodes, and develops a model that adaptively learns the directionality of the graph and exploits the underlying long-distance correlations between nodes.
Active Learning for Graph Neural Networks via Node Feature Propagation
TLDR
A new method is proposed, which uses node feature propagation followed by K-Medoids clustering of the nodes for instance selection in active learning in node classification tasks and outperforms other representative baseline methods consistently and significantly.
Discovering Localized Information for Heterogeneous Graph Node Representation Learning
TLDR
This paper proposes a model without any a priori selection of meta-paths, using the classical Graph Convolution Network (GCN) model as a tool to aggregate node features and then aggregate the context graph feature vectors to produce the target node's feature representation.
CogDL: An Extensive Toolkit for Deep Learning on Graphs
TLDR
CogDL, an extensive research toolkit for deep learning on graphs that allows researchers and developers to easily conduct experiments and build applications, is introduced, and its effectiveness is demonstrated for real-world applications in AMiner, a large academic database and system.
Relational Pooling for Graph Representations
TLDR
This work generalizes graph neural networks (GNNs) beyond those based on the Weisfeiler-Lehman (WL) algorithm, graph Laplacians, and diffusions to provide a framework with maximal representation power for graphs.
The Impact of Global Structural Information in Graph Neural Networks Applications
TLDR
This work empirically addresses the question of whether practical applications on graph structured data require global structural knowledge or not by giving access to global information to several GNN models, and observing the impact it has on downstream performance.
Robust Graph Neural Networks via Ensemble Learning
TLDR
This paper proposes a novel framework of graph ensemble learning based on knowledge passing (called GEL) to address the nonrobustness and oversmoothing issues of GNNs, and designs a multilayer DropNode propagation strategy to reduce each node’s dependence on particular neighbors.
Graph Random Neural Network
TLDR
This work proposes consistency regularization for GRAND by leveraging the distributional consistency of unlabeled nodes across multiple augmentations, improving the generalization capacity of the model.

References

Showing 1-10 of 34 references
node2vec: Scalable Feature Learning for Networks
TLDR
In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
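The biased random walk summarized above can be sketched as follows. This is a hedged illustration under standard assumptions about node2vec, not the reference implementation: the return parameter `p` and in-out parameter `q` reweight second-order transitions, biasing the walk toward BFS-like or DFS-like exploration; the function name `biased_walk` and the dict-of-sets graph representation are choices made here for illustration.

```python
# Sketch of node2vec's second-order biased random walk (illustrative, not
# the reference implementation). A candidate next node x is weighted by its
# distance from the *previous* node: 1/p if we would return, 1 if x is a
# common neighbor, 1/q if the walk would move outward.
import random

def biased_walk(adj, start, length, p=1.0, q=1.0, seed=0):
    """adj: dict mapping node -> set of neighbors (undirected graph)."""
    rng = random.Random(seed)
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        nbrs = sorted(adj[cur])
        if not nbrs:
            break                      # dead end: stop the walk early
        if len(walk) == 1:
            walk.append(rng.choice(nbrs))  # first step is unbiased
            continue
        prev = walk[-2]
        weights = []
        for x in nbrs:
            if x == prev:              # distance 0 from previous node
                weights.append(1.0 / p)
            elif x in adj[prev]:       # distance 1: shared neighbor
                weights.append(1.0)
            else:                      # distance 2: moving outward
                weights.append(1.0 / q)
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk

# Toy usage: a path graph 0-1-2-3; large p and small q favor outward moves.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(biased_walk(adj, 0, 5, p=4.0, q=0.25))
```

The walks produced this way are then fed to a skip-gram objective, exactly as word sequences are in word2vec.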
Inductive Representation Learning on Large Graphs
TLDR
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
Gated Graph Sequence Neural Networks
TLDR
This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
Semi-Supervised Classification with Graph Convolutional Networks
TLDR
A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operate directly on graphs, which outperforms related methods by a significant margin.
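The convolutional variant summarized above reduces to a simple propagation rule. The sketch below, under the usual statement of the GCN layer, computes H' = ReLU(D^{-1/2} Â D^{-1/2} H W) with Â = A + I; the function name `gcn_layer` and the toy graph are assumptions for illustration, not code from the paper.

```python
# Sketch of one GCN propagation step (illustrative): add self-loops,
# symmetrically normalize the adjacency matrix, then apply a learned
# linear map and a ReLU nonlinearity.
import numpy as np

def gcn_layer(A, H, W):
    A_hat = A + np.eye(A.shape[0])                 # self-loops: A_hat = A + I
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # D_hat^{-1/2} diagonal
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)         # ReLU(A_norm H W)

# Toy usage: a 3-node path graph with one-hot input features.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.eye(3)
W = np.random.rand(3, 2)
print(gcn_layer(A, H, W).shape)  # (3, 2)
```

Stacking k such layers gives each node a k-hop receptive field, which is exactly the fixed neighborhood range that jumping-knowledge networks make adaptive.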
Weisfeiler-Lehman Graph Kernels
TLDR
A family of efficient kernels for large graphs with discrete node labels based on the Weisfeiler-Lehman test of isomorphism on graphs that outperform state-of-the-art graph kernels on several graph classification benchmark data sets in terms of accuracy and runtime.
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
TLDR
This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
DeepWalk: online learning of social representations
TLDR
DeepWalk is an online learning algorithm that builds useful incremental results and is trivially parallelizable, which makes it suitable for a broad class of real-world applications such as network classification and anomaly detection.
LINE: Large-scale Information Network Embedding
TLDR
A novel network embedding method called "LINE," which is suitable for arbitrary types of information networks: undirected, directed, and/or weighted, and optimizes a carefully designed objective function that preserves both the local and global network structures.
Geometric Deep Learning on Graphs and Manifolds Using Mixture Model CNNs
TLDR
This paper proposes a unified framework for generalizing CNN architectures to non-Euclidean domains (graphs and manifolds) and learning local, stationary, and compositional task-specific features; the proposed method is tested on standard tasks from image, graph, and 3D shape analysis and consistently outperforms previous approaches.