Graph Neural Networks: A Review of Methods and Applications

@article{Zhou2020GraphNN,
  title={Graph Neural Networks: A Review of Methods and Applications},
  author={Jie Zhou and Ganqu Cui and Zhengyan Zhang and Cheng Yang and Zhiyuan Liu and Maosong Sun},
  journal={ArXiv},
  year={2020},
  volume={abs/1812.08434}
}

Graph convolutional networks: a comprehensive review
TLDR
A comprehensive review specifically on the emerging field of graph convolutional networks, one of the most prominent graph deep learning models, is conducted; several open challenges are presented and potential directions for future research are discussed.
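As shared background for the convolutional models discussed throughout this list, the following is a minimal sketch of the standard GCN propagation rule (symmetrically normalized adjacency with self-loops, followed by a linear map and a nonlinearity) in plain NumPy; the toy graph, feature sizes, and variable names are illustrative assumptions, not taken from any of the listed papers.

import numpy as np

def gcn_layer(adj, features, weight):
    # one propagation step: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)
    a_hat = adj + np.eye(adj.shape[0])                  # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    norm_adj = d_inv_sqrt @ a_hat @ d_inv_sqrt          # symmetric normalization
    return np.maximum(norm_adj @ features @ weight, 0.0)

# toy path graph 0-1-2-3 with random features and weights
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
h = gcn_layer(adj, rng.random((4, 8)), rng.random((8, 4)))
print(h.shape)  # (4, 4): one 4-dimensional embedding per node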
Co-embedding of Nodes and Edges with Graph Neural Networks
TLDR
CensNet, a Convolution with Edge-Node Switching graph neural network for learning tasks on graph-structured data with both node and edge features, is proposed, along with two novel graph convolution operations for feature propagation.
Graph Neural Networks: Methods, Applications, and Opportunities
TLDR
This article provides a comprehensive survey of graph neural networks (GNNs) in each learning setting: supervised, unsupervised, semi-supervised, and self-supervised learning.
Graph Convolutional Neural Networks via Motif-based Attention
TLDR
A novel framework for learning spatial and attentional convolutional neural networks on arbitrary graphs is proposed: it first designs a motif-matching-guided subgraph normalization method to capture neighborhood information and then applies subgraph-level self-attentional layers to solve graph classification problems.
Computing Graph Neural Networks: A Survey from Algorithms to Accelerators
TLDR
A review of the field of GNNs is presented from the perspective of computing, and an in-depth analysis of current software and hardware acceleration schemes is provided, from which a hardware-software, graph-aware, and communication-centric vision for GNN accelerators is distilled.
Machine Learning on Graphs: A Model and Comprehensive Taxonomy
TLDR
A comprehensive taxonomy of representation learning methods for graph-structured data is proposed, aiming to unify several disparate bodies of work, provide a solid foundation for understanding the intuition behind these methods, and enable future research in the area.
Graph neural networks for multimodal learning and representation
TLDR
This thesis introduces a generative graph neural network model based on reinforcement learning and recurrent neural networks (RNNs) to extract a structured representation from sensory data, as well as a new neural network architecture, Multimodal Neural Graph Memory Networks (MN-GMN), for the VQA task.
Theory and Applications of Graph Neural Networks in Knowledge Acquisition
Knowledge acquisition is a process in which an artificial intelligence system acquires relevant knowledge from datasets. Knowledge acquisition extracts and forms knowledge from data of different …
Graph Neural Networks for Node-Level Predictions
TLDR
This work aims to provide an overview of early and modern graph-neural-network-based machine learning methods for node-level prediction tasks, explaining the core concepts and giving detailed explanations of the convolutional methods that have had a strong impact.
...

References

Showing 1-10 of 382 references
Graph convolutional networks: a comprehensive review
TLDR
A comprehensive review specifically on the emerging field of graph convolutional networks, one of the most prominent graph deep learning models, is conducted; several open challenges are presented and potential directions for future research are discussed.
Graph Transformer Networks
TLDR
This paper proposes Graph Transformer Networks (GTNs), which are capable of generating new graph structures by identifying useful connections between unconnected nodes on the original graph, while learning effective node representations on the new graphs in an end-to-end fashion.
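The core operation described above can be illustrated with a hedged sketch: softly combine the adjacency matrices of different edge types and compose the results, which creates new, meta-path-like connections between nodes that were not directly linked. The selection scores alpha1 and alpha2 are learned in the actual model; here they are fixed placeholders, and the tiny two-edge-type graph is an assumption for illustration.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def graph_transformer_step(adj_per_edge_type, alpha1, alpha2):
    # softly select two convex combinations of edge-type adjacencies, then
    # multiply them to obtain a composite (2-hop, meta-path-like) graph
    q1 = sum(w * a for w, a in zip(softmax(alpha1), adj_per_edge_type))
    q2 = sum(w * a for w, a in zip(softmax(alpha2), adj_per_edge_type))
    return q1 @ q2

# two toy edge types over 3 nodes; the alpha scores are placeholders
a_type1 = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)
a_type2 = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0]], dtype=float)
new_adj = graph_transformer_step([a_type1, a_type2],
                                 alpha1=np.array([2.0, 0.1]),
                                 alpha2=np.array([0.1, 2.0]))
print(new_adj[0, 2] > 0)  # True: nodes 0 and 2 are now connected via a 2-hop path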
A Comprehensive Survey on Graph Neural Networks
TLDR
This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides the state-of-the-art GNNs into four categories: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.
The Graph Neural Network Model
TLDR
A new neural network model, called the graph neural network (GNN) model, is proposed; it extends existing neural network methods to data represented in graph domains and implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space.
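A minimal sketch of the idea behind τ(G, n), assuming a dense adjacency matrix and a fixed number of state-update iterations in place of the original model's convergence criterion; the weight matrices and the tanh update are illustrative placeholders, not the paper's exact parameterization.

import numpy as np

def tau(adj, features, w_state, w_input, n, iterations=10):
    # iteratively update per-node states from neighbour states and node
    # features, then read out the m-dimensional state of node n
    m = w_state.shape[0]
    state = np.zeros((adj.shape[0], m))
    for _ in range(iterations):
        state = np.tanh(adj @ state @ w_state + features @ w_input)
    return state[n]

rng = np.random.default_rng(1)
adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
embedding = tau(adj, rng.random((3, 5)),
                w_state=0.1 * rng.random((4, 4)),
                w_input=rng.random((5, 4)), n=0)
print(embedding.shape)  # (4,): node 0 mapped into a 4-dimensional space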
Machine Learning on Graphs: A Model and Comprehensive Taxonomy
TLDR
A comprehensive taxonomy of representation learning methods for graph-structured data is proposed, aiming to unify several disparate bodies of work, provide a solid foundation for understanding the intuition behind these methods, and enable future research in the area.
GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training
TLDR
Graph Contrastive Coding (GCC), a self-supervised graph neural network pre-training framework, is designed to capture universal network topological properties across multiple networks, leveraging contrastive learning to empower graph neural networks to learn intrinsic and transferable structural representations.
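The contrastive objective that GCC builds on can be sketched with a generic InfoNCE loss over subgraph embeddings; the encoder, the sampling of positive and negative subgraph views, and the temperature value are assumptions in this sketch, not GCC's exact setup.

import numpy as np

def info_nce(query, positive, negatives, temperature=0.07):
    # pull the query embedding toward another view of the same subgraph
    # (the positive) and away from embeddings of other subgraphs (negatives)
    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([cosine(query, positive)] +
                      [cosine(query, neg) for neg in negatives]) / temperature
    log_prob_positive = logits[0] - np.log(np.exp(logits).sum())
    return -log_prob_positive

rng = np.random.default_rng(2)
q = rng.random(16)
loss = info_nce(q, q + 0.01 * rng.random(16), [rng.random(16) for _ in range(8)])
print(float(loss))  # a near-identical positive view yields a low loss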
How Powerful are Graph Neural Networks?
TLDR
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
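The expressiveness argument above hinges on using an injective (sum) aggregator over neighbour multisets. The sketch below shows a GIN-style update and a toy case where mean aggregation conflates two different neighbourhoods while sum does not; the MLP and epsilon are illustrative placeholders.

import numpy as np

def gin_update(adj, features, mlp, eps=0.0):
    # sum-aggregate neighbour features (injective on multisets), mix in the
    # node's own (1 + eps)-scaled features, then apply an MLP
    return mlp((1.0 + eps) * features + adj @ features)

# toy check: neighbourhood multisets {1, 1} and {1} have the same mean (1)
# but different sums (2 vs 1), so sum aggregation can tell them apart
neighbours_a = np.array([1.0, 1.0])
neighbours_b = np.array([1.0])
print(neighbours_a.mean() == neighbours_b.mean())  # True: mean is blind here
print(neighbours_a.sum() == neighbours_b.sum())    # False: sum distinguishes them

rng = np.random.default_rng(3)
w1, w2 = rng.random((4, 8)), rng.random((8, 2))
mlp = lambda h: np.maximum(h @ w1, 0.0) @ w2
adj = np.array([[0, 1], [1, 0]], dtype=float)
print(gin_update(adj, rng.random((2, 4)), mlp).shape)  # (2, 2)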
Graph Convolutional Networks with EigenPooling
TLDR
A pooling operator based on the graph Fourier transform is introduced; it can utilize node features and local structures during the pooling process and is combined with traditional GCN convolutional layers to form a graph neural network framework for graph classification.
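A simplified sketch of the underlying idea, assuming a single global graph Fourier basis: project node features onto the k lowest-frequency eigenvectors of the graph Laplacian to obtain a fixed-size, structure-aware summary. The actual EigenPooling operator applies this per subgraph with local Laplacians, which the sketch omits.

import numpy as np

def spectral_pool(adj, features, k):
    # graph Fourier basis = eigenvectors of the combinatorial Laplacian;
    # keep the k lowest-frequency modes and project the features onto them
    laplacian = np.diag(adj.sum(axis=1)) - adj
    _, eigvecs = np.linalg.eigh(laplacian)   # columns sorted by eigenvalue
    basis = eigvecs[:, :k]
    return basis.T @ features                # (k, feature_dim) coefficients

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(4)
pooled = spectral_pool(adj, rng.random((4, 6)), k=2)
print(pooled.shape)  # (2, 6): the graph is summarized by 2 spectral coefficients per feature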
Representation Learning on Graphs: Methods and Applications
TLDR
A conceptual review of key advancements in this area of representation learning on graphs is provided, including matrix factorization-based methods, random-walk-based algorithms, and graph neural networks.
Graph Neural Networks Exponentially Lose Expressive Power for Node Classification
TLDR
The theory enables us to relate the expressive power of GCNs to the topological information of the underlying graphs inherent in the graph spectra, and it provides a principled guideline for weight normalization of GNNs.
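A small numerical illustration of the phenomenon described above (a toy under stated assumptions, not the paper's construction): repeatedly applying the normalized propagation matrix drives node representations toward a one-dimensional subspace spanned by the dominant eigenvector, at a rate governed by the spectral gap, so very deep stacks lose the ability to tell nodes apart.

import numpy as np

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
a_hat = adj + np.eye(4)
d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
prop = d_inv_sqrt @ a_hat @ d_inv_sqrt     # symmetric normalized propagation

rng = np.random.default_rng(5)
h = rng.random((4, 3))
for depth in (1, 5, 20, 80):
    smoothed = np.linalg.matrix_power(prop, depth) @ h
    second_singular = np.linalg.svd(smoothed, compute_uv=False)[1]
    print(depth, second_singular)  # decays toward 0: features collapse to rank one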
...