Co-embedding of Nodes and Edges with Graph Neural Networks
Xiaodong Jiang, Ronghang Zhu, Pengsheng Ji, Sheng Li
IEEE Transactions on Pattern Analysis and Machine Intelligence
Graphs are ubiquitous in many real-world applications, ranging from social network analysis to biology. How to correctly and effectively learn and extract information from graphs is essential for a large number of machine learning tasks. Graph embedding transforms and encodes data from a high-dimensional, non-Euclidean feature space into a low-dimensional, structured space. We have witnessed a huge surge of such embedding methods, from statistical approaches to recent deep…


Automated Graph Learning via Population Based Self-Tuning GCN
A self-tuning GCN approach with an alternating training algorithm is proposed, together with an approach to automate the training of GCN models through hyperparameter optimization.
Graph Denoising with Framelet Regularizer
  • Bingxin Zhou, Ruikun Li, Xuebin Zheng, Yu Guang Wang, Junbin Gao
  • Computer Science
  • ArXiv
  • 2021
This paper tailors regularizers for graph data to both feature and structure noise, with the objective function efficiently solved by the alternating direction method of multipliers (ADMM).
Heterogeneous Edge-Enhanced Graph Attention Network For Multi-Agent Trajectory Prediction
This work proposes a three-channel framework built around a novel Heterogeneous Edge-enhanced graph ATtention network (HEAT), which handles the heterogeneity of the target agents and traffic participants involved and realizes simultaneous trajectory prediction for multiple agents in complex traffic situations.
Edge: Enriching Knowledge Graph Embeddings with External Text
This work designs a graph-alignment term in a shared embedding space between the original graph and an augmented graph to enhance embedding learning on the augmented graph, and regularizes the locality relationship of the target entity via negative sampling.
Graph Neural Networks as the Copula Mundi between Logic and Machine Learning: a Roadmap
This paper elicits a number of problems from the field of CL that may benefit from graph-related problems where GNNs have proved effective, and exemplifies the application of GNNs to logic theories via an end-to-end toy example, demonstrating the many intricacies hidden behind the technique.
NF-GNN: Network Flow Graph Neural Networks for Malware Detection and Classification
This work first extracts flow graphs and then classifies them using a novel edge-feature-based graph neural network model, which supports malware detection and classification in supervised and unsupervised settings and can boost detection performance by a significant margin.
Self-Supervised Graph Representation Learning via Topology Transformations
We present Topology Transformation Equivariant Representation learning, a general paradigm of self-supervised learning for node representations of graph data, to enable the wide applicability of…


Exploiting Edge Features for Graph Neural Networks
  • Liyu Gong, Qiang Cheng
  • Computer Science, Mathematics
  • 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2019
A new framework is presented for a family of graph neural network models that more fully exploit edge features, including those of undirected or multi-dimensional edges, and new models that exploit this rich source of graph edge information are built.
Graph Neural Networks: A Review of Methods and Applications
A detailed review of existing graph neural network models is provided, their applications are systematically categorized, and four open problems for future research are proposed.
Learning Discrete Structures for Graph Neural Networks
This work proposes to jointly learn the graph structure and the parameters of graph convolutional networks (GCNs) by approximately solving a bilevel program that learns a discrete probability distribution on the edges of the graph. Expand
CensNet: Convolution with Edge-Node Switching in Graph Neural Networks
This paper presents CensNet, a Convolution with Edge-Node Switching graph neural network for semi-supervised classification and regression on graph-structured data with both node and edge features, and proposes two novel graph convolution operations for feature propagation.
A Comprehensive Survey on Graph Neural Networks
This article provides a comprehensive overview of graph neural networks (GNNs) in data mining and machine learning, and proposes a new taxonomy that divides state-of-the-art GNNs into four categories: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.
Position-aware Graph Neural Networks
Position-aware Graph Neural Networks (P-GNNs) are proposed, a new class of GNNs for computing position-aware node embeddings that are inductive, scalable, and able to incorporate node feature information.
LINE: Large-scale Information Network Embedding
A novel network embedding method called LINE, suitable for arbitrary types of information networks: undirected, directed, and/or weighted. It optimizes a carefully designed objective function that preserves both the local and global network structures.
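The local structure LINE preserves is its first-order proximity: the probability of an edge between two nodes is modeled as a sigmoid of the inner product of their embedding vectors. A minimal NumPy sketch of that quantity (function name is illustrative, not from the paper's code):

```python
import numpy as np

def line_first_order(u_i, u_j):
    """First-order proximity in LINE (sketch): the modeled joint
    probability of an edge between nodes i and j, computed as a
    sigmoid of the inner product of their embedding vectors."""
    return 1.0 / (1.0 + np.exp(-np.dot(u_i, u_j)))
```

Training then minimizes the divergence between these modeled probabilities and the empirical edge distribution; second-order proximity is handled analogously with separate context vectors.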
The Graph Neural Network Model
A new neural network model, called the graph neural network (GNN) model, that extends existing neural network methods to data represented in graph domains; it implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space.
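In this original GNN formulation, τ is computed by iterating a contractive transition function over node states until a fixed point, then applying an output function. A simplified sketch, assuming user-supplied functions `f` (state transition) and `g` (output readout), which in the paper are learned networks:

```python
import numpy as np

def gnn_tau(adj, x, f, g, iters=50):
    """Sketch of τ(G, n) from the original GNN model: iterate a
    (contractive) transition function f over node states until a
    fixed point, then read out embeddings with an output function g.
    adj: (N, N) adjacency matrix; x: (N, F) node labels."""
    n = adj.shape[0]
    state = np.zeros((n, x.shape[1]))
    for _ in range(iters):
        # each node's new state depends on its label and neighbor states
        new = np.stack([f(x[v], [state[u] for u in np.nonzero(adj[v])[0]])
                        for v in range(n)])
        done = np.allclose(new, state, atol=1e-8)
        state = new
        if done:
            break
    return g(state, x)  # per-node embedding in R^m
```

Convergence relies on f being a contraction mapping, which the paper enforces via a penalty during training.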
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…
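The masked self-attention in GAT scores each edge with a shared attention vector, masks non-edges, and softmax-normalizes over each node's neighborhood. A single-head NumPy sketch (simplified: dense loops, no multi-head concatenation or output nonlinearity):

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_layer(h, adj, W, a):
    """Single-head GAT layer sketch. h: (N, F) features; adj: (N, N)
    0/1 adjacency with self-loops; W: (F, Fp) weights; a: (2*Fp,)
    attention vector."""
    z = h @ W                       # (N, Fp) transformed features
    n = z.shape[0]
    e = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # e[i, j] = LeakyReLU(a . [z_i || z_j])
            e[i, j] = leaky_relu(np.concatenate([z[i], z[j]]) @ a)
    e = np.where(adj > 0, e, -1e9)  # mask: attend only over neighbors
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)  # row-wise softmax
    return alpha @ z                # attention-weighted aggregation
```

Masking with a large negative constant before the softmax is what restricts attention to the graph structure, distinguishing GAT from ordinary self-attention.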
Inductive Representation Learning on Large Graphs
GraphSAGE is presented, a general inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data; it outperforms strong baselines on three inductive node-classification benchmarks.
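GraphSAGE's inductiveness comes from learning an aggregation of neighbor features rather than per-node embeddings. A sketch of one layer with the mean aggregator, using separate self/neighbor weight matrices (an equivalent reformulation of the paper's concatenate-then-project step; nonlinearity and neighbor sampling omitted):

```python
import numpy as np

def sage_mean_layer(h, neighbors, W_self, W_neigh):
    """One GraphSAGE layer with the mean aggregator (sketch).
    h: (N, F) node features; neighbors: dict node -> list of neighbor
    ids; W_self, W_neigh: (F, Fp) weight matrices."""
    out = np.empty((h.shape[0], W_self.shape[1]))
    for v in range(h.shape[0]):
        nbrs = neighbors.get(v, [])
        # average neighbor features (zero vector for isolated nodes)
        agg = h[nbrs].mean(axis=0) if nbrs else np.zeros(h.shape[1])
        out[v] = h[v] @ W_self + agg @ W_neigh
    # L2-normalize each embedding, as in the paper
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.maximum(norms, 1e-12)
```

Because the weights act on features, not node identities, the same trained layer can embed nodes never seen during training.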