Corpus ID: 230770044

Node2Seq: Towards Trainable Convolutions in Graph Neural Networks

@article{Yuan2021Node2SeqTT,
  title={Node2Seq: Towards Trainable Convolutions in Graph Neural Networks},
  author={Hao Yuan and Shuiwang Ji},
  journal={ArXiv},
  year={2021},
  volume={abs/2101.01849}
}
Investigating graph feature learning has become increasingly important with the emergence of graph data in many real-world applications. Several graph neural network approaches have been proposed for node feature learning, and they generally follow a neighboring-information aggregation scheme to learn node features. While great performance has been achieved, learning the aggregation weights for different neighboring nodes remains underexplored. In this work, we propose a novel graph network layer, known as Node2Seq…
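The generic scheme the abstract refers to — aggregating neighbor features with per-neighbor weights — can be illustrated with a minimal sketch. This is a hypothetical illustration of weighted neighborhood aggregation in general, not the actual Node2Seq layer; the function and variable names are invented for the example, and in a real model the weights would be learned.

```python
# Minimal sketch of neighborhood aggregation with explicit per-neighbor
# weights (hypothetical illustration; not the actual Node2Seq layer).

def aggregate(node_feats, neighbors, weights):
    """Weighted sum of neighbor features for each node.

    node_feats: list of feature vectors (one per node)
    neighbors:  dict mapping node id -> list of neighbor ids
    weights:    dict mapping node id -> weights aligned with the
                neighbor list (these would be learned in a real model)
    """
    out = []
    for v, feats_v in enumerate(node_feats):
        agg = [0.0] * len(feats_v)
        for u, w in zip(neighbors[v], weights[v]):
            for c in range(len(agg)):
                agg[c] += w * node_feats[u][c]
        out.append(agg)
    return out

# Tiny 3-node example: node 0 averages its two neighbors equally.
feats = [[1.0, 0.0], [0.0, 2.0], [4.0, 4.0]]
nbrs = {0: [1, 2], 1: [0], 2: [0]}
wts = {0: [0.5, 0.5], 1: [1.0], 2: [1.0]}
print(aggregate(feats, nbrs, wts)[0])  # -> [2.0, 3.0]
```

Uniform weights recover mean aggregation; the point of making them trainable is that each neighbor can contribute differently.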


Boosting Graph Structure Learning with Dummy Nodes
TLDR
It is proved that such a dummy node can help build an efficient monomorphic edge-to-vertex transform and an epimorphic inverse to recover the original graph, indicating that adding dummy nodes can preserve local and global structures for better graph representation learning.
Graph Pointer Neural Networks
TLDR
A pointer network selects the most relevant nodes from a large set of multi-hop neighborhoods, constructing an ordered sequence according to each node's relationship with the central node; the pointer-network-based ranker in GPNN is jointly optimized with the other components in an end-to-end manner.
On Explainability of Graph Neural Networks via Subgraph Explorations
TLDR
This work represents the first attempt to explain GNNs by explicitly identifying subgraphs, and proposes Shapley values as a measure of subgraph importance to make the tree search more effective and expedite computation.
Graph Learning with 1D Convolutions on Random Walks
TLDR
It is demonstrated empirically that CRAWL matches or outperforms state-of-the-art GNN architectures across a multitude of benchmark datasets for classification and regression on graphs.

References

SHOWING 1-10 OF 43 REFERENCES
Line Graph Neural Networks for Link Prediction
TLDR
This work proposes a radically different path by making use of line graphs from graph theory: link prediction on the original graph can be equivalently solved as a node classification problem in the corresponding line graph, rather than as a graph classification task.
Self-Attention Graph Pooling
TLDR
This paper proposes a graph pooling method based on self-attention using graph convolution, which achieves superior graph classification performance on the benchmark datasets using a reasonable number of parameters.
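The core of self-attention pooling is a top-k node selection driven by attention scores. The sketch below shows only that selection-and-gating step, with invented names and precomputed scalar scores standing in for the scores a real SAGPool layer would compute with a graph convolution.

```python
# Sketch of self-attention-based top-k pooling (simplified; real
# self-attention graph pooling computes the scores with a graph
# convolution over node features and the adjacency structure).

def top_k_pool(scores, feats, k):
    """Keep the k highest-scoring nodes, gating their features by score."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
    kept = sorted(order)  # preserve original node order among survivors
    pooled = [[scores[i] * x for x in feats[i]] for i in kept]
    return pooled, kept

feats = [[1.0], [2.0], [3.0], [4.0]]
scores = [0.1, 0.9, 0.5, 0.8]  # attention scores, e.g. from a GCN layer
pooled, kept = top_k_pool(scores, feats, k=2)
print(kept)  # -> [1, 3]
```

Gating the surviving features by their scores keeps the selection step differentiable with respect to the score-producing layer.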
Large-Scale Learnable Graph Convolutional Networks
TLDR
The proposed LGCL automatically selects a fixed number of neighboring nodes for each feature based on value ranking in order to transform graph data into grid-like structures in 1-D format, thereby enabling the use of regular convolutional operations on generic graphs.
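The value-ranking step described above can be sketched directly: for each feature channel, keep the k largest values among a node's neighbors, yielding a fixed-size grid that a regular 1-D convolution can consume. This is a simplified stand-in with invented names, not the LGCL implementation.

```python
# Sketch of LGCL-style neighbor selection: for each feature channel,
# keep the k largest values among a node's neighbors, producing a fixed
# (k x d) grid suitable for a regular 1-D convolution.

def rank_select(neighbor_feats, k):
    """neighbor_feats: list of d-dim vectors; returns a k x d grid."""
    d = len(neighbor_feats[0])
    cols = []
    for c in range(d):
        vals = sorted((f[c] for f in neighbor_feats), reverse=True)
        cols.append((vals + [0.0] * k)[:k])  # zero-pad if < k neighbors
    # transpose per-channel columns into k grid rows
    return [[cols[c][r] for c in range(d)] for r in range(k)]

nbrs = [[3.0, 1.0], [1.0, 5.0], [2.0, 2.0]]
print(rank_select(nbrs, k=2))  # -> [[3.0, 5.0], [2.0, 2.0]]
```

Note that ranking is done per channel, so a grid row may mix values from different neighbors; the fixed shape is what enables ordinary convolutions on generic graphs.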
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…
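Masked self-attention over a neighborhood reduces to softmax-normalizing scores across a node's neighbors and taking the weighted sum of their features. The sketch below uses a plain dot-product score as a stand-in; the actual GAT layer uses learned linear transforms and a LeakyReLU-activated attention mechanism, and the names here are invented for illustration.

```python
# Sketch of masked self-attention over one node's neighborhood, in the
# spirit of GAT (scalar dot-product scores stand in for the learned
# attention mechanism of the real layer).
import math

def attend(center, neighbor_feats, score):
    """Softmax-normalize scores over neighbors, then weight-sum features."""
    raw = [score(center, n) for n in neighbor_feats]
    m = max(raw)
    exp = [math.exp(r - m) for r in raw]  # numerically stable softmax
    z = sum(exp)
    alpha = [e / z for e in exp]          # attention coefficients
    d = len(center)
    return [sum(a * n[c] for a, n in zip(alpha, neighbor_feats))
            for c in range(d)]

# Dot-product score as a stand-in for GAT's learned attention.
dot = lambda a, b: sum(x * y for x, y in zip(a, b))
out = attend([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], dot)
print(out)  # more weight on the neighbor aligned with the center node
```

The "mask" is implicit: only neighbors enter the softmax, so attention never leaks across non-edges.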
Graph Representation Learning via Hard and Channel-Wise Attention Networks
TLDR
Compared to GAO, hGAO improves performance and saves computational cost by attending only to important nodes; an efficiency comparison shows that cGAO leads to dramatic savings in computational resources, making these operators applicable to large graphs.
A Multi-Scale Approach for Graph Link Prediction
TLDR
This work proposes a novel node aggregation method that transforms the enclosing subgraph into different scales while preserving the relationship between the two target nodes for link prediction, and demonstrates that the proposed method outperforms state-of-the-art methods by employing multi-scale graphs without additional parameters.
Representation Learning on Graphs with Jumping Knowledge Networks
TLDR
This work explores an architecture -- jumping knowledge (JK) networks -- that flexibly leverages, for each node, different neighborhood ranges to enable better structure-aware representation in graphs.
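The jumping-knowledge idea — letting each node combine its representations from all layers rather than only the last — can be sketched with the simplest combiner, concatenation. The names are invented for illustration; the paper also studies max-pooling and LSTM-attention combiners.

```python
# Sketch of jumping-knowledge-style aggregation: each node concatenates
# its representations from every GNN layer, so nodes effectively choose
# among different neighborhood ranges (concatenation variant).

def jk_concat(per_layer_feats):
    """per_layer_feats: list of layers, each a list of node vectors."""
    n = len(per_layer_feats[0])
    return [[x for layer in per_layer_feats for x in layer[v]]
            for v in range(n)]

layer1 = [[1.0], [2.0]]  # e.g. 1-hop representations of nodes 0 and 1
layer2 = [[3.0], [4.0]]  # e.g. 2-hop representations
print(jk_concat([layer1, layer2]))  # -> [[1.0, 3.0], [2.0, 4.0]]
```

Because shallow and deep representations both survive to the output, a downstream layer can weight small or large neighborhood ranges per node.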
A Comprehensive Survey on Graph Neural Networks
TLDR
This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides state-of-the-art GNNs into four categories: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.
Non-Local Graph Neural Networks
TLDR
This work proposes a simple yet effective non-local aggregation framework with efficient attention-guided sorting for GNNs, which significantly outperforms previous state-of-the-art methods on seven benchmark datasets of disassortative graphs in terms of both model performance and efficiency.
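The attention-guided sorting step mentioned above amounts to reordering nodes by a scalar attention score so that distant but similar nodes become sequence neighbors. The sketch below shows only that reordering, with invented names and precomputed scores; it is a simplified stand-in, not the paper's full non-local aggregation.

```python
# Sketch of attention-guided sorting: nodes are ordered by a scalar
# attention score, so subsequent aggregation can act on the sorted
# sequence rather than on graph adjacency (simplified stand-in).

def sort_by_attention(scores, feats):
    """Return node features reordered by ascending attention score."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    return [feats[i] for i in order], order

scores = [0.7, 0.1, 0.4]       # e.g. from a learned scoring function
feats = [[7.0], [1.0], [4.0]]
seq, order = sort_by_attention(scores, feats)
print(order)  # -> [1, 2, 0]
```

On disassortative graphs, where adjacent nodes are often dissimilar, this score-ordered sequence lets aggregation reach informative non-adjacent nodes.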
Graph U-Nets
TLDR
This work considers the problem of representation learning for graph data and proposes attention-based pooling and unpooling layers, which can better capture graph topology information, and develops an encoder-decoder model known as the graph U-Nets.