Boosting Graph Structure Learning with Dummy Nodes
@inproceedings{Liu2022BoostingGS,
  title     = {Boosting Graph Structure Learning with Dummy Nodes},
  author    = {Xin Liu and Jiayang Cheng and Yangqiu Song and Xin Jiang},
  booktitle = {International Conference on Machine Learning},
  year      = {2022}
}
With the development of graph kernels and graph representation learning, many superior methods have been proposed to handle scalability and oversmoothing issues in graph structure learning. However, most of those strategies are designed based on practical experience rather than theoretical analysis. In this paper, we add a particular dummy node that connects to all existing vertices without affecting the original vertex and edge properties. We further prove that such a dummy node can help build an…
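The dummy-node augmentation the abstract describes can be sketched in plain Python. This is a minimal illustration under an assumed adjacency-dict representation; the function name and the `__dummy__` label are hypothetical, not taken from the paper:

```python
def add_dummy_node(adj):
    """Return a copy of the adjacency dict with one dummy node connected
    to every existing vertex; original vertices and edges are untouched.
    Illustrative sketch only, not the paper's implementation."""
    dummy = "__dummy__"  # hypothetical label for the added node
    aug = {v: set(nbrs) | {dummy} for v, nbrs in adj.items()}
    aug[dummy] = set(adj)  # dummy links to all original vertices
    return aug

# Example: a triangle graph on vertices 0, 1, 2
g = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
g_aug = add_dummy_node(g)
```

Because the dummy node only adds edges incident to itself, the induced subgraph on the original vertices is exactly the input graph.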
3 Citations
Natural and Artificial Dynamics in Graphs: Concept, Progress, and Future
- Computer Science, Frontiers in Big Data
- 2022
The definitions of natural dynamics and artificial dynamics in graphs are introduced, and related work on how natural and artificial dynamics boost the aforementioned graph research topics is discussed.
Predicting Protein-Ligand Binding Affinity with Equivariant Line Graph Network
- Computer Science, ArXiv
- 2022
Experimental results on two real datasets demonstrate that ELGN surpasses several state-of-the-art baselines in both effectiveness and generalizability.
A multi-head pseudo nodes based spatial-temporal graph convolutional network for emotion perception from GAIT
- Computer Science, Neurocomputing
- 2022
References
Showing 1–10 of 44 references
Hierarchical Graph Pooling with Structure Learning
- Computer Science, AAAI 2020
- 2019
A novel graph pooling operator, called Hierarchical Graph Pooling with Structure Learning (HGP-SL), which can be integrated into various graph neural network architectures, and introduces a structure learning mechanism to learn a refined graph structure for the pooled graph at each layer.
Hierarchical Graph Representation Learning with Differentiable Pooling
- Computer Science, NeurIPS
- 2018
DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.
How Powerful are Graph Neural Networks?
- Computer Science, ICLR
- 2019
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
Composition-based Multi-Relational Graph Convolutional Networks
- Computer Science, ICLR
- 2020
This paper proposes CompGCN, a novel Graph Convolutional framework that jointly embeds both nodes and relations in a relational graph, leverages a variety of entity-relation composition operations from Knowledge Graph Embedding techniques, and scales with the number of relations.
Graph Convolutional Networks with Dual Message Passing for Subgraph Isomorphism Counting and Matching
- Computer Science, AAAI
- 2022
It is proved that searching for isomorphisms on the original graph is equivalent to searching on its dual graph, and dual message passing neural networks (DMPNNs) are proposed to enhance substructure representation learning in an asynchronous way for subgraph isomorphism counting and matching, as well as unsupervised node classification.
Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning
- Computer Science, AAAI
- 2018
It is shown that the graph convolution of the GCN model is actually a special form of Laplacian smoothing, which is the key reason why GCNs work, but it also brings potential concerns of over-smoothing with many convolutional layers.
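The smoothing effect this summary describes can be sketched numerically. The snippet below is a minimal stand-in that uses random-walk normalization instead of the GCN's symmetric normalization; all names are illustrative, and the over-smoothing behavior (features converging to a common value under repeated propagation) is the point:

```python
def gcn_smooth(adj, feats, steps=1):
    """Repeated neighborhood averaging x_v <- mean({v} + N(v)), a
    random-walk-normalized stand-in for GCN propagation. Each round is
    a Laplacian-smoothing step; many rounds drive all node features
    toward a common value (over-smoothing)."""
    x = dict(feats)
    for _ in range(steps):
        x = {v: (x[v] + sum(x[u] for u in nbrs)) / (1 + len(nbrs))
             for v, nbrs in adj.items()}
    return x

# A 3-node path graph with scalar features
g = {0: {1}, 1: {0, 2}, 2: {1}}
x0 = {0: 0.0, 1: 3.0, 2: 6.0}
one_step = gcn_smooth(g, x0, steps=1)    # spread shrinks
many_steps = gcn_smooth(g, x0, steps=200)  # features nearly identical
```

One smoothing round already pulls each feature toward its neighborhood mean; after many rounds the feature spread collapses, which is the over-smoothing concern raised for deep GCNs.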
Heterogeneous Graph Transformer
- Computer Science, WWW
- 2020
The proposed HGT model consistently outperforms all state-of-the-art GNN baselines by 9%–21% on various downstream tasks, and a heterogeneous mini-batch graph sampling algorithm, HGSampling, is designed for efficient and scalable training.
Node2Seq: Towards Trainable Convolutions in Graph Neural Networks
- Computer Science, ArXiv
- 2021
Experimental results demonstrate the effectiveness of the proposed Node2Seq layer and show that the proposed adaptive non-local information learning can improve the performance of feature learning.
Inductive Representation Learning on Large Graphs
- Computer Science, NIPS
- 2017
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
Neural Subgraph Isomorphism Counting
- Computer Science, KDD
- 2020
Experimental results show that learning-based subgraph isomorphism counting can speed up the traditional VF2 algorithm by 10–1,000 times with acceptable errors, and domain adaptation based on fine-tuning also shows the usefulness of the approach in real-world applications.