Dimensionwise Separable 2-D Graph Convolution for Unsupervised and Semi-Supervised Learning on Graphs

@inproceedings{Li2021DimensionwiseS2,
  title={Dimensionwise Separable 2-D Graph Convolution for Unsupervised and Semi-Supervised Learning on Graphs},
  author={Qimai Li and Xiaotong Zhang and Han Liu and Quanyu Dai and Xiao-Ming Wu},
  booktitle={Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery \& Data Mining},
  year={2021}
}
  • Qimai Li, Xiaotong Zhang, Han Liu, Quanyu Dai, Xiao-Ming Wu
  • Published 2021
  • Computer Science, Mathematics
  • Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining
Graph convolutional neural networks (GCNs) have been the model of choice for graph representation learning, which is mainly due to the effective design of graph convolution that computes the representation of a node by aggregating those of its neighbors. However, existing GCN variants commonly use 1-D graph convolution that solely operates on the object link graph without exploring informative relational information among object attributes. This significantly limits their modeling capability and…
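
To make the 1-D versus 2-D distinction in the abstract concrete, the following is a minimal NumPy sketch. It only illustrates the general idea: conv_1d is a standard GCN-style layer that filters node features along the object link graph, while conv_2d additionally filters along an attribute-affinity graph. It is not the dimensionwise separable construction proposed in the paper, and the names A_obj, A_attr, conv_1d and conv_2d are assumptions made for this example.

import numpy as np

def normalize_adj(A):
    # Symmetric renormalization with self-loops: D^{-1/2} (A + I) D^{-1/2}.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    return A_hat / np.sqrt(np.outer(d, d))

def conv_1d(A_obj, X, W):
    # 1-D graph convolution (GCN-style): aggregate each node's neighbors on
    # the object link graph, then apply a linear transform and ReLU.
    return np.maximum(normalize_adj(A_obj) @ X @ W, 0.0)

def conv_2d(A_obj, A_attr, X, W):
    # Generic 2-D graph convolution: filter along the object graph (rows of X)
    # and along an attribute-affinity graph (columns of X).
    return np.maximum(normalize_adj(A_obj) @ X @ normalize_adj(A_attr) @ W, 0.0)

# Toy usage: 4 objects with 3 attributes, projected to 2 output channels.
rng = np.random.default_rng(0)
A_obj = np.array([[0., 1., 0., 0.],
                  [1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [0., 0., 1., 0.]])
A_attr = np.array([[0., 1., 0.],
                   [1., 0., 1.],
                   [0., 1., 0.]])
X = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))
print(conv_1d(A_obj, X, W).shape)          # (4, 2)
print(conv_2d(A_obj, A_attr, X, W).shape)  # (4, 2)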

References

Showing 1–10 of 89 references
Inductive Representation Learning on Large Graphs
TLDR: GraphSAGE is presented, a general inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
MGAE: Marginalized Graph Autoencoder for Graph Clustering
TLDR: A marginalized graph convolutional network is proposed that corrupts network node content, allowing node content to interact with network features, and marginalizes the corrupted features in a graph autoencoder context to learn graph feature representations.
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
TLDR: This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
Variational Graph Auto-Encoders
TLDR: The variational graph auto-encoder (VGAE) is introduced, a framework for unsupervised learning on graph-structured data based on the variational auto-encoder (VAE) that can naturally incorporate node features, which significantly improves predictive performance on a number of benchmark datasets.
Kipf and Max Welling
  • 2017
Revisiting Semi-Supervised Learning with Graph Embeddings
TLDR: On a large and diverse set of benchmark tasks, including text classification, distantly supervised entity extraction, and entity classification, the proposed semi-supervised learning framework shows improved performance over many of the existing models.
Label Efficient Semi-Supervised Learning via Graph Filtering
TLDR: This paper proposes a graph filtering framework that injects graph similarity into data features by taking them as signals on the graph and applying a low-pass graph filter to extract useful data representations for classification; label efficiency can be achieved by conveniently adjusting the strength of the graph filter (a minimal sketch of such a filter appears after this reference list).
Graph Random Neural Networks for Semi-Supervised Learning on Graphs
TLDR: GRAND is presented, a simple yet effective framework for semi-supervised learning on graphs that first designs a random propagation strategy to perform graph data augmentation and then leverages consistency regularization to optimize the prediction consistency of unlabeled nodes across different data augmentations.
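
To make the low-pass filtering idea described in "Label Efficient Semi-Supervised Learning via Graph Filtering" concrete, here is a minimal sketch assuming the filter is k applications of the renormalized adjacency matrix; the exact filter used in that paper may differ, and low_pass_filter is an illustrative name, not an API from that work. Repeated multiplication by the renormalized adjacency attenuates the high-frequency components of the feature signal, and k adjusts the filter strength.

import numpy as np

def low_pass_filter(A, X, k=2):
    # Smooth node features X with S = D^{-1/2} (A + I) D^{-1/2} applied k
    # times; larger k gives a stronger (smoother) low-pass filter.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    S = A_hat / np.sqrt(np.outer(d, d))
    for _ in range(k):
        X = S @ X
    return X

# The filtered features can then be fed to any off-the-shelf classifier for
# label-efficient semi-supervised classification.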