Corpus ID: 219966724

Connecting Graph Convolutional Networks and Graph-Regularized PCA

@article{Zhao2020ConnectingGC,
  title={Connecting Graph Convolutional Networks and Graph-Regularized PCA},
  author={Lingxiao Zhao and L. Akoglu},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.12294}
}
The graph convolution operator of the GCN model was originally motivated as a localized first-order approximation of spectral graph convolutions. This work takes a different view, establishing a connection between graph convolution and graph-regularized PCA. Based on this connection, the GCN architecture, formed by stacking graph convolution layers, shares a close relationship with stacking graph-regularized PCA (GPCA). We empirically demonstrate that the unsupervised embeddings produced by GPCA, paired with…
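As a rough sketch of the kind of connection involved (an illustration using the common convention that the normalized Laplacian is \tilde{L} = I - \hat{A}, where \hat{A} is the symmetrically normalized adjacency matrix with self-loops; this is not necessarily the paper's exact derivation), graph convolution can be read as a first-order approximation to the closed-form smoother of a graph-regularized reconstruction objective:

    \min_{Z}\ \|Z - X\|_F^2 + \alpha\,\operatorname{tr}\!\big(Z^\top \tilde{L} Z\big)
    \;\Longrightarrow\;
    Z^{\ast} = (I + \alpha \tilde{L})^{-1} X
    \approx (I - \alpha \tilde{L})\,X
    \overset{\alpha = 1}{=} \hat{A}\,X .

Under this reading, a graph convolution layer \sigma(\hat{A} X W) amounts to an approximate GPCA-style smoothing followed by a learned linear projection and a nonlinearity.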
1 Citation
Structural attention network for graph
We present a structural attention network (SAN) for graph modeling, a novel approach to learning node representations based on graph attention networks (GATs), with the introduction of two…

References

Showing 1-10 of 22 references
Semi-Supervised Classification with Graph Convolutional Networks
A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operates directly on graphs and outperforms related methods by a significant margin.
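For reference, the layer-wise propagation rule of the GCN model from this paper (with \tilde{A} = A + I and \tilde{D} its degree matrix) is

    H^{(l+1)} = \sigma\big(\tilde{D}^{-1/2}\,\tilde{A}\,\tilde{D}^{-1/2}\, H^{(l)} W^{(l)}\big),

which is the graph convolution operator that the main paper relates to graph-regularized PCA.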
Simplifying Graph Convolutional Networks
This paper successively removes nonlinearities and collapses weight matrices between consecutive layers, theoretically analyzes the resulting linear model, and shows that it corresponds to a fixed low-pass filter followed by a linear classifier.
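A minimal NumPy sketch of the resulting linear model (illustrative only: it assumes K-step propagation with the symmetrically normalized adjacency followed by an ordinary logistic-regression classifier; the variable names here are made up):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def normalized_adjacency(A):
        # Symmetrically normalized adjacency with self-loops: D^{-1/2} (A + I) D^{-1/2}
        A_hat = A + np.eye(A.shape[0])
        d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
        return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    def sgc_features(A, X, K=2):
        # Fixed low-pass filtering: propagate features K hops with no nonlinearities.
        S = normalized_adjacency(A)
        for _ in range(K):
            X = S @ X
        return X

    # Usage (hypothetical arrays): smooth the features once, then fit a linear classifier.
    # X_s = sgc_features(A, X, K=2)
    # clf = LogisticRegression(max_iter=1000).fit(X_s[train_idx], y[train_idx])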
Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning
It is shown that the graph convolution of the GCN model is actually a special form of Laplacian smoothing, which is the key reason why GCNs work, but which also raises concerns of over-smoothing when many convolution layers are stacked.
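As a hedged illustration of that observation (standard Laplacian-smoothing notation, with \tilde{A} = A + I, degree matrix \tilde{D}, and \tilde{L} = \tilde{D} - \tilde{A}; not the paper's exact wording), one step of Laplacian smoothing on features X is

    Y = \big(I - \gamma\, \tilde{D}^{-1} \tilde{L}\big) X \overset{\gamma = 1}{=} \tilde{D}^{-1} \tilde{A}\, X ,

and its symmetric variant \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2} X is the GCN graph convolution; applying it many times pushes the features of connected nodes toward one another, which is the over-smoothing concern noted above.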
Deep Graph Infomax
Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
Revisiting Graph Neural Networks: All We Have is Low-Pass Filters
The results indicate that graph neural networks only perform low-pass filtering on feature vectors and do not have a non-linear manifold learning property; insights for the design of GCN-based graph neural networks are proposed.
How Powerful are Graph Neural Networks?
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among this class of GNNs.
On the Equivalence between Positional Node Embeddings and Structural Graph Representations
This work provides the first unifying theoretical framework for node embeddings and structural graph representations, bridging methods like matrix factorization and graph neural networks, and introduces new practical guidelines for generating and using node embeddings, which further augment standard operating procedures used today.
PairNorm: Tackling Oversmoothing in GNNs
PairNorm is a novel normalization layer based on a careful analysis of the graph convolution operator; it prevents all node embeddings from becoming too similar and significantly boosts performance in a new problem setting that benefits from deeper GNNs.
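A minimal NumPy sketch of a PairNorm-style step as commonly described (center the embeddings, then rescale so the mean squared row norm is fixed; the scale s is a hyperparameter, and this is an assumption-laden illustration rather than the authors' reference code):

    import numpy as np

    def pairnorm(X, s=1.0):
        # X: node embeddings of shape (num_nodes, dim).
        # Step 1: center each feature dimension across nodes.
        X_c = X - X.mean(axis=0, keepdims=True)
        # Step 2: rescale so the mean squared row norm equals s^2, which keeps
        # total pairwise squared distances from collapsing as depth grows.
        scale = s / np.sqrt((X_c ** 2).sum(axis=1).mean() + 1e-12)
        return scale * X_c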
Graph-Laplacian PCA: Closed-Form Solution and Robustness
A graph-Laplacian PCA (gLPCA) is proposed to learn a low-dimensional representation of X that incorporates the graph structure encoded in W; it is capable of removing corruptions and shows promising results on image reconstruction and significant improvements on clustering and classification.
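In hedged form, the gLPCA objective referenced here (with data matrix X, graph Laplacian L built from the affinity matrix W, low-dimensional representation Q, and basis U; the robust variant's normalization may differ) is

    \min_{U,\,Q}\ \|X - U Q^\top\|_F^2 + \alpha\, \operatorname{tr}\!\big(Q^\top L Q\big)
    \quad \text{s.t. } Q^\top Q = I ,

whose closed-form solution takes the columns of Q to be the eigenvectors of -X^\top X + \alpha L with the smallest eigenvalues and sets U = X Q.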