Corpus ID: 238856809

sMGC: A Complex-Valued Graph Convolutional Network via Magnetic Laplacian for Directed Graphs

@article{Zhang2021sMGCAC,
  title={sMGC: A Complex-Valued Graph Convolutional Network via Magnetic Laplacian for Directed Graphs},
  author={Jie Zhang and Bo Hui and P. Harn and Min-Te Sun and Wei-Shinn Ku},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.07570}
}
  • Published 14 October 2021
  • Computer Science, Engineering
Recent advancements in Graph Neural Networks have led to state-of-the-art performance in representation learning on graphs for node classification. However, the majority of existing works process directed graphs by symmetrization, which may cause a loss of directional information. In this paper, we propose the magnetic Laplacian, a deformation of the combinatorial Laplacian that preserves edge directionality by encoding it into a complex phase. In addition, we design an AutoRegressive Moving…
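As a rough illustration of the idea in the abstract (a minimal numpy sketch, not the authors' implementation; the charge parameter `q` and the unnormalized form are assumptions), the magnetic Laplacian encodes edge direction in a complex phase on top of the symmetrized adjacency:

```python
import numpy as np

def magnetic_laplacian(A, q=0.25):
    """Unnormalized magnetic Laplacian L = D_s - A_s * exp(i * Theta).

    A : (n, n) directed adjacency matrix (real, nonnegative).
    q : charge parameter; q = 0 recovers the combinatorial Laplacian
        of the symmetrized graph (direction is discarded).
    """
    A_s = 0.5 * (A + A.T)                 # symmetrized edge weights
    Theta = 2.0 * np.pi * q * (A - A.T)   # antisymmetric phase encodes direction
    H = A_s * np.exp(1j * Theta)          # Hermitian "magnetic" adjacency
    D_s = np.diag(A_s.sum(axis=1))        # degrees of the symmetrized graph
    return D_s - H

# A directed 3-cycle: 0 -> 1 -> 2 -> 0
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
L = magnetic_laplacian(A, q=0.25)
assert np.allclose(L, L.conj().T)                # Hermitian by construction
assert np.all(np.linalg.eigvalsh(L) >= -1e-9)    # real, nonnegative spectrum
```

Because the phase is antisymmetric and the magnitude symmetric, the matrix is Hermitian, so it has a real spectrum and spectral filtering remains well defined even though the graph is directed.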


References

SHOWING 1-10 OF 48 REFERENCES
MagNet: A Magnetic Neural Network for Directed Graphs
TLDR
MagNet is proposed, a spectral GNN for directed graphs based on a complex Hermitian matrix known as the magnetic Laplacian; it outperforms other spectral GNNs on directed-graph node classification and link prediction across a variety of datasets, and outperforms commonly used spatial GNNs on a majority of such tasks.
Simple Spectral Graph Convolution
TLDR
This paper uses a modified Markov diffusion kernel to derive a variant of GCN called Simple Spectral Graph Convolution (S2GC); spectral analysis shows that the filter used in S2GC is a trade-off between low- and high-pass filter bands, capturing both the global and the local context of each node.
Geom-GCN: Geometric Graph Convolutional Networks
TLDR
The proposed aggregation scheme is permutation-invariant and consists of three modules: node embedding, structural neighborhood, and bi-level aggregation. An implementation of the scheme in graph convolutional networks, termed Geom-GCN, performs transductive learning on graphs.
Revisiting Graph Neural Networks: All We Have is Low-Pass Filters
TLDR
The results indicate that graph neural networks only perform low-pass filtering on feature vectors and do not possess a non-linear manifold-learning property; the paper also offers insights for the design of GCN-based graph neural networks.
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
TLDR
This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
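The fast localized filters described above are Chebyshev polynomials of a rescaled Laplacian, applied by a three-term recurrence instead of an eigendecomposition. A minimal sketch (the coefficient values `theta` are hypothetical; in the actual method they are learned):

```python
import numpy as np

def cheb_filter(L, X, theta):
    """Apply the spectral filter sum_k theta[k] * T_k(L~) @ X, where T_k are
    Chebyshev polynomials and L~ is L rescaled so its spectrum lies in [-1, 1].

    L     : (n, n) symmetric graph Laplacian.
    X     : (n, d) node-feature matrix.
    theta : K filter coefficients (hypothetical; learned in practice).
    """
    lmax = np.linalg.eigvalsh(L).max()
    L_t = 2.0 * L / lmax - np.eye(L.shape[0])       # rescale spectrum to [-1, 1]
    Tx_prev, Tx = X, L_t @ X                        # T_0(L~) X and T_1(L~) X
    out = theta[0] * Tx_prev
    for k in range(1, len(theta)):
        out = out + theta[k] * Tx
        Tx_prev, Tx = Tx, 2.0 * L_t @ Tx - Tx_prev  # Chebyshev recurrence
    return out
```

A degree-K polynomial of the Laplacian only mixes nodes up to K hops apart, which is what makes the filter spatially localized while remaining a spectral construction.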
Analyzing the Expressive Power of Graph Neural Networks in a Spectral Perspective
TLDR
It is argued that a spectral analysis of GNN behavior can provide a complementary point of view for a deeper understanding of GNNs; the paper theoretically demonstrates equivalences of the graph convolution process regardless of whether it is designed in the spatial or the spectral domain.
Diffusion Improves Graph Learning
TLDR
This work removes the restriction of using only the direct neighbors by introducing a powerful, yet spatially localized graph convolution: Graph diffusion convolution (GDC), which leverages generalized graph diffusion and alleviates the problem of noisy and often arbitrarily defined edges in real graphs.
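One common instantiation of generalized graph diffusion is personalized PageRank; as a rough dense-matrix sketch (parameter names and the sparsification threshold are illustrative, not the paper's exact setup):

```python
import numpy as np

def ppr_diffusion(A, alpha=0.15, eps=1e-4):
    """Personalized-PageRank diffusion matrix, one instance of graph diffusion:
        S = alpha * (I - (1 - alpha) * T)^(-1),
    with T a column-stochastic transition matrix. Small entries are thresholded
    to keep the result sparse, as diffusion densifies the graph.
    """
    n = A.shape[0]
    T = A / A.sum(axis=0, keepdims=True)   # column-normalized random walk
    S = alpha * np.linalg.inv(np.eye(n) - (1.0 - alpha) * T)
    S[S < eps] = 0.0                       # sparsification step
    return S
```

The diffused matrix S then replaces the raw adjacency in message passing, so each node aggregates from a smoothed multi-hop neighborhood rather than only its direct neighbors.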
Interpreting and Unifying Graph Neural Networks with An Optimization Framework
TLDR
A surprising connection is established between different propagation mechanisms via a unified optimization problem: despite the proliferation of GNN variants, their propagation mechanisms are in fact optimal solutions to a feature-fitting objective, over a wide class of graph kernels, with a graph-regularization term.
Simplifying Graph Convolutional Networks
TLDR
This paper successively removes nonlinearities and collapses weight matrices between consecutive layers, then theoretically analyzes the resulting linear model and shows that it corresponds to a fixed low-pass filter followed by a linear classifier.
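The collapsed model reduces training to a feature-preprocessing step: propagate features K times with the normalized adjacency, then fit any linear classifier. A minimal sketch under that reading (function and parameter names are illustrative):

```python
import numpy as np

def sgc_features(A, X, K=2):
    """Precompute X' = S^K @ X, where S is the symmetrically normalized
    adjacency with self-loops; a plain linear/logistic classifier is then
    trained on X' in place of a multi-layer GCN.
    """
    n = A.shape[0]
    A_hat = A + np.eye(n)                  # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(d ** -0.5)
    S = D_inv_sqrt @ A_hat @ D_inv_sqrt    # symmetric normalization
    for _ in range(K):
        X = S @ X                          # K fixed propagation (smoothing) steps
    return X
```

Because all the graph-dependent work happens once up front, the remaining training is an ordinary (convex) linear fit, which is the source of the method's speed.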
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.