A Comprehensive Survey on Graph Neural Networks
- Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, Philip S. Yu
- Computer Science · IEEE Transactions on Neural Networks and Learning Systems
This article provides a comprehensive overview of graph neural networks (GNNs) in data mining and machine learning and proposes a new taxonomy that divides state-of-the-art GNNs into four categories: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.
Graph WaveNet for Deep Spatial-Temporal Graph Modeling
This paper proposes a novel graph neural network architecture, Graph WaveNet, for spatial-temporal graph modeling; it develops a novel adaptive dependency matrix, learned through node embeddings, which can precisely capture the hidden spatial dependencies in the data.
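The adaptive dependency matrix described above can be sketched roughly as a row-wise softmax over the ReLU of the product of two node-embedding tables. The sketch below uses random embeddings for illustration only; in the actual model the embedding tables are learnable parameters trained end to end by gradient descent, and the function names here are hypothetical.

```python
import numpy as np

def adaptive_adjacency(E1, E2):
    """Adaptive dependency matrix in the Graph WaveNet style:
    A = softmax(relu(E1 @ E2.T)), with a row-wise softmax so each
    row is a probability distribution over candidate neighbors."""
    logits = np.maximum(E1 @ E2.T, 0.0)          # ReLU
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    exp = np.exp(logits)
    return exp / exp.sum(axis=1, keepdims=True)  # rows sum to 1

rng = np.random.default_rng(0)
num_nodes, emb_dim = 5, 8
E1 = rng.standard_normal((num_nodes, emb_dim))  # source node embeddings (stand-in for learned params)
E2 = rng.standard_normal((num_nodes, emb_dim))  # target node embeddings (stand-in for learned params)
A = adaptive_adjacency(E1, E2)                  # (num_nodes, num_nodes), nonnegative, rows sum to 1
```

Because `A` is produced from embeddings rather than a fixed road-network graph, the model can discover spatial dependencies that are absent from, or misrepresented in, any predefined adjacency matrix.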
Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks
- Zonghan Wu, Shirui Pan, Guodong Long, Jing Jiang, Xiaojun Chang, C. Zhang
- Computer Science · KDD
- 24 May 2020
This paper proposes a general graph neural network framework designed specifically for multivariate time series data; it outperforms state-of-the-art baseline methods on 3 of 4 benchmark datasets and achieves on-par performance with other approaches on two traffic datasets that provide extra structural information.
Personalized Federated Learning With a Graph
This paper proposes a novel structured federated learning (SFL) framework that learns the global and personalized models simultaneously from client-wise relation graphs and clients' private data, and casts SFL with a graph as a novel optimization problem that models client-wise complex relations and graph-based structural topology in a unified framework.
Heterogeneous Graph Attention Network for Small and Medium-Sized Enterprises Bankruptcy Prediction
A heterogeneous-attention-network-based model (HAT) is proposed to facilitate SME bankruptcy prediction using publicly accessible data; it has two major components: a heterogeneous neighborhood encoding layer and a triple attention output layer.
Beyond Low-pass Filtering: Graph Convolutional Networks with Automatic Filtering
- Zonghan Wu, Shirui Pan, Guodong Long, Jing Jiang, Chengqi Zhang
- Computer Science · IEEE Transactions on Knowledge and Data Engineering
- 10 July 2021
Although based on graph spectral theory, the proposed Automatic Graph Convolutional Network (AutoGCN) is also localized in space and has a spatial form; experimental results show that AutoGCN achieves significant improvement over baseline methods that only work as low-pass filters.
Spatio-Temporal Joint Graph Convolutional Networks for Traffic Forecasting
- Chuanpan Zheng, Xiaoliang Fan, Shirui Pan, Zonghan Wu, Cheng Wang, Philip S. Yu
- Computer Science · arXiv
- 25 November 2021
Spatio-Temporal Joint Graph Convolutional Networks (STJGCN) are proposed for traffic forecasting several time steps ahead on a road network; the approach is computationally efficient and outperforms 11 state-of-the-art baseline methods.
TraverseNet: Unifying Space and Time in Message Passing
- Zonghan Wu, Da Zheng, Shirui Pan, Quan Gan, Guodong Long, G. Karypis
- Computer Science · IEEE Transactions on Neural Networks and Learning Systems
- 25 August 2021
This article aims to unify spatial dependency and temporal dependency in a non-Euclidean space while capturing the inner spatial-temporal dependencies of traffic data by proposing TraverseNet, a novel spatial-temporal graph neural network that views space and time as an inseparable whole.
Personalized Federated Learning With Structure
A novel structured federated learning framework is proposed to simultaneously learn the global model and personalized models using each client's local relations with others and its private dataset; the effectiveness of the proposed method is demonstrated under varying degrees of non-IID data settings.
ConTIG: Continuous Representation Learning on Temporal Interaction Graphs
A two-module framework named ConTIG is proposed: a continuous representation method that captures the continuous dynamic evolution of node embedding trajectories and introduces a self-attention mechanism to predict future node embeddings by aggregating historical temporal interaction information.