Stochastic Graph Recurrent Neural Network
@article{Yan2020StochasticGR,
  title   = {Stochastic Graph Recurrent Neural Network},
  author  = {Tijin Yan and Hongwei Zhang and Zirui Li and Yuanqing Xia},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2009.00538}
}
Representation learning over graph-structured data has been widely studied due to its broad range of applications. However, previous methods mainly focus on static graphs, while many real-world graphs evolve over time. Modeling such evolution is important for predicting properties of unseen networks. To resolve this challenge, we propose SGRNN, a novel neural architecture that applies stochastic latent variables to simultaneously capture the evolution in node attributes and topology. Specifically…
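The abstract's core idea, a graph recurrence whose hidden state is perturbed by a stochastic latent variable per time step, can be sketched loosely as follows. This is an illustrative toy under assumed shapes and update rules, not the paper's actual SGRNN: `graph_propagate`, `sgrnn_step`, and all weight names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def graph_propagate(A, X, W):
    """Mean-aggregation graph convolution, roughly D^{-1} A X W (simplified)."""
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0          # guard isolated nodes against division by zero
    return (A @ X) / deg @ W

def sgrnn_step(A_t, X_t, h_prev, params):
    """One hypothetical recurrent step with a stochastic latent variable z_t.

    z_t ~ N(mu, sigma^2), where (mu, log_sigma) are read off the
    graph-propagated input and the previous hidden state; the next hidden
    state mixes the deterministic recurrence with the sampled z_t.
    """
    Wx, Wh, Wmu, Wsig = params
    m = np.tanh(graph_propagate(A_t, X_t, Wx) + h_prev @ Wh)
    mu, log_sigma = m @ Wmu, m @ Wsig
    z = mu + np.exp(log_sigma) * rng.standard_normal(mu.shape)  # reparameterization trick
    return np.tanh(m + z)        # next per-node hidden state

# Toy dynamic graph: 4 nodes, 3 input features, hidden size 5, two snapshots.
n, f, h = 4, 3, 5
params = (rng.standard_normal((f, h)), rng.standard_normal((h, h)),
          rng.standard_normal((h, h)), rng.standard_normal((h, h)) * 0.01)
h_t = np.zeros((n, h))
for _ in range(2):               # iterate over graph snapshots (changing topology + attributes)
    A_t = (rng.random((n, n)) < 0.5).astype(float)
    X_t = rng.standard_normal((n, f))
    h_t = sgrnn_step(A_t, X_t, h_t, params)
print(h_t.shape)                 # (4, 5)
```

In a trained model the latent variables would be fit variationally (as in VGRNN, cited below); here the sampling step only shows where stochasticity enters the recurrence.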
3 Citations
Revisiting Graph Convolutional Network on Semi-Supervised Node Classification from an Optimization Perspective
- Computer Science · ArXiv
- 2020
A universal theoretical framework for GCN is established from an optimization perspective, and a novel convolutional kernel named GCN+ is derived that has fewer parameters while inherently relieving over-smoothing.
Graph neural networks in TensorFlow-Keras with RaggedTensor representation (kgcnn)
- Computer Science · Softw. Impacts
- 2021
Implementing graph neural networks with TensorFlow-Keras
- Computer Science · ArXiv
- 2021
The Keras Graph Convolutional Neural Network Python package kgcnn is developed, providing a set of Keras layers for graph networks that focus on a transparent tensor structure passed between layers and an ease-of-use mindset.
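The ragged-tensor representation kgcnn relies on, batching graphs of different sizes without padding, can be illustrated without TensorFlow. The flat-values-plus-row-splits layout below mirrors what `tf.RaggedTensor` stores internally; `per_graph_mean` is a hypothetical pooling helper, not a kgcnn API.

```python
import numpy as np

# A "ragged" batch of graphs: node features flattened into one array,
# with row_splits marking where each graph's nodes begin and end.
node_feats = np.arange(18.0).reshape(9, 2)   # 9 nodes total, 2 features each
row_splits = np.array([0, 3, 4, 9])          # graph sizes: 3, 1, 5 nodes

def per_graph_mean(values, splits):
    """Segment-wise mean pooling: one readout vector per graph, no padding needed."""
    return np.stack([values[a:b].mean(axis=0)
                     for a, b in zip(splits[:-1], splits[1:])])

pooled = per_graph_mean(node_feats, row_splits)
print(pooled)   # rows: [2, 3], [6, 7], [12, 13]
```

Compared with zero-padding every graph to the largest size, this layout wastes no memory and keeps per-graph boundaries explicit for segment operations.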
References
SHOWING 1-10 OF 43 REFERENCES
Variational Graph Recurrent Neural Networks
- Computer Science · NeurIPS
- 2019
A novel hierarchical variational model is developed that introduces additional latent random variables to jointly model the hidden states of a graph recurrent neural network (GRNN), capturing both topology and node-attribute changes in dynamic graphs.
dyngraph2vec: Capturing Network Dynamics using Dynamic Graph Representation Learning
- Computer Science · Knowl. Based Syst.
- 2020
DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks
- Computer Science · WSDM
- 2020
DySAT is a novel neural architecture that learns node representations capturing dynamic graph structural evolution; an ablation study validates the effectiveness of jointly modeling structural and temporal self-attention.
A Deep Learning Approach to Link Prediction in Dynamic Networks
- Computer Science · SDM
- 2014
A novel deep learning framework, the Conditional Temporal Restricted Boltzmann Machine (ctRBM), is proposed that predicts links based on individual transition variance as well as influence from local neighbors, and outperforms existing algorithms in link inference on dynamic networks.
Learning to Represent the Evolution of Dynamic Graphs with Recurrent Models
- Computer Science · WWW
- 2019
This paper proposes an unsupervised representation learning architecture for dynamic graphs, designed to learn both the topological and temporal features of graphs that evolve over time, and demonstrates that the approach can learn a dynamic graph's representation through time by applying the embeddings to dynamic graph classification on a real-world dataset of animal behaviour.
DynGEM: Deep Embedding Method for Dynamic Graphs
- Computer Science · ArXiv
- 2018
This work presents DynGEM, an efficient algorithm based on recent advances in deep autoencoders for graph embeddings, which handles growing dynamic graphs and has better running time than applying static embedding methods to each snapshot of a dynamic graph.
Restricted Boltzmann Machine-Based Approaches for Link Prediction in Dynamic Networks
- Computer Science · IEEE Access
- 2018
A novel framework is proposed that incorporates a deep learning method, i.e., the temporal restricted Boltzmann machine, and a machine learning approach, i.e., …
Structured Sequence Modeling with Graph Convolutional Recurrent Networks
- Computer Science · ICONIP
- 2018
The proposed model combines convolutional neural networks on graphs, to identify spatial structures, with RNNs, to find dynamic patterns, in data structured by an arbitrary graph.