Corpus ID: 221397582

Stochastic Graph Recurrent Neural Network

Tijin Yan, Hongwei Zhang, Zirui Li, Yuanqing Xia
Representation learning over graph-structured data has been widely studied because of its broad application prospects. However, previous methods mainly focus on static graphs, while many real-world graphs evolve over time. Modeling such evolution is important for predicting properties of unseen networks. To address this challenge, we propose SGRNN, a novel neural architecture that applies stochastic latent variables to simultaneously capture the evolution in node attributes and topology. Specifically… 
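The abstract gives no implementation details, so the following is only a rough, hypothetical sketch of the kind of stochastic graph recurrent cell it describes: GCN-style symmetric-normalized propagation over the adjacency matrix, a Gaussian latent variable sampled via the reparameterization trick, and a recurrent hidden-state update conditioned on both. All names, dimensions, and the update rule below are assumptions for illustration, not the authors' actual SGRNN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize_adj(A):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as used in GCNs."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

class StochasticGraphRNNCell:
    """Hypothetical sketch of a stochastic graph recurrent cell.

    Per time step: aggregate node features over the graph, infer a
    per-node Gaussian latent z by reparameterization, then update the
    recurrent hidden state from the aggregated input, z, and the
    previous state.
    """
    def __init__(self, in_dim, hid_dim, z_dim, seed=0):
        g = np.random.default_rng(seed)
        s = 0.1  # small random init for the weight matrices
        self.W_mu = g.normal(0, s, (in_dim + hid_dim, z_dim))
        self.W_logvar = g.normal(0, s, (in_dim + hid_dim, z_dim))
        self.W_h = g.normal(0, s, (in_dim + z_dim + hid_dim, hid_dim))

    def step(self, A_norm, X, h, rng):
        X_agg = A_norm @ X                      # graph propagation
        inp = np.concatenate([X_agg, h], axis=1)
        mu = inp @ self.W_mu                    # latent mean
        logvar = inp @ self.W_logvar            # latent log-variance
        # Reparameterization trick: z = mu + sigma * eps
        z = mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)
        h_new = np.tanh(np.concatenate([X_agg, z, h], axis=1) @ self.W_h)
        return h_new, (mu, logvar)

# Usage on a 3-node path graph with 4-dim node features
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
A_norm = normalize_adj(A)
cell = StochasticGraphRNNCell(in_dim=4, hid_dim=8, z_dim=2)
h = np.zeros((3, 8))
X = rng.normal(size=(3, 4))
h, (mu, logvar) = cell.step(A_norm, X, h, rng)
```

Training such a cell would typically maximize an evidence lower bound, with a KL term between the inferred latent distribution and a prior, as in variational autoencoders; that objective is omitted here.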


Revisiting Graph Convolutional Network on Semi-Supervised Node Classification from an Optimization Perspective
A universal theoretical framework for GCN is established from an optimization perspective, and a novel convolutional kernel named GCN+ is derived that has fewer parameters while inherently relieving over-smoothing.
Implementing graph neural networks with TensorFlow-Keras
The kgcnn Python package is developed; it provides a set of Keras layers for graph networks that focus on a transparent tensor structure passed between layers and an ease-of-use mindset.


Variational Graph Recurrent Neural Networks
A novel hierarchical variational model is developed that introduces additional latent random variables to jointly model the hidden states of a graph recurrent neural network (GRNN) to capture both topology and node attribute changes in dynamic graphs.
dyngraph2vec: Capturing Network Dynamics using Dynamic Graph Representation Learning
DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks
DySAT is a novel neural architecture that learns node representations capturing dynamic graph structural evolution; an ablation study validates the effectiveness of jointly modeling structural and temporal self-attention.
A Deep Learning Approach to Link Prediction in Dynamic Networks
A novel deep learning framework, the Conditional Temporal Restricted Boltzmann Machine (ctRBM), is proposed; it predicts links based on individual transition variance as well as influence from local neighbors, and it outperforms existing algorithms in link inference on dynamic networks.
Learning to Represent the Evolution of Dynamic Graphs with Recurrent Models
This paper proposes an unsupervised representation learning architecture for dynamic graphs, designed to learn both the topological and temporal features of graphs that evolve over time. It demonstrates that the approach can learn the representation of a dynamic graph through time by applying the embeddings to dynamic graph classification on a real-world dataset of animal behaviour.
DynGEM: Deep Embedding Method for Dynamic Graphs
This work presents an efficient algorithm DynGEM, based on recent advances in deep autoencoders for graph embeddings, that can handle growing dynamic graphs, and has better running time than using static embedding methods on each snapshot of a dynamic graph.
Restricted Boltzmann Machine-Based Approaches for Link Prediction in Dynamic Networks
A novel framework that incorporates a deep learning method, i.e., a temporal restricted Boltzmann machine, and a machine learning approach, i.e. …
Dynamic Graph Convolutional Networks
Structured Sequence Modeling with Graph Convolutional Recurrent Networks
The proposed model combines convolutional neural networks on graphs to identify spatial structures with recurrent neural networks to find dynamic patterns in data structured by an arbitrary graph.