Corpus ID: 14249137

Variational Graph Auto-Encoders

@article{Kipf2016VariationalGA,
  title={Variational Graph Auto-Encoders},
  author={Thomas Kipf and Max Welling},
  journal={ArXiv},
  year={2016},
  volume={abs/1611.07308}
}
We introduce the variational graph auto-encoder (VGAE), a framework for unsupervised learning on graph-structured data based on the variational auto-encoder (VAE). [...] We demonstrate this model using a graph convolutional network (GCN) encoder and a simple inner product decoder. Our model achieves competitive results on a link prediction task in citation networks. In contrast to most existing models for unsupervised learning on graph-structured data and link prediction, our model can naturally …
Variational Graph Normalized AutoEncoders
This paper proposes a novel Variational Graph Normalized AutoEncoder (VGNAE) that utilizes L2-normalization to derive better embeddings for isolated nodes, and shows that VGNAEs outperform existing state-of-the-art models on link prediction tasks.
Graph Embedding For Link Prediction Using Residual Variational Graph Autoencoders
A novel graph embedding method called Residual Variational Graph Autoencoder (RVGAE) is proposed, which improves the performance of variational graph autoencoding by means of residual connections.
Multi-Task Graph Autoencoders
  • P. Tran
  • Computer Science, Mathematics
  • ArXiv
  • 2018
This work presents a new autoencoder architecture capable of learning a joint representation of local graph structure and available node features for the simultaneous multi-task learning of unsupervised link prediction and semi-supervised node classification.
Dynamic Joint Variational Graph Autoencoders
Dyn-VGAE provides a joint learning framework for computing temporal representations of all graph snapshots simultaneously, and can learn both local structure and temporal evolutionary patterns in a dynamic network.
Semi-Implicit Graph Variational Auto-Encoders
SIG-VAE employs a hierarchical variational framework to enable neighboring-node sharing for better generative modeling of graph dependency structure, together with a Bernoulli-Poisson link decoder that provides a more flexible generative graph model.
Correlated Variational Auto-Encoders
To address the intractability introduced by the correlated prior, an approximation is developed that averages a set of tractable lower bounds over all maximal acyclic subgraphs of the undirected correlation graph.
Effective Decoding in Graph Auto-Encoder using Triadic Closure
This paper incorporates the well-known triadic closure property, exhibited by many real-world networks, into a triad decoder that considers and predicts the three edges of a local triad together, and that can be used in any graph-based auto-encoder.
Learning to Make Predictions on Graphs with Autoencoders
  • P. Tran
  • Computer Science, Mathematics
  • 2018 IEEE 5th International Conference on Data Science and Advanced Analytics (DSAA)
  • 2018
This work presents a novel autoencoder architecture capable of learning a joint representation of both local graph structure and available node features for the multi-task learning of link prediction and node classification.
Learning Graphon Autoencoders for Generative Graph Modeling
This work develops an efficient algorithm to learn the encoder and the decoder by minimizing the Wasserstein distance between the model and data distributions, providing a new paradigm for representing and generating graphs with good generalizability and transferability.
Variational Graph Recurrent Neural Networks
A novel hierarchical variational model is developed that introduces additional latent random variables to jointly model the hidden states of a graph recurrent neural network (GRNN), capturing both topology and node-attribute changes in dynamic graphs.

References

Showing 1-10 of 14 references
Semi-Supervised Classification with Graph Convolutional Networks
A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operates directly on graphs and outperforms related methods by a significant margin.
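The efficient convolution variant this reference introduces rests on a symmetric renormalization of the adjacency matrix, D^(-1/2)(A + I)D^(-1/2), applied once before training. A small NumPy sketch of that preprocessing step:

```python
import numpy as np

def normalize_adj(adj):
    # Add self-loops, then apply symmetric normalization
    # D^{-1/2} (A + I) D^{-1/2} as in the GCN renormalization trick.
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    return a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
```

Each GCN layer then multiplies by this fixed matrix, so propagation stays linear in the number of edges when the adjacency is stored sparsely.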
node2vec: Scalable Feature Learning for Networks
node2vec is an algorithmic framework for learning continuous feature representations for nodes in networks; it defines a flexible notion of a node's network neighborhood and designs a biased random walk procedure that efficiently explores diverse neighborhoods.
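The biased walk described above is second-order: the transition probability out of the current node depends on the previous node, controlled by a return parameter p and an in-out parameter q. A minimal sketch, assuming the graph is given as an adjacency dict (the function name and representation are illustrative, not from the paper):

```python
import numpy as np

def next_node(graph, prev, curr, p, q, rng):
    # graph: dict mapping node -> set of neighbor nodes.
    # node2vec edge biases, given the walk arrived at curr from prev:
    #   weight 1/p to return to prev,
    #   weight 1   to move to a common neighbor of prev (distance 1),
    #   weight 1/q to move further away (distance 2).
    nbrs = sorted(graph[curr])
    weights = []
    for x in nbrs:
        if x == prev:
            weights.append(1.0 / p)
        elif x in graph[prev]:
            weights.append(1.0)
        else:
            weights.append(1.0 / q)
    probs = np.array(weights) / sum(weights)
    return rng.choice(nbrs, p=probs)
```

Large p discourages backtracking and large q keeps the walk local (BFS-like); small q pushes it outward (DFS-like).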
Auto-Encoding Variational Bayes
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, works even in the intractable case.
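Two ingredients of this algorithm recur throughout the VGAE line of work: the reparameterization trick, which moves sampling out of the gradient path, and the closed-form KL divergence between a diagonal Gaussian posterior and a standard normal prior. A short NumPy sketch:

```python
import numpy as np

def reparameterize(mu, log_sigma, rng):
    # z = mu + sigma * eps with eps ~ N(0, I): the sample becomes a
    # deterministic function of (mu, sigma), so the ELBO can be
    # optimized with ordinary backpropagation.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(log_sigma) * eps

def gaussian_kl(mu, log_sigma):
    # KL( N(mu, sigma^2 I) || N(0, I) ) in closed form:
    # -1/2 * sum(1 + log sigma^2 - mu^2 - sigma^2)
    return -0.5 * np.sum(1.0 + 2.0 * log_sigma - mu**2 - np.exp(2.0 * log_sigma))
```

In the VGAE objective this KL term regularizes the node embeddings toward the prior while the reconstruction term scores the predicted edges.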
DeepWalk: online learning of social representations
DeepWalk is an online learning algorithm that builds useful incremental results and is trivially parallelizable, making it suitable for a broad class of real-world applications such as network classification and anomaly detection.
LINE: Large-scale Information Network Embedding
A novel network embedding method called LINE, suitable for arbitrary types of information networks (undirected, directed, and/or weighted), which optimizes a carefully designed objective function that preserves both local and global network structure.
Stochastic Backpropagation and Approximate Inference in Deep Generative Models
We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and …
Adam: A Method for Stochastic Optimization
This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate comparable to the best known results in the online convex optimization framework.
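The "adaptive estimates of lower-order moments" are exponential moving averages of the gradient and its elementwise square, with a bias correction for the zero initialization. One update step, sketched in NumPy:

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # Moving averages of the gradient (m) and squared gradient (v).
    m = b1 * m + (1.0 - b1) * grad
    v = b2 * v + (1.0 - b2) * grad**2
    # Bias correction: counteracts the zero initialization at step t.
    m_hat = m / (1.0 - b1**t)
    v_hat = v / (1.0 - b2**t)
    # Per-parameter step, scaled by the noise estimate.
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

The defaults shown (lr=0.001, b1=0.9, b2=0.999, eps=1e-8) are those recommended in the paper; the VGAE experiments also train with Adam.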
Collective Classification in Network Data
This article introduces four of the most widely used inference algorithms for classifying networked data and empirically compares them on both synthetic and real-world data.
Understanding the difficulty of training deep feedforward neural networks
The objective is to better understand why standard gradient descent from random initialization performs so poorly with deep neural networks, to explain recent relative successes, and to help design better algorithms in the future.
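A practical outcome of this analysis is the normalized ("Glorot" or "Xavier") initialization, which scales weights by the layer's fan-in and fan-out to keep activation and gradient variances roughly constant across layers. The uniform variant can be sketched as:

```python
import numpy as np

def glorot_init(fan_in, fan_out, rng):
    # Normalized initialization: W ~ U(-limit, limit) with
    # limit = sqrt(6 / (fan_in + fan_out)).
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))
```

GCN-style models, including VGAE implementations, commonly initialize their weight matrices this way.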
Leveraging social media networks for classification
The proposed framework, SocioDim, first extracts social dimensions based on the network structure to capture prominent interaction patterns between actors, then learns a discriminative classifier to select relevant social dimensions.