Graphgen-redux: a Fast and Lightweight Recurrent Model for labeled Graph Generation

@inproceedings{Podda2021GraphgenreduxAF,
  title={Graphgen-redux: a Fast and Lightweight Recurrent Model for labeled Graph Generation},
  author={Marco Podda and Davide Bacciu},
  booktitle={2021 International Joint Conference on Neural Networks (IJCNN)},
  year={2021},
  pages={1-8}
}
  • Published 18 July 2021
The problem of labeled graph generation is gaining attention in the Deep Learning community. The task is challenging due to the sparse and discrete nature of graph spaces. Several approaches have been proposed in the literature, most of which require transforming the graphs into sequences that encode their structure and labels, and learning the distribution of such sequences through an auto-regressive generative model. Among this family of approaches, we focus on the Graphgen model. The…
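The sequence-based recipe the abstract sketches (labeled graph → sequence encoding structure and labels → auto-regressive model over sequences) can be illustrated with a toy encoding. The snippet below is a hedged sketch: `graph_to_sequence` and `fit_autoregressive` are hypothetical names, the DFS-ordered edge encoding only loosely mirrors GraphGen's minimum DFS codes, and a simple bigram counter stands in for the recurrent model used in the actual papers.

```python
# Sketch of the graph-to-sequence pipeline described in the abstract.
# Simplified illustration: NOT GraphGen's exact minimum-DFS-code algorithm,
# and a bigram counter stands in for the recurrent model.
from collections import defaultdict

def graph_to_sequence(node_labels, edge_labels, root=0):
    """Encode a connected, undirected labeled graph as an edge sequence.

    node_labels: {node_id: label}
    edge_labels: {(u, v): label}  (one entry per undirected edge)
    Returns tokens (t_u, t_v, L_u, L_e, L_v), where t_* are DFS discovery
    times, loosely mirroring DFS-code-style encodings.
    """
    adj = defaultdict(set)
    remaining = {}
    for (u, v), e in edge_labels.items():
        adj[u].add(v)
        adj[v].add(u)
        remaining[frozenset((u, v))] = e

    time = {root: 0}
    seq = []

    def dfs(u):
        for v in sorted(adj[u]):
            key = frozenset((u, v))
            if key not in remaining:
                continue  # this edge was already emitted
            e = remaining.pop(key)
            new_node = v not in time
            if new_node:
                time[v] = len(time)  # DFS discovery time of v
            seq.append((time[u], time[v], node_labels[u], e, node_labels[v]))
            if new_node:
                dfs(v)

    dfs(root)
    return seq

def fit_autoregressive(sequences):
    """Fit bigram token counts: a toy stand-in for an RNN that models
    p(token_i | token_{i-1}) over the encoded graph sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for s in sequences:
        prev = "<start>"
        for tok in list(s) + ["<end>"]:
            counts[prev][tok] += 1
            prev = tok
    return counts
```

For example, a labeled triangle (two carbons and an oxygen) encodes as three edge tokens whose discovery times reflect the DFS traversal order; feeding many such sequences to `fit_autoregressive` would then estimate the conditional token distribution that a real model learns with an RNN.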


References

Showing 1–10 of 37 references
Edge-based sequential graph generation with recurrent neural networks
  This work proposes to cast the generative process of a graph into a sequential one, relying on a node ordering procedure, and designs a novel generative model composed of two recurrent neural networks that learn to predict the edges of graphs.
GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models
  The experiments show that GraphRNN significantly outperforms all baselines, learning to generate diverse graphs that match the structural characteristics of a target set, while also scaling to graphs 50 times larger than previous deep models.
Learning Deep Generative Models of Graphs
  This work is the first and most general approach for learning generative models over arbitrary graphs, and opens new directions for moving away from the restrictions of vector- and sequence-like knowledge representations, toward more expressive and flexible relational data structures.
GRAM: Scalable Generative Models for Graphs with Graph Attention Mechanism
  This paper proposes GRAM, a generative model for graphs that is scalable in all three contexts, especially in training, and achieves scalability by employing a novel graph attention mechanism and formulating the likelihood of graphs in a simple and general manner.
A Systematic Survey on Deep Generative Models for Graph Generation
  Provides an extensive overview of the literature on deep generative models for graph generation and proposes two taxonomies of deep generative models, for unconditional and conditional graph generation respectively.
GraphGen: A Scalable Approach to Domain-agnostic Labeled Graph Generation
  Extensive experiments on million-sized, real graph datasets show GraphGen to be 4 times faster on average than state-of-the-art techniques while being significantly better in quality across a comprehensive set of 11 different metrics.
GraphGAN: Graph Representation Learning with Generative Adversarial Nets
  Proposes GraphGAN, an innovative graph representation learning framework unifying the above two classes of methods, in which the generative model and discriminative model play a game-theoretical minimax game.
Graphite: Iterative Generative Modeling of Graphs
  This work proposes Graphite, an algorithmic framework for unsupervised learning of representations over nodes in large graphs using deep latent variable generative models; it parameterizes variational autoencoders (VAE) with graph neural networks and uses a novel iterative graph refinement strategy inspired by low-rank approximations for decoding.
Graph generation by sequential edge prediction
  Proposes a recurrent deep-learning-based model that generates graphs by learning to predict their ordered edge sequence, outperforming canonical graph generative models from graph theory and reaching performance comparable to the current state of the art on graph generation.
Learning Graphical State Transitions
  Introduces the Gated Graph Transformer Neural Network (GGTNN), an extension of GGS-NNs that uses graph-structured data as an intermediate representation, which can learn to construct and modify graphs in sophisticated ways based on textual input and to use the graphs to produce a variety of outputs.