Deep Graph Generators: A Survey

@article{Faez2021DeepGG,
  title={Deep Graph Generators: A Survey},
  author={Faezeh Faez and Yassaman Ommi and Mahdieh Soleymani Baghshah and Hamid R. Rabiee},
  journal={IEEE Access},
  year={2021},
  volume={9},
  pages={106675--106702}
}
Deep generative models have achieved great success in areas such as image, speech, and natural language processing in the past few years. Thanks to the advances in graph-based deep learning, and in particular graph representation learning, deep graph generation methods have recently emerged with new applications ranging from discovering novel molecular structures to modeling social networks. This paper conducts a comprehensive survey on deep learning-based graph generation approaches and… 
CCGG: A Deep Autoregressive Model for Class-Conditional Graph Generation
TLDR
This paper addresses the problem of class-conditional graph generation, which uses class labels as generation constraints, by introducing the Class-Conditioned Graph Generator (CCGG), a model that outperforms existing conditional graph generation methods on various datasets.
DIG: A Turnkey Library for Diving into Graph Deep Learning Research
TLDR
DIG: Dive into Graphs is a research-oriented library that integrates unified and extensible implementations of common graph deep learning algorithms for several advanced tasks, providing unified data interfaces, algorithm implementations, and evaluation metrics.
GraphTune: A Learning-based Graph Generative Model with Tunable Structural Features
TLDR
The proposed model, GraphTune, enables tuning the value of any structural feature of generated graphs using a Long Short-Term Memory (LSTM) network and a Conditional Variational AutoEncoder (CVAE); evaluations show that GraphTune tunes the value of a global-level structural feature more precisely than conventional models.
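The conditioning mechanism behind CVAE-style tunable generation can be sketched as follows. This is a minimal illustration with hypothetical names and stand-in weights, not GraphTune's actual code: the target value of a structural feature (e.g. a desired clustering coefficient) is concatenated to the latent sample so the decoder can steer generation toward graphs exhibiting that value.

```python
import numpy as np

# Hypothetical sketch of CVAE-style conditioning (names and weights are
# illustrative stand-ins, not GraphTune's implementation).
rng = np.random.default_rng(1)
LATENT, COND, HIDDEN = 16, 1, 32

W = rng.normal(size=(LATENT + COND, HIDDEN))  # stand-in decoder weights

def decoder_input(z, target_feature):
    """Concatenate the latent sample z with the tunable feature value."""
    cond = np.atleast_1d(np.float64(target_feature))
    return np.concatenate([z, cond])

def decode(z, target_feature):
    """First hidden layer of a conditioned graph decoder."""
    x = decoder_input(z, target_feature)
    return np.tanh(x @ W)

h = decode(rng.normal(size=LATENT), target_feature=0.4)
```

Because the condition is part of every decoder input, the same latent code decodes to different graphs as the target feature value is varied.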
Gransformer: Transformer-based Graph Generation
TLDR
Gransformer, an algorithm for generating graphs based on the Transformer, is proposed; experimental results show that the proposed method performs comparably to existing methods, including recurrent models and graph convolutional networks.
Controllable Data Generation by Deep Learning: A Review
TLDR
This article provides a systematic review of this promising research area, commonly known as controllable deep data generation: the problem is formally defined, a taxonomy of various techniques is proposed, and the evaluation metrics in this specific domain are summarized.
Molecule Generation for Drug Design: a Graph Learning Perspective
TLDR
This survey provides an overview of the state-of-the-art molecule design and discovery aiding methods whose methodology involves (deep) graph learning, and proposes to categorize these methods into three groups: all at once, fragment-based and node-by-node.
G2GT: Retrosynthesis Prediction with Graph to Graph Attention Neural Network and Self-Training
TLDR
A new graph-to-graph transformation model, G2GT, is proposed, in which the graph encoder and graph decoder are built upon the standard transformer structure, and it is shown that self-training, a powerful data augmentation method that utilizes unlabeled molecule data, can significantly improve the model’s performance.
Molecular design in drug discovery: a comprehensive review of deep generative models.
TLDR
In this study, deep generative models are reviewed to highlight recent advances in de novo molecular design for drug discovery; the models are divided into two categories based on their in silico molecular representations.
Deconvolutional Networks on Graph Data
TLDR
This paper proposes the Graph Deconvolutional Network (GDN) and motivates its design via a combination of inverse filters in the spectral domain and de-noising layers in the wavelet domain, since the inverse operation acts as a high-frequency amplifier and may amplify noise.
Learn Locally, Correct Globally: A Distributed Algorithm for Training Graph Neural Networks
TLDR
A communication-efficient distributed GNN training technique named Learn Locally, Correct Globally (LLCG) is proposed, which significantly improves efficiency without hurting performance; the convergence of distributed methods with periodic model averaging for training GNNs is rigorously analyzed.

References

Deep Learning on Graphs: A Survey
TLDR
This survey comprehensively reviews the different types of deep learning methods on graphs, dividing existing methods into five categories based on their model architectures and training strategies: graph recurrent neural networks, graph convolutional networks, graph autoencoders, graph reinforcement learning, and graph adversarial methods.
A Comprehensive Survey on Graph Neural Networks
TLDR
This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides the state-of-the-art GNNs into four categories: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.
GraphVAE: Towards Generation of Small Graphs Using Variational Autoencoders
TLDR
This work proposes to sidestep the hurdles associated with linearization of discrete structures by having a decoder, formulated as a variational autoencoder, output a probabilistic fully-connected graph of a predefined maximum size directly in one step.
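The one-shot decoding idea can be illustrated with a small sketch. This is an illustrative stand-in (random weights, hypothetical names like `decode_edge_probs`), not the GraphVAE implementation: a latent vector is mapped to edge probabilities for all node pairs of a fixed maximum size at once, and a discrete adjacency matrix is then sampled from that probabilistic graph.

```python
import numpy as np

# Illustrative GraphVAE-style one-shot decoder (stand-in weights, not the
# authors' code): latent vector -> probabilities for ALL N*N edges at once.
rng = np.random.default_rng(0)
N, LATENT = 6, 8                      # max graph size, latent dimension
W = rng.normal(size=(LATENT, N * N))  # stand-in for a learned decoder layer

def decode_edge_probs(z):
    """Map latent z to a symmetric N x N matrix of edge probabilities."""
    logits = (z @ W).reshape(N, N)
    logits = (logits + logits.T) / 2          # enforce symmetry
    probs = 1.0 / (1.0 + np.exp(-logits))     # sigmoid
    np.fill_diagonal(probs, 0.0)              # no self-loops
    return probs

def sample_graph(z):
    """Sample a discrete adjacency matrix from the probabilistic graph."""
    probs = decode_edge_probs(z)
    upper = np.triu(rng.random((N, N)) < probs, k=1)
    return (upper | upper.T).astype(int)      # symmetric 0/1 adjacency

adj = sample_graph(rng.normal(size=LATENT))
```

Decoding all pairs in one step avoids choosing a node ordering, which is the linearization hurdle the entry refers to; the cost is the fixed quadratic output size, which is why the approach targets small graphs.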
Adversarial Attacks on Graph Neural Networks via Meta Learning
TLDR
The core principle is to use meta-gradients to solve the bilevel problem underlying training-time attacks on graph neural networks for node classification that perturb the discrete graph structure, essentially treating the graph as a hyperparameter to optimize.
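A greatly simplified sketch of the structure-perturbation idea follows. All names are hypothetical and a greedy finite-difference search stands in for true meta-gradients: the graph is treated as the quantity to optimize, and the attacker picks the single edge flip that most increases a surrogate training loss.

```python
import numpy as np

# Toy sketch of a training-time structure attack (hypothetical, heavily
# simplified): finite differences over edge flips stand in for meta-gradients.
rng = np.random.default_rng(2)
N = 5
upper = np.triu((rng.random((N, N)) < 0.4).astype(float), 1)
adj = upper + upper.T                       # symmetric 0/1 adjacency
x = rng.normal(size=(N, 3))                 # node features

def surrogate_loss(a):
    """Stand-in for the victim model's loss after one propagation step."""
    h = a @ x                               # linear feature propagation
    return float(np.sum(h ** 2))

def best_flip(a):
    """Pick the edge flip with the largest loss increase (greedy search)."""
    base, best, gain = surrogate_loss(a), None, -np.inf
    for i in range(N):
        for j in range(i + 1, N):
            b = a.copy()
            b[i, j] = b[j, i] = 1.0 - b[i, j]   # flip edge (i, j)
            d = surrogate_loss(b) - base
            if d > gain:
                gain, best = d, (i, j)
    return best, gain

edge, gain = best_flip(adj)
```

The actual method differentiates through the victim's training procedure instead of enumerating flips, but the objective is the same: a discrete structural change chosen to degrade learned performance.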
Conditional Structure Generation through Graph Variational Generative Adversarial Nets
TLDR
This work formulates the novel problem of conditional structure generation and proposes a novel unified model of graph variational generative adversarial nets (CondGen) to handle the intrinsic challenges of flexible context-structure conditioning and permutation-invariant generation.
NeVAE: A Deep Generative Model for Molecular Graphs
TLDR
A novel variational autoencoder for molecular graphs is proposed, whose encoder and decoder are specially designed to account for the above properties by means of several technical innovations.
Adversarial Attack and Defense on Graph Data: A Survey
TLDR
This work systematically organizes the considered works based on the features of each topic and provides a unified formulation for adversarial learning on graph data that covers most adversarial learning studies on graphs.
Adversarial Attacks on Neural Networks for Graph Data
TLDR
This work introduces the first study of adversarial attacks on attributed graphs, specifically focusing on models exploiting ideas of graph convolutions, and generates adversarial perturbations targeting the node features and the graph structure, taking the dependencies between instances into account.
Generating Triples with Adversarial Networks for Scene Graph Construction
TLDR
A method, based on recent advancements in Generative Adversarial Networks, to overcome deficiencies in scene graph generation by first generating small subgraphs, each describing a single statement about a scene from a specific region of the input image chosen using an attention mechanism.