# Permutation Invariant Graph Generation via Score-Based Generative Modeling

```bibtex
@inproceedings{Niu2020PermutationIG,
  title     = {Permutation Invariant Graph Generation via Score-Based Generative Modeling},
  author    = {Chenhao Niu and Yang Song and Jiaming Song and Shengjia Zhao and Aditya Grover and Stefano Ermon},
  booktitle = {AISTATS},
  year      = {2020}
}
```

Learning generative models for graph-structured data is challenging because graphs are discrete and combinatorial, and the underlying data distribution is invariant to the ordering of nodes. However, most existing generative models for graphs are not invariant to the chosen ordering, which can introduce an undesirable bias in the learned distribution. To address this difficulty, we propose a permutation-invariant approach to modeling graphs, using the recent framework of score-based…
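The key property behind the paper's approach is that a permutation-equivariant score network induces a permutation-invariant distribution. The toy sketch below uses a hypothetical stand-in for the paper's multi-channel GNN (two rounds of message passing with an elementwise nonlinearity) and checks the equivariance identity `score(P A Pᵀ) = P score(A) Pᵀ` numerically; the function `score_net` is an illustrative assumption, not the authors' architecture.

```python
import numpy as np

def score_net(A):
    # Toy permutation-equivariant "score network": message passing via
    # A @ A followed by an elementwise nonlinearity. Any composition of
    # such operations stays equivariant under node relabeling.
    return np.tanh(A @ A)

rng = np.random.default_rng(0)
n = 5
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T                               # symmetric adjacency matrix

P = np.eye(n)[rng.permutation(n)]         # random permutation matrix

lhs = score_net(P @ A @ P.T)              # score of the relabeled graph
rhs = P @ score_net(A) @ P.T              # relabeled score of the graph
print(np.allclose(lhs, rhs))              # equivariance holds
```

Because the noise distribution used during score matching is also exchangeable over node orderings, sampling with an equivariant score yields the same distribution regardless of how nodes are labeled.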


## 24 Citations

Permutation-Invariant Variational Autoencoder for Graph-Level Representation Learning

- Computer Science, ArXiv
- 2021

This work proposes a permutation-invariant variational autoencoder for graph structured data that indirectly learns to match the node order of input and output graph, without imposing a particular node order or performing expensive graph matching.

A Systematic Survey on Deep Generative Models for Graph Generation

- Computer Science, Mathematics, ArXiv
- 2020

An extensive overview of the literature on deep generative models for graph generation is provided, and two taxonomies of deep generative models are proposed, for unconditional and conditional graph generation respectively.

Permutation Equivariant Generative Adversarial Networks for Graphs

- Computer Science, ArXiv
- 2021

3G-GAN, a three-stage model relying on GANs and equivariant functions, is proposed to ensure ordering invariance in graph generative modeling.

Adversarial Stein Training for Graph Energy Models

- Computer Science, ArXiv
- 2021

This work uses an energy-based model built on multi-channel graph neural networks to learn permutation-invariant unnormalized density functions on graphs by minimizing an adversarial Stein discrepancy, and finds that this approach achieves competitive results on graph generation compared to benchmark models.

Evaluation Metrics for Graph Generative Models: Problems, Pitfalls, and Practical Solutions

- Computer Science, Mathematics, ArXiv
- 2021

The desirable criteria for comparison metrics are enumerated, the development of such metrics is discussed, and a comparison of their respective expressive power is provided.

Partition and Code: learning how to compress graphs

- Computer Science, Mathematics, ArXiv
- 2021

This work aims to establish the necessary principles a lossless graph compression method should follow to approach the entropy storage lower bound, and formulates the compressor as a probabilistic model that can be learned from data and generalize to unseen instances.

Structured Denoising Diffusion Models in Discrete State-Spaces

- Computer Science, ArXiv
- 2021

D3PMs are diffusion-like generative models for discrete data that generalize the multinomial diffusion model of Hoogeboom et al. by going beyond corruption processes with uniform transition probabilities; the choice of transition matrix is shown to be an important design decision that leads to improved results in image and text domains.
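To make the "transition matrix as a design choice" point concrete, the sketch below contrasts a uniform corruption kernel (as in multinomial diffusion) with an absorbing-state kernel, one of the alternatives D3PMs explore. The state count `K` and noise rate `beta` are illustrative values, not figures from the paper.

```python
import numpy as np

K = 4        # number of discrete states (illustrative)
beta = 0.1   # per-step corruption rate (illustrative)

# Uniform-transition kernel: with prob. beta, resample uniformly over all states.
Q_uniform = (1 - beta) * np.eye(K) + beta * np.ones((K, K)) / K

# Absorbing-state kernel: with prob. beta, jump to a designated [MASK]
# state (here state K-1), which is never left.
Q_absorb = (1 - beta) * np.eye(K)
Q_absorb[:, K - 1] += beta

# One corruption step: a one-hot state times Q gives the next-step
# categorical distribution q(x_t | x_{t-1}).
x = np.eye(K)[0]
print(x @ Q_uniform)   # spreads mass over every state
print(x @ Q_absorb)    # leaks mass only into the absorbing state
```

Both kernels are row-stochastic, but they induce very different stationary distributions, which is exactly the degree of freedom the paper studies.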

Score-based Generative Modeling in Latent Space

- Mathematics, Computer Science, ArXiv
- 2021

The Latent Score-based Generative Model (LSGM) is proposed, a novel approach that trains SGMs in a latent space, relying on the variational autoencoder framework, and achieves state-of-the-art likelihood on the binarized OMNIGLOT dataset.

TD-GEN: Graph Generation With Tree Decomposition

- Computer Science, Mathematics, ArXiv
- 2021

TD-GEN, a graph generation framework based on tree decomposition, is proposed, and a reduced upper bound on the maximum number of decisions needed for graph generation is introduced to compare the performance of models based on likelihood.

Graph Generation with Energy-Based Models

- 2020

We present a set of novel energy-based models built on top of graph neural networks (GNN-EBMs) to estimate the unnormalized density of a distribution of graphs. GNN-EBMs can generate graphs…

## References

Showing 1–10 of 45 references

Learning Deep Generative Models of Graphs

- Computer Science, Mathematics, ICLR
- 2018

This work is the first and most general approach for learning generative models over arbitrary graphs, and opens new directions for moving away from restrictions of vector- and sequence-like knowledge representations, toward more expressive and flexible relational data structures.

GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models

- Computer Science, ICML
- 2018

The experiments show that GraphRNN significantly outperforms all baselines, learning to generate diverse graphs that match the structural characteristics of a target set, while also scaling to graphs 50 times larger than previous deep models.

Graphite: Iterative Generative Modeling of Graphs

- Computer Science, Mathematics, ICML
- 2019

This work proposes Graphite, an algorithmic framework for unsupervised learning of node representations in large graphs using deep latent-variable generative models; it parameterizes variational autoencoders (VAEs) with graph neural networks and uses a novel iterative graph refinement strategy inspired by low-rank approximations for decoding.

Hierarchical Graph Representation Learning with Differentiable Pooling

- Computer Science, Mathematics, NeurIPS
- 2018

DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.

Representation Learning on Graphs with Jumping Knowledge Networks

- Computer Science, Mathematics, ICML
- 2018

This work explores an architecture -- jumping knowledge (JK) networks -- that flexibly leverages, for each node, different neighborhood ranges to enable better structure-aware representation in graphs.

Universal Invariant and Equivariant Graph Neural Networks

- Computer Science, Mathematics, NeurIPS
- 2019

The results show that a GNN defined by a single set of parameters can approximate uniformly well a function defined on graphs of varying size.

How Powerful are Graph Neural Networks?

- Computer Science, Mathematics, ICLR
- 2019

This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.

Graph Normalizing Flows

- Computer Science, Mathematics, NeurIPS
- 2019

This work introduces graph normalizing flows: a new, reversible graph neural network model for prediction and generation, which is permutation-invariant, generating entire graphs with a single feed-forward pass, and achieves competitive results with the state of the art auto-regressive models, while being better suited to parallel computing architectures.

Kronecker Graphs: An Approach to Modeling Networks

- Mathematics, Computer Science, J. Mach. Learn. Res.
- 2010

It is rigorously proved that Kronecker graphs naturally obey common network properties, and KRONFIT, a fast and scalable algorithm for fitting the Kronecker graph generation model to large real networks, is presented.
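The Kronecker graph model's generative step is simple enough to sketch: repeatedly Kronecker-power a small initiator matrix of edge probabilities, then sample each edge independently. The 2×2 initiator values below are illustrative placeholders; in the paper they would be fitted by KRONFIT.

```python
import numpy as np

# Hypothetical 2x2 initiator matrix of edge probabilities
# (KRONFIT would estimate these entries from a real network).
theta = np.array([[0.9, 0.5],
                  [0.5, 0.3]])

def kronecker_probs(theta, k):
    # k-th Kronecker power: edge-probability matrix for a 2^k-node graph.
    P = theta
    for _ in range(k - 1):
        P = np.kron(P, theta)
    return P

P = kronecker_probs(theta, 3)             # 8x8 edge-probability matrix
rng = np.random.default_rng(0)
A = (rng.random(P.shape) < P).astype(int) # sample a stochastic Kronecker graph
print(P.shape, A.sum())
```

The self-similar structure of `np.kron` is what makes generated graphs exhibit heavy-tailed degree distributions and densification, the properties the paper proves.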

Inductive Representation Learning on Large Graphs

- Computer Science, Mathematics, NIPS
- 2017

GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.