Corpus ID: 240354586

On the Power of Edge Independent Graph Models

@article{Chanpuriya2021OnTP,
  title={On the Power of Edge Independent Graph Models},
  author={Sudhanshu Chanpuriya and Cameron Musco and Konstantinos Sotiropoulos and Charalampos E. Tsourakakis},
  journal={ArXiv},
  year={2021},
  volume={abs/2111.00048}
}
Why do many modern neural-network-based graph generative models fail to reproduce typical real-world network characteristics, such as high triangle density? In this work we study the limitations of edge independent random graph models, in which each edge is added to the graph independently with some probability. Such models include both the classic Erdős–Rényi and stochastic block models, as well as modern generative models such as NetGAN, variational graph autoencoders, and CELL. We prove… 
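The abstract's definition can be illustrated concretely: an edge independent model is fully specified by a symmetric matrix P of edge probabilities, and both Erdős–Rényi and the stochastic block model are special cases. A minimal sketch (function names and parameter values below are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_edge_independent(P, rng):
    """Sample a simple undirected graph: include each edge (i, j), i < j,
    independently with probability P[i, j]."""
    n = P.shape[0]
    upper = np.triu(rng.random((n, n)) < P, k=1)  # one independent coin per pair
    return (upper | upper.T).astype(int)          # symmetrize; no self-loops

n = 200
# Erdos-Renyi: every edge has the same probability.
P_er = np.full((n, n), 0.05)
# Two-block stochastic block model: dense within blocks, sparse across.
P_sbm = np.full((n, n), 0.01)
P_sbm[:n // 2, :n // 2] = 0.1
P_sbm[n // 2:, n // 2:] = 0.1

A = sample_edge_independent(P_sbm, rng)
triangles = np.trace(A @ A @ A) // 6  # each triangle is counted 6 times in the trace
print(A.sum() // 2, "edges,", triangles, "triangles")
```

Counting triangles via the trace of A³ is exactly the statistic at issue: the paper's question is how many triangles such a model can produce in expectation at a given sparsity level.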


Spectral embedding and the latent geometry of multipartite networks

This article develops bespoke statistical methodology for the case where the graph is multipartite, recovering node representations in their intrinsic rather than ambient dimension and proving uniform consistency under a low-rank, inhomogeneous random graph model.

Implications of sparsity and high triangle density for graph representation learning

This work shows that sparse graphs containing many triangles can be reproduced using an infinite-dimensional inner product model, where the node representations lie on a low-dimensional manifold, providing evidence against the common perception that triangles imply community structure.

References

Showing 1–10 of 45 references

GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models

The experiments show that GraphRNN significantly outperforms all baselines, learning to generate diverse graphs that match the structural characteristics of a target set, while also scaling to graphs 50 times larger than previous deep models.

Learning Deep Generative Models of Graphs

This work is the first and most general approach for learning generative models over arbitrary graphs, and opens new directions for moving away from restrictions of vector- and sequence-like knowledge representations, toward more expressive and flexible relational data structures.

NetGAN without GAN: From Random Walks to Low-Rank Approximations

This paper investigates the implicit bias of NetGAN and finds that the root of its generalization properties does not lie in the GAN architecture, but in an inconspicuous low-rank approximation of the logits of the random walk transition matrix.

NetGAN: Generating Graphs via Random Walks

The proposed model is based on a stochastic neural network that generates discrete output samples and is trained using the Wasserstein GAN objective, and is able to produce graphs that exhibit the well-known network patterns without explicitly specifying them in the model definition.

Efficient Graph Generation with Graph Recurrent Attention Networks

A new family of efficient and expressive deep generative models of graphs, called Graph Recurrent Attention Networks (GRANs), is proposed; GRANs better capture the auto-regressive conditioning between the already-generated and to-be-generated parts of the graph using Graph Neural Networks (GNNs) with attention.

Node Embeddings and Exact Low-Rank Representations of Complex Networks

This work proves that a minor relaxation of the low-rank embedding model can generate sparse graphs with high triangle density, and gives a simple algorithm based on logistic principal component analysis (LPCA) that succeeds in finding such exact embeddings.
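The LPCA idea summarized above amounts to fitting the adjacency matrix as A ≈ σ(XYᵀ) with rank-k logit factors. A minimal sketch using plain gradient descent on the logistic loss (the hyperparameters and toy graph are illustrative, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

def logistic_loss(A, Z):
    # elementwise logistic loss log(1 + e^z) - a*z, summed over all pairs
    return np.sum(np.logaddexp(0.0, Z) - A * Z)

def fit_lpca(A, k, steps=1000, lr=0.05):
    """Fit A ~ sigmoid(X @ Y.T) by gradient descent on the logistic loss."""
    n = A.shape[0]
    X = 0.1 * rng.standard_normal((n, k))
    Y = 0.1 * rng.standard_normal((n, k))
    for _ in range(steps):
        S = 1.0 / (1.0 + np.exp(-(X @ Y.T)))  # predicted edge probabilities
        G = S - A                             # gradient of the loss w.r.t. the logits
        X, Y = X - lr * (G @ Y), Y - lr * (G.T @ X)
    return X, Y

# toy graph: two triangles joined by one edge
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

X, Y = fit_lpca(A, k=3)
before = logistic_loss(A, np.zeros_like(A))  # loss of the all-1/2 baseline
after = logistic_loss(A, X @ Y.T)
```

Driving the loss toward zero pushes σ(XYᵀ) toward 0/1 entries, which is what makes an "exact" low-rank representation of the adjacency matrix possible in this framework.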

The impossibility of low-rank representations for triangle-rich complex networks

It is mathematically proved that low-dimensional embeddings cannot generate graphs with both low average degree and large clustering coefficients, and any embedding that can successfully create these two properties must have a rank that is nearly linear in the number of vertices.

Graphite: Iterative Generative Modeling of Graphs

This work proposes Graphite, an algorithmic framework for unsupervised learning of representations over nodes in large graphs using deep latent variable generative models, parameterizes variational autoencoders (VAE) with graph neural networks, and uses a novel iterative graph refinement strategy inspired by low-rank approximations for decoding.

Kronecker Graphs: An Approach to Modeling Networks

It is rigorously proved that Kronecker graphs naturally obey common network properties, and KRONFIT, a fast and scalable algorithm for fitting the Kronecker graph generation model to large real networks, is presented.

MolGAN: An implicit generative model for small molecular graphs

MolGAN is introduced, an implicit, likelihood-free generative model for small molecular graphs that circumvents the need for expensive graph matching procedures or node ordering heuristics of previous likelihood-based methods.