Corpus ID: 220683995

NetGAN without GAN: From Random Walks to Low-Rank Approximations

@inproceedings{Rendsburg2020NetGANWG,
  title={NetGAN without GAN: From Random Walks to Low-Rank Approximations},
  author={Luca Rendsburg and Holger Heidrich and Ulrike von Luxburg},
  booktitle={International Conference on Machine Learning},
  year={2020}
}
A graph generative model takes a graph as input and is supposed to generate new graphs that “look like” the input graph. While most classical models focus on a few hand-selected graph statistics and are too simplistic to reproduce real-world graphs, NetGAN recently emerged as an attractive alternative: by training a GAN to learn the random walk distribution of the input graph, the algorithm is able to reproduce a large number of important network patterns simultaneously, without explicitly…
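The pipeline the title alludes to — replacing the GAN with a direct low-rank approximation of random-walk transition counts — can be sketched as follows. This is an illustrative sketch only, not the paper's exact formulation: the function names, the symmetrized counting, and the rank-2 truncation are assumptions for the toy example.

```python
import numpy as np

def sample_walks(A, num_walks=200, walk_len=16, rng=None):
    """Sample simple random walks from a 0/1 adjacency matrix A."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    walks = []
    for _ in range(num_walks):
        v = rng.integers(n)
        walk = [v]
        for _ in range(walk_len - 1):
            nbrs = np.flatnonzero(A[v])
            if nbrs.size == 0:
                break
            v = rng.choice(nbrs)
            walk.append(v)
        walks.append(walk)
    return walks

def transition_counts(walks, n):
    """Symmetrized edge-visit counts accumulated over all walks."""
    C = np.zeros((n, n))
    for w in walks:
        for u, v in zip(w[:-1], w[1:]):
            C[u, v] += 1
            C[v, u] += 1
    return C

def low_rank_scores(C, rank=2):
    """Rank-k approximation of the normalized count matrix via SVD;
    the clipped result serves as a matrix of edge scores."""
    U, s, Vt = np.linalg.svd(C / C.sum(), hermitian=True)
    S = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return np.clip(S, 0, None)

# toy input graph: two triangles joined by one bridge edge
A = np.zeros((6, 6), dtype=int)
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1

S = low_rank_scores(transition_counts(sample_walks(A, rng=0), 6))
# high-scoring entries of S concentrate on the actual edges of A
```

A new graph can then be generated by sampling edges with probabilities proportional to the entries of S, which is what makes the comparison to edge-independent models below relevant.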

On the Power of Edge Independent Graph Models

It is proved that subject to a bounded overlap condition, which ensures that the model does not simply memorize a single graph, edge independent models are inherently limited in their ability to generate graphs with high triangle and other subgraph densities.
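"Edge independent" here means each edge is included by an independent Bernoulli draw from a probability matrix P, so expected subgraph counts factorize over edges. A minimal sketch (the function names and the uniform-P example are assumptions, not from the paper):

```python
import itertools
import numpy as np

def sample_edge_independent(P, rng=None):
    """Sample an undirected graph where edge {u,v} appears
    independently with probability P[u, v]."""
    rng = np.random.default_rng(rng)
    n = P.shape[0]
    A = np.zeros((n, n), dtype=int)
    for u, v in itertools.combinations(range(n), 2):
        if rng.random() < P[u, v]:
            A[u, v] = A[v, u] = 1
    return A

def expected_triangles(P):
    """Under edge independence, E[#triangles] is a sum of products
    of the three edge probabilities over all node triples."""
    return sum(P[u, v] * P[v, w] * P[u, w]
               for u, v, w in itertools.combinations(range(P.shape[0]), 3))

# uniform edge probability 0.5 on 4 nodes:
# E[#triangles] = C(4,3) * 0.5**3 = 0.5
P = np.full((4, 4), 0.5)
np.fill_diagonal(P, 0)
```

The bounded-overlap limitation in the summary above is exactly about this factorization: pushing P toward the 0/1 adjacency of a single graph raises triangle density but amounts to memorization.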

Efficient Learning-based Community-Preserving Graph Generation

This paper proposes a novel community-preserving generative adversarial network (CPGAN) for effective and efficient (scalable) graph simulation and demonstrates that CPGAN can achieve a good trade-off between efficiency and graph simulation quality for real-life graph simulation compared with state-of-the-art baselines.

AgraSSt: Approximate Graph Stein Statistics for Interpretable Assessment of Implicit Graph Generators

A novel statistical procedure, coined AgraSSt, which can be used to determine whether a learned graph generating process is capable of generating graphs which resemble a given input graph and gives theoretical guarantees for a broad class of random graph models.

Exact Representation of Sparse Networks with Symmetric Nonnegative Embeddings

A new bound for the LPCA model is proved in terms of arboricity rather than max degree; this greatly increases the bound’s applicability to many sparse real-world networks.

GraphDCA - a Framework for Node Distribution Comparison in Real and Synthetic Graphs

We argue that when comparing two graphs, the distribution of node structural features is more informative than the global graph statistics often used in practice, especially to evaluate graph generative models.

An Interpretable Graph Generative Model with Heterophily

This work proposes the first edge-independent graph generative model that is expressive enough to capture heterophily, produces nonnegative embeddings, which allow link predictions to be interpreted in terms of communities, and optimizes effectively on real-world graphs with gradient descent on a cross-entropy loss.

Continual Graph Learning: A Survey

This survey introduces the basic concepts of CGL and highlights two unique challenges brought by graphs, and reviews and categorizes recent state-of-the-art approaches, analyzing their strategies to tackle the unique challenges in CGL.

W2SAT: Learning to generate SAT instances from Weighted Literal Incidence Graphs

This paper proposes W2SAT, a framework that generates SAT formulas by implicitly learning intrinsic structures and properties from given real-world/industrial instances. It introduces a novel SAT representation, the Weighted Literal Incidence Graph (WLIG), which exhibits strong representation ability and generalizability over existing counterparts and can be generated via a specialized learning-based graph generative model.

On RKHS Choices for Assessing Graph Generators via Kernel Stein Statistics

The power and computational runtime of the test are investigated across graph regimes, including both dense and sparse graphs, and experimental results on kernel choice for model-assessment tasks are presented.

Deep Denoising of Raw Biomedical Knowledge Graph From COVID-19 Literature, LitCovid, and Pubtator: Framework Development and Validation

Preliminary findings showed the proposed framework achieved promising results for removing noise during data preprocessing of the biomedical knowledge graph, potentially improving the performance of downstream applications by providing cleaner data.

References

Showing 1–10 of 24 references

NetGAN: Generating Graphs via Random Walks

The proposed model is based on a stochastic neural network that generates discrete output samples and is trained using the Wasserstein GAN objective, and is able to produce graphs that exhibit the well-known network patterns without explicitly specifying them in the model definition.

Generative Adversarial Nets

We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than G.

Adam: A Method for Stochastic Optimization

This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
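The "adaptive estimates of lower-order moments" are exponentially decayed averages of the gradient and its square, with a bias correction for their zero initialization. A minimal single-parameter sketch (the function name and the toy quadratic objective are illustrative, not from the paper):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: decayed first/second moment estimates of the
    gradient, bias-corrected, then a per-parameter scaled step."""
    m = b1 * m + (1 - b1) * grad           # first moment (mean)
    v = b2 * v + (1 - b2) * grad ** 2      # second moment (uncentered)
    m_hat = m / (1 - b1 ** t)              # bias correction, t >= 1
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# minimize f(x) = x^2 starting from x = 1.0
x, m, v = np.array(1.0), 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.01)
# x ends up close to the minimizer 0
```

Note the effective step size is roughly bounded by lr because m_hat / sqrt(v_hat) is approximately of unit magnitude, which is what makes the method robust to gradient scale.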

Spectral Graph Forge: Graph Generation Targeting Modularity

The Spectral Graph Forge outperforms state-of-the-art techniques in terms of accuracy in targeting the modularity and randomness of the realizations, while also preserving other local structural properties and node attributes.

Random Walks on Graphs: a Survey

Dedicated to the marvelous random walk of Paul Erdős through universities, continents, and mathematics. Various aspects of the theory of random walks on graphs are surveyed.

A Critical Point for Random Graphs with a Given Degree Sequence

It is shown that, for random graphs whose degree distribution is given by nonnegative reals λ0, λ1, … summing to 1, if ∑ i(i − 2)λi > 0 then such graphs almost surely have a giant component, while if ∑ i(i − 2)λi < 0 then almost surely all components in such graphs are small.
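The criterion is a one-line computation over the degree distribution. A minimal sketch (the function name is hypothetical; the two example distributions are standard illustrations, not from the paper):

```python
def molloy_reed_Q(lam):
    """Q = sum_i i*(i-2)*lambda_i for degree distribution {lambda_i}.
    Q > 0 predicts a giant component; Q < 0 predicts only small ones."""
    return sum(i * (i - 2) * l for i, l in enumerate(lam))

# every node has degree 1 (a perfect matching): Q = -1, no giant component
assert molloy_reed_Q([0.0, 1.0]) < 0
# every node has degree 3 (random cubic graph): Q = 3, giant component
assert molloy_reed_Q([0.0, 0.0, 0.0, 1.0]) > 0
```

Degree 2 contributes nothing to Q, which matches the intuition that degree-2 nodes only lengthen paths without branching.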

Wasserstein Generative Adversarial Networks

This work introduces a new algorithm named WGAN, an alternative to traditional GAN training that can improve the stability of learning, get rid of problems like mode collapse, and provide meaningful learning curves useful for debugging and hyperparameter searches.

A global geometric framework for nonlinear dimensionality reduction.

An approach to solving dimensionality reduction problems that uses easily measured local metric information to learn the underlying global geometry of a data set and efficiently computes a globally optimal solution, and is guaranteed to converge asymptotically to the true structure.

Long Short-Term Memory

A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.

Collective Classification in Network Data

This article introduces four of the most widely used inference algorithms for classifying networked data and empirically compare them on both synthetic and real-world data.