Corpus ID: 43924638

Constrained Graph Variational Autoencoders for Molecule Design

@article{Liu2018ConstrainedGV,
  title={Constrained Graph Variational Autoencoders for Molecule Design},
  author={Qi Liu and Miltiadis Allamanis and Marc Brockschmidt and Alexander L. Gaunt},
  journal={ArXiv},
  year={2018},
  volume={abs/1805.09076}
}
Graphs are ubiquitous data structures for representing interactions between entities. […] Our decoder assumes a sequential ordering of graph extension steps, and we discuss and analyze design choices that mitigate the potential downsides of this linearization. Experiments compare our approach with a wide range of baselines on the molecule generation task and show that our method is more successful at matching the statistics of the original dataset on semantically important metrics. Furthermore, we…
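As a rough illustration of the sequential graph-extension decoding described above, the sketch below grows a molecular graph one edge at a time while masking out chemically invalid choices. It is a minimal reconstruction under stated assumptions, not the paper's implementation; MAX_VALENCE, score_edge, and sample are hypothetical stand-ins for the learned components.

```python
# Minimal sketch of sequential, valence-masked graph extension.
# MAX_VALENCE, score_edge, and sample are hypothetical stand-ins for the
# learned parts of the model; this is not the paper's implementation.

MAX_VALENCE = {"C": 4, "N": 3, "O": 2}  # assumed element -> max bond count

def decode(nodes, score_edge, sample):
    """Grow a molecular graph one edge at a time.

    nodes:      element symbols, e.g. ["C", "C", "O"]
    score_edge: (edges_so_far, u, v) -> float, a learned edge score (assumed)
    sample:     (candidates, scores) -> chosen node, or None to stop
    """
    edges = []                          # graph built so far
    degree = [0] * len(nodes)           # bonds used per atom
    focus = [0]                         # queue of atoms to expand, BFS order
    while focus:
        u = focus.pop(0)
        while True:
            # Mask: only edges that respect both endpoints' valences.
            candidates = [v for v in range(len(nodes))
                          if v != u
                          and (u, v) not in edges and (v, u) not in edges
                          and degree[u] < MAX_VALENCE[nodes[u]]
                          and degree[v] < MAX_VALENCE[nodes[v]]]
            if not candidates:
                break
            v = sample(candidates, [score_edge(edges, u, c) for c in candidates])
            if v is None:               # the model emits a "stop" action for u
                break
            edges.append((u, v))
            degree[u] += 1
            degree[v] += 1
            focus.append(v)             # expand the newly connected atom later
    return edges
```

Masking invalid actions at every step, rather than rejecting invalid samples after the fact, is what makes this kind of sequential decoder "constrained".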

Citations

NeVAE: A Deep Generative Model for Molecular Graphs
TLDR
A novel variational autoencoder for molecular graphs is proposed, whose encoder and decoder are specially designed to account for key properties of molecular graphs by means of several technical innovations.
3DMolNet: A Generative Network for Molecular Structures
TLDR
This work proposes a new approach to efficiently generate molecular structures that are not restricted to a fixed size or composition, based on the variational autoencoder which learns a translation-, rotation-, and permutation-invariant low-dimensional representation of molecules.
Learning Multimodal Graph-to-Graph Translation for Molecular Optimization
TLDR
Diverse output distributions are explicitly realized by low-dimensional latent vectors that modulate the translation process; experiments show that the model outperforms previous state-of-the-art baselines.
Physics-Constrained Predictive Molecular Latent Space Discovery with Graph Scattering Variational Autoencoder
TLDR
This work presents a quantitative assessment of the latent space in terms of its predictive ability for organic molecules in the QM9 dataset and considers a Bayesian formalism to account for the limited size of the training dataset.
D-VAE: A Variational Autoencoder for Directed Acyclic Graphs
TLDR
This paper proposes an asynchronous message passing scheme that allows encoding the computations on DAGs, rather than using existing simultaneous message passing schemes to encode local graph structures, and proposes a novel DAG variational autoencoder (D-VAE).
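A minimal sketch of what asynchronous message passing over a DAG can look like: nodes are updated in topological order, each only after all of its predecessors, so the encoding follows the computation the DAG represents. The function names below are illustrative and not taken from the D-VAE code; update and aggregate stand in for learned components (e.g. a GRU cell and a gated sum).

```python
# Illustrative sketch of asynchronous message passing over a DAG.
# update and aggregate are assumed learned functions; the names are
# not from the D-VAE code.

def topological_order(n, edges):
    """Kahn's algorithm over nodes 0..n-1 with directed (src, dst) edges."""
    indeg = [0] * n
    for _, d in edges:
        indeg[d] += 1
    order = [v for v in range(n) if indeg[v] == 0]
    for v in order:                      # the list grows while we iterate
        for s, d in edges:
            if s == v:
                indeg[d] -= 1
                if indeg[d] == 0:
                    order.append(d)
    return order

def encode_dag(node_feats, edges, update, aggregate):
    """Compute one state per node, each only after all its predecessors."""
    state = [None] * len(node_feats)
    for v in topological_order(len(node_feats), edges):
        preds = [state[s] for s, d in edges if d == v]   # already computed
        message = aggregate(preds) if preds else 0.0
        state[v] = update(node_feats[v], message)
    return state   # sink-node states summarize the whole computation
```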
Adversarial Learned Molecular Graph Inference and Generation
TLDR
This work proposes ALMGIG, a likelihood-free adversarial learning framework for inference and de novo molecule generation that avoids explicitly computing a reconstruction loss, and extends generative adversarial networks by including an adversarial cycle-consistency loss to implicitly enforce the reconstruction property.
Auto-decoding Graphs
TLDR
The presented model outperforms the state of the art by a factor of 1.5 in mean accuracy and average rank across at least three different graph statistics, with a 2x speedup during inference.
Multi-resolution Autoregressive Graph-to-Graph Translation for Molecules
TLDR
This work substantially extends prior state-of-the-art graph-to-graph translation methods for molecular optimization and realizes coherent multi-resolution representations by interweaving trees over substructures with the atom-level encoding of the original molecular graph.
Analysis of training and seed bias in small molecules generated with a conditional graph-based variational autoencoder - Insights for practical AI-driven molecule generation
TLDR
This work analyzes the impact of seed and training bias on the output of an activity-conditioned graph-based variational autoencoder (VAE) and uncover relationships between noise, molecular seeds, and training set selection across a range of latent-space sampling procedures, providing important insights for practical AI-driven molecule generation.
Graph Deconvolutional Generation
TLDR
This work focuses on the modern equivalent of the Erdos-Renyi random graph model: the graph variational autoencoder (GVAE), and improves this class of models by building a message passing neural network into GVAE's encoder and decoder.
...

References

SHOWING 1-10 OF 37 REFERENCES
Designing Random Graph Models Using Variational Autoencoders With Applications to Chemical Design
TLDR
Experiments reveal that the proposed variational autoencoder for graphs is able to learn and mimic the generative process of several well-known random graph models and can be used to create new molecules more effectively than several state of the art methods.
Learning Deep Generative Models of Graphs
TLDR
This work is the first and most general approach for learning generative models over arbitrary graphs, and opens new directions for moving away from restrictions of vector- and sequence-like knowledge representations, toward more expressive and flexible relational data structures.
GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models
TLDR
The experiments show that GraphRNN significantly outperforms all baselines, learning to generate diverse graphs that match the structural characteristics of a target set, while also scaling to graphs 50 times larger than previous deep models.
Gated Graph Sequence Neural Networks
TLDR
This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
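As a rough sketch of the gated propagation step underlying GGS-NNs, the numpy snippet below aggregates linear messages over the adjacency matrix and folds them into the node states with a GRU-style update. It is a simplified illustration: weight shapes are flattened, and the actual model uses per-edge-type message matrices.

```python
# Simplified numpy sketch of one gated propagation step: neighbors send
# linear messages, and each node folds the aggregated message into its
# state with a GRU-style update. Real GGS-NNs use per-edge-type matrices.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ggnn_step(H, A, W_msg, W_z, U_z, W_r, U_r, W_h, U_h):
    """H: (n, d) node states; A: (n, n) adjacency matrix; weights: (d, d)."""
    M = A @ (H @ W_msg)                        # aggregate neighbor messages
    z = sigmoid(M @ W_z + H @ U_z)             # update gate
    r = sigmoid(M @ W_r + H @ U_r)             # reset gate
    h_new = np.tanh(M @ W_h + (r * H) @ U_h)   # candidate state
    return (1 - z) * H + z * h_new             # gated interpolation
```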
Learning Graphical State Transitions
TLDR
The Gated Graph Transformer Neural Network (GGTNN) is proposed, an extension of GGS-NNs that uses graph-structured data as an intermediate representation and can learn to construct and modify graphs in sophisticated ways based on textual input, as well as to use the graphs to produce a variety of outputs.
Tackling Over-pruning in Variational Autoencoders
TLDR
The epitomic variational autoencoder (eVAE) is proposed, which makes efficient use of model capacity, generalizes better than the VAE, and helps prevent inactive units, since each group is pressured to explain the data.
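A small generic diagnostic (not code from the eVAE paper) makes "inactive units" concrete: the per-dimension KL of a diagonal-Gaussian posterior to the standard-normal prior is near zero for latent units the decoder ignores, which is exactly the over-pruning the eVAE targets.

```python
# Generic diagnostic for over-pruning (not code from the eVAE paper): the
# per-dimension KL of a diagonal-Gaussian posterior q(z|x) to the N(0, I)
# prior. Dimensions whose KL stays near zero are "inactive" latent units.
import numpy as np

def per_unit_kl(mu, log_var):
    """mu, log_var: (batch, latent_dim) posterior parameters.
    Returns the batch-averaged KL(q(z_i|x) || N(0,1)) per dimension."""
    kl = 0.5 * (mu ** 2 + np.exp(log_var) - log_var - 1.0)
    return kl.mean(axis=0)

# Toy example: the last 8 of 16 dimensions carry almost no signal.
mu = np.random.randn(128, 16) * np.array([1.0] * 8 + [0.01] * 8)
log_var = np.zeros((128, 16))
print("inactive units:", np.nonzero(per_unit_kl(mu, log_var) < 0.05)[0])
```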
Neural Message Passing for Quantum Chemistry
TLDR
Using MPNNs, state-of-the-art results on an important molecular property prediction benchmark are demonstrated, and it is believed future work should focus on datasets with larger molecules or more accurate ground-truth labels.
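The message passing framework can be summarized in a few lines: a message function M, an update function U, and a readout R, iterated over T rounds. The sketch below is a generic rendering of that abstraction (the gated update shown earlier is one instance); M, U, and R are assumed learned functions.

```python
# Generic rendering of the message passing abstraction: message M, update U,
# readout R, iterated T rounds. M, U, R are assumed learned functions; for
# an undirected graph, list each edge in both directions.

def mpnn_forward(h, edges, M, U, R, T=3):
    """h: dict node -> state vector; edges: (v, w, e_vw) triples."""
    for _ in range(T):
        msgs = {v: 0 for v in h}
        for v, w, e in edges:
            msgs[w] = msgs[w] + M(h[v], h[w], e)   # sum incoming messages
        h = {v: U(h[v], msgs[v]) for v in h}       # per-node state update
    return R(list(h.values()))                     # graph-level readout
```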
Grammar Variational Autoencoder
TLDR
Surprisingly, it is shown that not only does the model more often generate valid outputs, it also learns a more coherent latent space in which nearby points decode to similar discrete outputs.
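The validity guarantee in grammar-based decoders comes from masking: the decoder scores production rules, but only rules whose left-hand side matches the next non-terminal on a stack may fire, so every sample is a syntactically valid derivation. The sketch below uses a toy illustrative grammar, far smaller than the SMILES grammar the paper works with.

```python
# Toy sketch of grammar-masked decoding; the grammar below is illustrative
# only, not the SMILES grammar used in the paper.
GRAMMAR = [
    ("S", ["atom", "bond", "S"]),      # rule 0
    ("S", ["atom"]),                   # rule 1
    ("atom", ["C"]), ("atom", ["O"]),  # rules 2, 3
    ("bond", ["-"]), ("bond", ["="]),  # rules 4, 5
]
NONTERMINALS = {"S", "atom", "bond"}

def masked_decode(rule_logits):
    """rule_logits: one list of per-rule scores per decoding step."""
    stack, out = ["S"], []
    for logits in rule_logits:
        # Pop finished terminals straight to the output string.
        while stack and stack[-1] not in NONTERMINALS:
            out.append(stack.pop())
        if not stack:
            break
        nt = stack.pop()
        # Mask: only rules that expand the current non-terminal may fire.
        valid = [i for i, (lhs, _) in enumerate(GRAMMAR) if lhs == nt]
        best = max(valid, key=lambda i: logits[i])
        for sym in reversed(GRAMMAR[best][1]):   # expand left-to-right
            stack.append(sym)
    while stack and stack[-1] not in NONTERMINALS:
        out.append(stack.pop())
    return "".join(out)
```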
Automatic Chemical Design Using a Data-Driven Continuous Representation of Molecules
We report a method to convert discrete representations of molecules to and from a multidimensional continuous representation. This model allows us to generate new molecules for efficient exploration…
...