• Corpus ID: 210701879

Graph Attentional Autoencoder for Anticancer Hyperfood Prediction

Guadalupe Gonzalez, Shunwang Gong, Ivan Laponogov, Kirill A. Veselkov, Michael M. Bronstein
Recent research has shown that anticancer drug-like molecules in food can be discovered from their effect on protein-protein interaction networks, opening a potential pathway to disease-beating diet design. We formulate this task as a graph classification problem, on which graph neural networks (GNNs) have achieved state-of-the-art results. However, our empirical evidence shows that GNNs are difficult to train on sparse, low-dimensional features. Here, we present graph augmented…


Certified Robustness of Graph Classification against Topology Attack with Randomized Smoothing
It is proven that the resulting graph classification model outputs the same prediction for a graph under l0-bounded adversarial perturbation, and the effectiveness of the approach is evaluated on a graph convolutional network (GCN)-based multi-class graph classification model.
Interaction data are identifiable even across long periods of time
It is demonstrated that people's interaction behavior is stable over long periods of time and can be used to identify individuals in anonymous datasets; the results suggest that people with well-balanced interaction graphs are more identifiable.


HyperFoods: Machine intelligent mapping of cancer-beating molecules in foods
A network-based machine learning platform is introduced to identify putative food-based cancer-beating molecules, selected through the commonality of their molecular biological networks with those of clinically approved anticancer therapies.
Graph Capsule Convolutional Neural Networks
The proposed Graph Capsule Network significantly outperforms both existing state-of-the-art deep learning methods and graph kernels on graph classification benchmark datasets, and is designed specifically for the graph classification problem, which current GCNN models find challenging.
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
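As a rough illustration of the masked self-attention the GAT summary describes, here is a minimal single-head attention layer in NumPy; the function name, weight shapes, and LeakyReLU slope are our own illustrative choices, not taken from the paper or any library.

```python
import numpy as np

def gat_layer(H, A, W, a, slope=0.2):
    """Sketch of one single-head graph attention layer.

    H: (N, F) node features; A: (N, N) adjacency with self-loops (1 = edge);
    W: (F, Fp) weight matrix; a: (2*Fp,) attention vector.
    """
    Z = H @ W                                    # (N, Fp) transformed features
    Fp = Z.shape[1]
    # attention logits e_ij = LeakyReLU(a^T [z_i || z_j]), split into two halves
    src = Z @ a[:Fp]                             # (N,) source-node contribution
    dst = Z @ a[Fp:]                             # (N,) target-node contribution
    e = src[:, None] + dst[None, :]              # (N, N) pairwise logits
    e = np.where(e > 0, e, slope * e)            # LeakyReLU
    # masking: each node attends only over its neighbours
    e = np.where(A > 0, e, -1e9)
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)        # softmax over neighbours
    return att @ Z                               # (N, Fp) aggregated features
```

The mask is what makes this "graph" attention: logits to non-neighbours are pushed to effectively minus infinity before the softmax, so attention weights are distributed only over each node's neighbourhood.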
Representation Learning on Graphs: Methods and Applications
A conceptual review of key advancements in representation learning on graphs is provided, covering matrix factorization-based methods, random-walk-based algorithms, and graph neural networks.
Representation Learning on Graphs with Jumping Knowledge Networks
This work explores an architecture -- jumping knowledge (JK) networks -- that flexibly leverages, for each node, different neighborhood ranges to enable better structure-aware representation in graphs.
Neural Message Passing for Quantum Chemistry
Using MPNNs, state-of-the-art results are demonstrated on an important molecular property prediction benchmark, and it is argued that future work should focus on datasets with larger molecules or more accurate ground-truth labels.
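The message-passing framework this entry refers to alternates a message phase (aggregate neighbour states) with an update phase (combine with the node's own state). A minimal NumPy sketch, with learned message/update functions simplified to linear maps of our own choosing:

```python
import numpy as np

def mpnn_step(h, edges, msg_W, upd_W):
    """One message-passing step (simplified MPNN sketch).

    h: (N, F) node states; edges: iterable of (src, dst) index pairs;
    msg_W, upd_W: (F, F) weights standing in for learned message/update nets.
    """
    m = np.zeros_like(h)
    for s, d in edges:
        m[d] += h[s] @ msg_W          # message phase: sum over incoming neighbours
    return np.tanh(h @ upd_W + m)     # update phase: combine state and messages
```

A real MPNN would parameterize the message function with edge features and use a learned update (e.g. a GRU); the structure of the two phases is the point here.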
How Powerful are Graph Neural Networks?
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
Semi-Supervised Classification with Graph Convolutional Networks
A scalable approach for semi-supervised learning on graph-structured data is presented, based on an efficient variant of convolutional neural networks that operate directly on graphs; it outperforms related methods by a significant margin.
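The layer-wise propagation rule behind this approach is H' = σ(D̂^{-1/2} Â D̂^{-1/2} H W), where Â = A + I adds self-loops and D̂ is its degree matrix. A short NumPy sketch (names are ours; a real implementation would use sparse matrices):

```python
import numpy as np

def gcn_layer(H, A, W):
    """One GCN propagation step: ReLU of the symmetrically normalized
    adjacency (with self-loops) applied to transformed node features.

    H: (N, F) node features; A: (N, N) adjacency without self-loops;
    W: (F, Fp) layer weights.
    """
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    d = A_hat.sum(axis=1)                        # degrees of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt     # D^{-1/2} A_hat D^{-1/2}
    return np.maximum(A_norm @ H @ W, 0.0)       # ReLU activation
```

Stacking two such layers and reading the output through a softmax recovers the semi-supervised node classifier the summary describes.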
Convolutional Networks on Graphs for Learning Molecular Fingerprints
A convolutional neural network that operates directly on graphs is introduced, allowing end-to-end learning of prediction pipelines whose inputs are graphs of arbitrary size and shape.
Geometric Deep Learning: Going beyond Euclidean data
Deep neural networks are used to solve a broad range of problems in computer vision, natural-language processing, and audio analysis, where the invariances of these structures are built into the networks used to model them.