• Corpus ID: 4590511

Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks

@article{Xu2018Graph2SeqGT,
  title={Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks},
  author={Kun Xu and Lingfei Wu and Zhiguo Wang and Yansong Feng and Vadim Sheinin},
  journal={ArXiv},
  year={2018},
  volume={abs/1804.00823}
}
The celebrated Sequence to Sequence learning (Seq2Seq) technique and its numerous variants achieve excellent performance on many tasks. However, many machine learning tasks have inputs naturally represented as graphs; existing Seq2Seq models face a significant challenge in achieving accurate conversion from graph form to the appropriate sequence. To address this challenge, we introduce a novel general end-to-end graph-to-sequence neural encoder-decoder model that maps an input graph to a… 
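The truncated abstract describes a graph encoder that produces node embeddings and an attention-based decoder that emits the target sequence. Below is a minimal PyTorch sketch of that encoder/decoder shape, with toy dimensions and hypothetical class names; it is an illustration of the general idea, not the authors' implementation.

```python
# Minimal sketch of a graph-to-sequence step: a mean-aggregation graph
# encoder produces node embeddings; an attention-based GRU decoder attends
# over them at each output step. Hypothetical shapes/names, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphEncoder(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, x, adj):
        # x: (N, dim) node features; adj: (N, N) 0/1 adjacency.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = adj @ x / deg                      # mean over neighbors
        return torch.relu(self.proj(torch.cat([x, neigh], dim=-1)))

class AttnDecoderStep(nn.Module):
    def __init__(self, dim, vocab):
        super().__init__()
        self.cell = nn.GRUCell(dim, dim)
        self.out = nn.Linear(2 * dim, vocab)

    def forward(self, y_emb, h, node_states):
        h = self.cell(y_emb, h)                    # (1, dim) decoder state
        scores = node_states @ h.squeeze(0)        # (N,) attention scores
        ctx = F.softmax(scores, dim=0).unsqueeze(0) @ node_states  # (1, dim)
        return self.out(torch.cat([h, ctx], dim=-1)), h

dim, vocab, N = 16, 50, 5
enc, dec = GraphEncoder(dim), AttnDecoderStep(dim, vocab)
x, adj = torch.randn(N, dim), (torch.rand(N, N) > 0.5).float()
nodes = enc(x, adj)
logits, h = dec(torch.randn(1, dim), nodes.mean(0, keepdim=True), nodes)
print(logits.shape)  # torch.Size([1, 50])
```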

Citations

Graph Transformer for Graph-to-Sequence Learning
TLDR
A new model, known as Graph Transformer, is proposed that uses explicit relation encoding, allows direct communication between two distant nodes, and provides a more efficient way to model global graph structure.
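A hedged sketch of what "explicit relation encoding" can look like: a learned embedding for each node-pair relation shifts the attention key, so even distant nodes interact in one attention step. The relation labels and shapes below are assumptions for illustration, not the paper's exact formulation.

```python
# Sketch of relation-aware attention: each node pair (i, j) gets a learned
# relation embedding r_ij that shifts the key. Illustrative only.
import math
import torch
import torch.nn.functional as F

N, d, num_rel = 6, 8, 4
q, k, v = torch.randn(N, d), torch.randn(N, d), torch.randn(N, d)
rel_emb = torch.nn.Embedding(num_rel, d)            # one vector per relation type
rel_ids = torch.randint(0, num_rel, (N, N))         # e.g. shortest-path-based labels
r = rel_emb(rel_ids)                                # (N, N, d)

# score[i, j] = q_i . (k_j + r_ij) / sqrt(d)
scores = (q.unsqueeze(1) * (k.unsqueeze(0) + r)).sum(-1) / math.sqrt(d)
out = F.softmax(scores, dim=-1) @ v                 # (N, d) updated node states
print(out.shape)
```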
Graph-to-Tree Neural Networks for Learning Structured Input-Output Translation with Applications to Semantic Parsing and Math Word Problem
TLDR
This paper presents a novel Graph-to-Tree Neural Network, namely Graph2Tree, consisting of a graph encoder and a hierarchical tree decoder, which encodes an augmented graph-structured input and decodes a tree-structured output.
Deep Learning on Graphs for Natural Language Processing
TLDR
This tutorial will cover relevant and interesting topics on applying deep learning on graph techniques to NLP, including automatic graph construction for NLP, graph representation learning for NLP, advanced GNN-based models (e.g., graph2seq, graph2tree, and graph2graph) for NLP, and the applications of GNNs in various NLP tasks.
Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings
TLDR
This paper proposes an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL), for jointly and iteratively learning graph structure and graph embedding, and proposes a scalable version of IDGL, namely IDGL-ANCH, which significantly reduces the time and space complexity of IDGL without compromising performance.
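The iterative structure-learning loop can be sketched as alternating between inducing a graph from node embeddings (here via thresholded cosine similarity, an assumption) and refining the embeddings by propagating over the induced graph:

```python
# Toy rendering of the iterative idea: (1) induce a graph from embeddings,
# (2) refine embeddings on the induced graph. Not IDGL's exact formulation.
import torch
import torch.nn.functional as F

N, d, iters, thresh = 6, 8, 3, 0.2
h = torch.randn(N, d)
lin = torch.nn.Linear(d, d)

for _ in range(iters):
    sim = F.cosine_similarity(h.unsqueeze(1), h.unsqueeze(0), dim=-1)
    A = torch.where(sim > thresh, sim, torch.zeros_like(sim))  # learned graph
    A = A / A.sum(1, keepdim=True).clamp(min=1e-6)             # row-normalize
    h = torch.relu(lin(A @ h))                                 # refine on it
print(h.shape)
```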
Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation
TLDR
This model consists of a Graph2Seq generator with a novel Bidirectional Gated Graph Neural Network-based encoder to embed the passage, and a hybrid evaluator with a mixed objective combining both cross-entropy and RL losses to ensure the generation of syntactically and semantically valid text.
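The hybrid evaluator's mixed objective can be sketched as a weighted sum of cross-entropy on the gold sequence and a REINFORCE-style term on a sampled sequence; the mixing weight gamma, the reward, and the baseline below are illustrative assumptions:

```python
# Sketch of a mixed CE + RL objective (REINFORCE with baseline).
import torch
import torch.nn.functional as F

def hybrid_loss(logits, gold, sampled, reward, baseline, gamma=0.9):
    # logits: (T, V); gold/sampled: (T,) token ids; reward/baseline: floats.
    ce = F.cross_entropy(logits, gold)
    logp_sampled = F.log_softmax(logits, dim=-1).gather(
        1, sampled.unsqueeze(1)).sum()
    rl = -(reward - baseline) * logp_sampled        # REINFORCE with baseline
    return gamma * ce + (1 - gamma) * rl

T, V = 7, 100
logits = torch.randn(T, V, requires_grad=True)
gold = torch.randint(0, V, (T,))
sampled = torch.randint(0, V, (T,))
loss = hybrid_loss(logits, gold, sampled, reward=0.8, baseline=0.5)
loss.backward()
print(float(loss))
```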
Heterogeneous Graph Transformer for Graph-to-Sequence Learning
TLDR
This paper proposes the Heterogeneous Graph Transformer to independently model the different relations in the individual subgraphs of the original graph, including direct relations, indirect relations and multiple possible relations between nodes.
Compositionality-Aware Graph2Seq Learning
TLDR
This study adopts the multi-level attention pooling (MLAP) architecture, which can aggregate graph representations from multiple levels of information locality, and demonstrates that a model with the MLAP architecture outperforms the previous state-of-the-art model with more than seven times fewer parameters.
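A rough sketch of multi-level attention pooling: an attention-weighted graph readout is computed from the node states at every message-passing layer, and the per-layer readouts are aggregated (summed here, an assumption; the layer updates are a toy mean-aggregation GNN):

```python
# Per-layer attention readouts, aggregated across levels. Illustrative only.
import torch
import torch.nn.functional as F

N, d, L = 5, 8, 3
x = torch.randn(N, d)
adj = (torch.rand(N, N) > 0.5).float()
W = [torch.nn.Linear(d, d) for _ in range(L)]
attn = [torch.nn.Linear(d, 1) for _ in range(L)]    # per-layer attention gate

readouts = []
h = x
for l in range(L):
    deg = adj.sum(1, keepdim=True).clamp(min=1)
    h = torch.relu(W[l](adj @ h / deg))             # one message-passing step
    a = F.softmax(attn[l](h), dim=0)                # (N, 1) node weights
    readouts.append((a * h).sum(0))                 # (d,) layer-l graph vector
graph_repr = torch.stack(readouts).sum(0)           # aggregate across levels
print(graph_repr.shape)  # torch.Size([8])
```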
Next Location Prediction with a Graph Convolutional Network Based on a Seq2seq Framework
TLDR
A neural network-based method that predicts a person's next location by combining a graph convolutional network with a seq2seq framework, capturing location topology information and temporal dependence, respectively.
Deep Graph Translation
TLDR
A novel graph-translation generative adversarial nets (GT-GAN) model that transforms source graphs into their target output graphs and significantly outperforms baseline methods in terms of both effectiveness and scalability.
...

References

SHOWING 1-10 OF 69 REFERENCES
Graph-to-Sequence Learning using Gated Graph Neural Networks
TLDR
This work proposes a new model that encodes the full structural information contained in the graph, coupling the recently proposed Gated Graph Neural Networks with an input transformation that allows nodes and edges to have their own hidden representations, while tackling the parameter-explosion problem present in previous work.
Gated Graph Sequence Neural Networks
TLDR
This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
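The core gated update can be sketched in a few lines: aggregated neighbor messages form the input to a GRU cell that updates each node's state. Edge-type-specific weight matrices and the output model are omitted in this toy version:

```python
# Gated node update, GGNN-style: message passing + GRU cell. Toy sizes.
import torch

N, d, steps = 5, 8, 4
adj = (torch.rand(N, N) > 0.5).float()
h = torch.randn(N, d)
msg_lin = torch.nn.Linear(d, d)
gru = torch.nn.GRUCell(d, d)

for _ in range(steps):
    m = adj @ msg_lin(h)        # (N, d) aggregated neighbor messages
    h = gru(m, h)               # gated state update, GRU-style
print(h.shape)
```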
Sequence to Sequence Learning with Neural Networks
TLDR
This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure, and finds that reversing the order of the words in all source sentences improved the LSTM's performance markedly, because doing so introduced many short-term dependencies between the source and the target sentence which made the optimization problem easier.
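The reversal trick itself is a one-liner; the sketch below only illustrates the preprocessing step, not the LSTM training:

```python
# Source reversal: puts the first source word closest to the first target
# word, shortening many source-target dependencies for the encoder-decoder.
src = ["the", "cat", "sat", "down"]
src_reversed = list(reversed(src))   # fed to the encoder in this order
print(src_reversed)                  # ['down', 'sat', 'cat', 'the']
```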
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
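A single masked attention head, per the mechanism described, might be sketched as follows (toy sizes; multi-head aggregation omitted):

```python
# One GAT-style attention head: per-edge scores from a shared linear map,
# masked to the graph's edges, normalized with softmax.
import torch
import torch.nn.functional as F

N, d_in, d_out = 5, 8, 8
x = torch.randn(N, d_in)
adj = torch.eye(N) + (torch.rand(N, N) > 0.5).float()   # self-loops + edges
W = torch.nn.Linear(d_in, d_out, bias=False)
a = torch.nn.Linear(2 * d_out, 1, bias=False)

h = W(x)                                                # (N, d_out)
pair = torch.cat([h.unsqueeze(1).expand(N, N, d_out),
                  h.unsqueeze(0).expand(N, N, d_out)], dim=-1)
e = F.leaky_relu(a(pair).squeeze(-1), negative_slope=0.2)
e = e.masked_fill(adj == 0, float("-inf"))              # mask non-edges
alpha = F.softmax(e, dim=1)                             # attention per neighbor
out = alpha @ h                                         # (N, d_out)
print(out.shape)
```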
FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling
TLDR
Enhanced with importance sampling, FastGCN is not only efficient for training but also generalizes well for inference; it is orders of magnitude more efficient while its predictions remain comparably accurate.
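The importance-sampling estimator can be sketched as sampling a fixed number of nodes per layer with probability proportional to the squared column norms of the normalized adjacency, then reweighting the messages; the dense, single-layer version below is a toy assumption:

```python
# Monte Carlo estimate of A @ x using only s sampled columns, reweighted
# by 1/(s * q). Illustrative, dense, single-layer rendering of the idea.
import torch

N, d, s = 100, 16, 10                        # nodes, feature dim, sample size
x = torch.randn(N, d)
A = (torch.rand(N, N) > 0.9).float()
A = A / A.sum(1, keepdim=True).clamp(min=1)  # row-normalized adjacency

q = A.pow(2).sum(0)                          # importance ~ squared column norm
q = q / q.sum()
idx = torch.multinomial(q, s, replacement=True)

approx = (A[:, idx] / q[idx]) @ x[idx] / s   # unbiased estimate of A @ x
print(approx.shape)                          # torch.Size([100, 16])
```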
Incorporating Copying Mechanism in Sequence-to-Sequence Learning
TLDR
This paper incorporates copying into neural network-based Seq2Seq learning and proposes a new model called CopyNet, with an encoder-decoder structure, which nicely integrates the regular way of word generation in the decoder with a new copying mechanism that can choose sub-sequences in the input sequence and put them at proper places in the output sequence.
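The generate-vs-copy mixture can be sketched in a simplified, pointer-style form: copy probabilities for source positions are scattered onto their vocabulary ids and added to a gated generation distribution. This illustrates the idea only, not CopyNet's exact scoring functions:

```python
# Mixing a generation distribution with copy probabilities over source tokens.
import torch
import torch.nn.functional as F

V, T = 20, 5                                 # vocab size, source length
gen_logits = torch.randn(V)                  # decoder's generation scores
copy_logits = torch.randn(T)                 # score per source position
src_ids = torch.randint(0, V, (T,))          # source tokens as vocab ids
p_gen = torch.sigmoid(torch.randn(()))       # mixing gate (toy)

p = p_gen * F.softmax(gen_logits, dim=0)
p = p.scatter_add(0, src_ids, (1 - p_gen) * F.softmax(copy_logits, dim=0))
print(p.sum())                               # ~ 1.0, a valid distribution
```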
Graph Embedding Techniques, Applications, and Performance: A Survey
GraphVAE: Towards Generation of Small Graphs Using Variational Autoencoders
TLDR
This work proposes to sidestep hurdles associated with linearization of discrete structures by having a decoder output a probabilistic fully-connected graph of a predefined maximum size directly at once, in a model formulated as a variational autoencoder.
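The one-shot decoding idea can be sketched as two heads mapping a latent vector to a k x k edge-probability matrix plus per-node label probabilities (toy decoder; the paper's graph-matching loss is omitted):

```python
# Decoding a latent vector into a probabilistic fully-connected graph.
import torch

k, num_labels, z_dim = 6, 4, 16
z = torch.randn(z_dim)
edge_head = torch.nn.Linear(z_dim, k * k)
node_head = torch.nn.Linear(z_dim, k * num_labels)

edge_probs = torch.sigmoid(edge_head(z)).view(k, k)
edge_probs = (edge_probs + edge_probs.T) / 2          # symmetric adjacency
node_probs = torch.softmax(node_head(z).view(k, num_labels), dim=-1)
print(edge_probs.shape, node_probs.shape)
```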
Representation Learning on Graphs: Methods and Applications
TLDR
A conceptual review of key advancements in this area of representation learning on graphs, including matrix factorization-based methods, random-walk-based algorithms, and graph neural networks, is provided.
node2vec: Scalable Feature Learning for Networks
TLDR
In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
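The biased second-order walk can be sketched directly from that description: the probability of stepping to a neighbor depends on its distance to the previous node, through the return parameter p and in-out parameter q (toy graph, pure Python):

```python
# node2vec-style biased walk on an adjacency-list graph.
import random

def biased_walk(adj, start, length, p=1.0, q=2.0):
    walk = [start, random.choice(adj[start])]
    while len(walk) < length:
        t, v = walk[-2], walk[-1]
        neighbors = adj[v]
        weights = []
        for x in neighbors:
            if x == t:                 # returning to the previous node
                weights.append(1.0 / p)
            elif x in adj[t]:          # distance 1 from t
                weights.append(1.0)
            else:                      # moving outward, distance 2 from t
                weights.append(1.0 / q)
        walk.append(random.choices(neighbors, weights=weights)[0])
    return walk

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
print(biased_walk(adj, start=0, length=6))
```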
...