Corpus ID: 214802134

Graph Sequential Network for Reasoning over Sequences

Ming Tu, Jing Huang, Xiaodong He, Bowen Zhou
Recently, Graph Neural Networks (GNNs) have been applied successfully to various NLP tasks that require reasoning, such as multi-hop machine reading comprehension. In this paper, we consider a novel case where reasoning is needed over graphs built from sequences, i.e., graph nodes with sequence data. Existing GNN models fulfill this goal by first summarizing the node sequences into fixed-dimensional vectors, then applying GNN on these vectors. To avoid the information loss inherent in the early…
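The early-summarization baseline the abstract critiques can be sketched as follows. This is a minimal, hypothetical illustration (mean pooling, a single mean-style message-passing step, toy data), not the paper's actual model or any particular GNN library:

```python
# Each graph node carries a sequence of vectors; the sequence is first
# pooled into one fixed vector (the lossy "early summarization" step),
# then a single round of mean-style message passing runs over the graph.
# All names, shapes, and the choice of mean pooling are illustrative.

def mean_pool(seq):
    """Collapse a (seq_len x d) sequence into a single d-dim vector."""
    d = len(seq[0])
    return [sum(vec[t] for vec in seq) / len(seq) for t in range(d)]

def summarize_then_propagate(node_seqs, adj):
    """node_seqs: list of sequences of d-dim vectors; adj: n x n 0/1 matrix."""
    h = [mean_pool(seq) for seq in node_seqs]           # early summarization
    n, d = len(h), len(h[0])
    out = []
    for i in range(n):                                  # one GNN layer:
        nbrs = [h[j] for j in range(n) if adj[i][j]]    # gather neighbors
        pooled = [(h[i][t] + sum(v[t] for v in nbrs)) / (len(nbrs) + 1)
                  for t in range(d)]                    # mean with self
        out.append(pooled)
    return out

seqs = [[[1.0, 0.0], [3.0, 2.0]],   # node 0: sequence of two 2-d vectors
        [[2.0, 2.0]],               # node 1
        [[0.0, 4.0], [0.0, 0.0]]]   # node 2
adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
out = summarize_then_propagate(seqs, adj)
print(out[0])  # [2.0, 1.5]
```

Once the sequences are pooled in step one, any token-level detail (word order, which token mattered) is unavailable to the message-passing step; this is the information loss the paper sets out to avoid.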
Graph neural network: Current state of Art, challenges and applications
This paper explains graph neural networks, their areas of application, and their everyday uses, and demonstrates the basic challenges encountered while implementing GNNs.
The Graph Reasoning Approach Based on the Dynamic Knowledge Auxiliary for Complex Fact Verification
Experiments show that the DKAR method put forward in this study can be combined with specific and discriminative knowledge to guide the FV system, successfully overcoming knowledge-gap challenges and achieving improvement on FV tasks.
Stronger Transformers for Neural Multi-Hop Question Generation
This work introduces a series of strong transformer models for multi-hop question generation, including a graph-augmented transformer that leverages relations between entities in the text and shows that it can substantially outperform the state-of-the-art by 5 BLEU points using a standard transformer architecture.
Machine Reading Comprehension: The Role of Contextualized Language Models and Beyond
It is concluded that 1) MRC boosts the progress from language processing to understanding; 2) the rapid improvement of MRC systems greatly benefits from the development of CLMs; and 3) the theme of MRC is gradually moving from shallow text matching to cognitive reasoning.
Longformer: The Long-Document Transformer
Following prior work on long-sequence transformers, Longformer is evaluated on character-level language modeling and achieves state-of-the-art results on text8 and enwik8; the authors also pretrain Longformer and finetune it on a variety of downstream tasks.


Multi-hop Reading Comprehension across Multiple Documents by Reasoning over Heterogeneous Graphs
This paper introduces a heterogeneous graph with different types of nodes and edges, named the Heterogeneous Document-Entity (HDE) graph, which contains different granularity levels of information, including candidates, documents, and entities in specific document contexts.
Dynamically Fused Graph Network for Multi-hop Reasoning
Dynamically Fused Graph Network (DFGN) is proposed, a novel method to answer questions requiring multiple pieces of scattered evidence and reasoning over them, inspired by humans' step-by-step reasoning behavior.
Question Answering by Reasoning Across Documents with Graph Convolutional Networks
A neural model which integrates and reasons relying on information spread within documents and across multiple documents is introduced, which achieves state-of-the-art results on a multi-document question answering dataset, WikiHop.
Modeling Relational Data with Graph Convolutional Networks
It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
How Powerful are Graph Neural Networks?
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
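The masked self-attention idea can be sketched, under simplifying assumptions, as computing attention scores only over a node's graph neighbors. Note this toy single-head version scores edges with a plain dot product for brevity, not GAT's learned LeakyReLU scoring; all names and data are illustrative:

```python
import math

# Toy GAT-style masked attention for one node: scores exist only for
# edges present in the graph (the "mask"), are softmax-normalized over
# the neighborhood, and weight a mix of neighbor features.

def gat_like_update(h, adj, i):
    """Return the attention-weighted feature for node i."""
    nbrs = [j for j in range(len(h)) if adj[i][j] or j == i]  # incl. self-loop
    scores = [sum(a * b for a, b in zip(h[i], h[j])) for j in nbrs]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]                  # stable softmax
    z = sum(exps)
    alpha = [e / z for e in exps]                             # attention coeffs
    d = len(h[0])
    return [sum(alpha[k] * h[j][t] for k, j in enumerate(nbrs))
            for t in range(d)]

h = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
adj = [[0, 1, 1], [1, 0, 0], [1, 0, 0]]
out = gat_like_update(h, adj, 0)
print(out)
```

Because the softmax runs only over neighbors (plus the self-loop), a node attends to structurally relevant features rather than the whole graph, which is the masking the abstract refers to.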
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction
An extension of graph convolutional networks tailored for relation extraction is proposed, which pools information over arbitrary dependency structures efficiently in parallel; a novel pruning strategy is applied to the input trees, keeping words immediately around the shortest path between the two entities among which a relation might hold.
GEAR: Graph-based Evidence Aggregating and Reasoning for Fact Verification
A graph-based evidence aggregating and reasoning (GEAR) framework is proposed, which enables information to transfer on a fully-connected evidence graph and then utilizes different aggregators to collect multi-evidence information.
Inductive Representation Learning on Large Graphs
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
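The sample-and-aggregate step at the core of this inductive approach can be sketched as below. This is a hypothetical, weight-free illustration (fixed-size neighbor sampling plus a mean aggregator and concatenation); a real GraphSAGE layer would follow this with a learned linear map and nonlinearity:

```python
import random

# Toy GraphSAGE-style layer for one node: sample a fixed number of
# neighbors, mean-aggregate their features, and concatenate the result
# with the node's own features. Because it uses features rather than a
# per-node embedding table, it applies to unseen nodes (inductive).

def sage_layer(h, neighbors, i, sample_size, rng):
    nbrs = neighbors[i]
    if len(nbrs) > sample_size:                 # fixed-size neighbor sample
        nbrs = rng.sample(nbrs, sample_size)
    d = len(h[0])
    agg = [sum(h[j][t] for j in nbrs) / len(nbrs) for t in range(d)]
    return h[i] + agg                           # concatenation: 2d-dim output

h = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
neighbors = {0: [1, 2], 1: [0], 2: [0]}
rng = random.Random(0)
out = sage_layer(h, neighbors, 0, sample_size=2, rng=rng)
print(out)  # [1.0, 2.0, 4.0, 5.0]
```

The fixed sample size bounds per-node cost regardless of degree, which is what makes the scheme practical on large graphs.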
Graph Convolutional Networks for Text Classification
This work builds a single text graph for a corpus based on word co-occurrence and document word relations, then learns a Text Graph Convolutional Network (Text GCN) for the corpus, which jointly learns the embeddings for both words and documents as supervised by the known class labels for documents.