Corpus ID: 214802134

Graph Sequential Network for Reasoning over Sequences

@article{Tu2020GraphSN,
  title={Graph Sequential Network for Reasoning over Sequences},
  author={Ming Tu and Jinke Huang and Xiaodong He and Bowen Zhou},
  journal={ArXiv},
  year={2020},
  volume={abs/2004.02001}
}
Recently, Graph Neural Networks (GNNs) have been applied successfully to various NLP tasks that require reasoning, such as multi-hop machine reading comprehension. In this paper, we consider a novel case where reasoning is needed over graphs built from sequences, i.e. graph nodes with sequence data. Existing GNN models fulfill this goal by first summarizing the node sequences into fixed-dimensional vectors, then applying GNN on these vectors. To avoid information loss inherent in the early…
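The summarize-then-propagate baseline that the abstract critiques can be sketched in a few lines of numpy. This is a hypothetical toy construction, not the paper's model: each node's variable-length sequence is mean-pooled into one fixed vector (the lossy early-summarization step), and only then does a GCN-style layer propagate information between nodes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 3 nodes, each carrying a variable-length sequence of 4-dim token vectors.
node_sequences = [rng.normal(size=(length, 4)) for length in (5, 2, 7)]

# Adjacency with self-loops, row-normalized (a standard GCN-style propagation matrix).
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])
A_hat = A / A.sum(axis=1, keepdims=True)

# Step 1 (the lossy step the paper criticizes): summarize each node's sequence
# into a single fixed-dimensional vector by mean pooling.
X = np.stack([seq.mean(axis=0) for seq in node_sequences])  # shape (3, 4)

# Step 2: one graph-convolution layer over the pooled vectors: ReLU(A_hat X W).
W = rng.normal(size=(4, 4))
H = np.maximum(A_hat @ X @ W, 0.0)  # shape (3, 4)

print(H.shape)
```

The proposed Graph Sequential Network instead keeps the sequences intact during message passing; the sketch above only reproduces the fixed-vector baseline for contrast.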
Citations

Graph neural network: Current state of Art, challenges and applications
TLDR: This paper explains graph neural networks, their areas of application and everyday use, and demonstrates the basic challenges encountered while implementing GNNs.
The Graph Reasoning Approach Based on the Dynamic Knowledge Auxiliary for Complex Fact Verification
TLDR: Experiments show that the proposed DKAR can be combined with specific, discriminative knowledge to guide a fact-verification (FV) system past knowledge-gap challenges and improve performance on FV tasks.
Stronger Transformers for Neural Multi-Hop Question Generation
TLDR: This work introduces a series of strong transformer models for multi-hop question generation, including a graph-augmented transformer that leverages relations between entities in the text, and shows that it can substantially outperform the state of the art by 5 BLEU points using a standard transformer architecture.
Machine Reading Comprehension: The Role of Contextualized Language Models and Beyond
TLDR: The survey concludes that 1) MRC boosts the progress from language processing to understanding; 2) the rapid improvement of MRC systems greatly benefits from the development of CLMs; 3) the theme of MRC is gradually moving from shallow text matching to cognitive reasoning.
Longformer: The Long-Document Transformer
TLDR: Following prior work on long-sequence transformers, the Longformer is evaluated on character-level language modeling, achieving state-of-the-art results on text8 and enwik8; the authors also pretrain Longformer and finetune it on a variety of downstream tasks.

References

Showing 1–10 of 34 references
Multi-hop Reading Comprehension across Multiple Documents by Reasoning over Heterogeneous Graphs
TLDR: This paper introduces a heterogeneous graph with different types of nodes and edges, named the Heterogeneous Document-Entity (HDE) graph, which contains information at different granularity levels, including candidates, documents, and entities in specific document contexts.
Dynamically Fused Graph Network for Multi-hop Reasoning
TLDR: Dynamically Fused Graph Network is proposed, a novel method to answer questions that require multiple pieces of scattered evidence and reasoning over them, inspired by humans' step-by-step reasoning behavior.
Question Answering by Reasoning Across Documents with Graph Convolutional Networks
TLDR: A neural model is introduced that integrates and reasons over information spread within and across multiple documents, achieving state-of-the-art results on WikiHop, a multi-document question answering dataset.
Modeling Relational Data with Graph Convolutional Networks
TLDR: It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
How Powerful are Graph Neural Networks?
TLDR: This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…
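The masked self-attention mechanism mentioned above can be sketched for a single head in plain numpy. This is an illustrative simplification with random weights (LeakyReLU slope 0.2 as an assumption), not the authors' implementation: attention logits are computed for every node pair, non-neighbors are masked out before the softmax, and each node aggregates its neighbors' transformed features.

```python
import numpy as np

rng = np.random.default_rng(1)

# 4 nodes with 3-dim features; boolean adjacency mask with self-loops.
X = rng.normal(size=(4, 3))
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=bool)

W = rng.normal(size=(3, 3))   # shared linear transform
a = rng.normal(size=(6,))     # attention vector over concatenated node pairs

H = X @ W                     # transformed features, shape (4, 3)

# Raw logits e_ij = LeakyReLU(a^T [h_i || h_j]) for every ordered pair (i, j).
pairs = np.concatenate(
    [np.repeat(H, 4, axis=0), np.tile(H, (4, 1))], axis=1
).reshape(4, 4, 6)
z = pairs @ a                 # shape (4, 4)
e = np.where(z > 0, z, 0.2 * z)

# Masking: non-neighbors get -inf, so the softmax assigns them zero weight.
e = np.where(A, e, -np.inf)
alpha = np.exp(e - e.max(axis=1, keepdims=True))
alpha /= alpha.sum(axis=1, keepdims=True)

# Each node's output is an attention-weighted sum of its neighbors' features.
out = alpha @ H               # shape (4, 3)
print(out.shape)
```

The masking step is what restricts attention to the graph structure; without it, the layer would reduce to ordinary dense self-attention over all nodes.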
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction
TLDR: An extension of graph convolutional networks tailored for relation extraction is proposed, which pools information over arbitrary dependency structures efficiently in parallel; a novel pruning strategy keeps only the words immediately around the shortest path between the two entities among which a relation might hold.
GEAR: Graph-based Evidence Aggregating and Reasoning for Fact Verification
TLDR: A graph-based evidence aggregating and reasoning (GEAR) framework is proposed, which enables information to propagate over a fully-connected evidence graph and then uses different aggregators to collect multi-evidence information.
Inductive Representation Learning on Large Graphs
TLDR: GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data, and outperforms strong baselines on three inductive node-classification benchmarks.
Graph Convolutional Networks for Text Classification
TLDR: This work builds a single text graph for a corpus based on word co-occurrence and document–word relations, then learns a Text Graph Convolutional Network (Text GCN) that jointly learns embeddings for both words and documents, supervised by the known class labels of documents.
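The corpus-level graph that Text GCN operates on can be sketched as follows. This is a hypothetical toy construction with simple term counts for document–word edges and raw co-occurrence counts for word–word edges; the actual Text GCN paper weights these edges with TF-IDF and PMI respectively.

```python
import numpy as np

# Tiny corpus: 3 documents over a 4-word vocabulary.
docs = [["graph", "network"], ["text", "graph"], ["network", "text", "model"]]
vocab = sorted({w for d in docs for w in d})
n_docs, n_words = len(docs), len(vocab)
n = n_docs + n_words  # one node per document and one per word

# Heterogeneous adjacency: document-word edges (term counts here; Text GCN
# uses TF-IDF) plus word-word edges from co-occurrence (Text GCN uses PMI).
A = np.zeros((n, n))
for di, d in enumerate(docs):
    for w in d:
        wi = n_docs + vocab.index(w)
        A[di, wi] += 1
        A[wi, di] += 1
    for w1 in d:  # word-word co-occurrence within the same document
        for w2 in d:
            if w1 != w2:
                A[n_docs + vocab.index(w1), n_docs + vocab.index(w2)] += 1
np.fill_diagonal(A, 1)  # self-loops, as in standard GCN preprocessing

print(A.shape)
```

A two-layer GCN over this single symmetric graph then classifies the document nodes, with word nodes acting as intermediaries that tie documents sharing vocabulary together.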