N-ary Relation Extraction using Graph-State LSTM

@inproceedings{Song2018NaryRE,
  title={N-ary Relation Extraction using Graph-State LSTM},
  author={Linfeng Song and Yue Zhang and Zhiguo Wang and Daniel Gildea},
  booktitle={Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
  year={2018}
}
Cross-sentence n-ary relation extraction detects relations among n entities across multiple sentences. Typical methods formulate an input as a document graph, integrating various intra-sentential and inter-sentential dependencies. The current state-of-the-art method splits the input graph into two DAGs, adopting a DAG-structured LSTM for each. Though this approach can model rich linguistic knowledge by leveraging graph edges, important information can be lost in the splitting procedure. We propose…
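The graph-state idea contrasted with DAG splitting above can be sketched as a parallel per-node LSTM update over the intact document graph. The following is a minimal NumPy illustration, not the authors' implementation; the single shared gate parameterization and mean-pooled neighbour messages are simplifying assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def graph_state_lstm_step(h, c, x, edges, W, U, b):
    """One message-passing step of a simplified graph-state LSTM.

    h, c  : (n_nodes, d) per-node hidden and cell states
    x     : (n_nodes, d) input word embeddings
    edges : list of (src, dst) pairs from the document graph
    W, U  : dicts mapping gate name ("i","f","o","u") -> (d, d) matrix
    b     : dict mapping gate name -> (d,) bias
    """
    n, d = h.shape
    # Aggregate incoming neighbour hidden states along graph edges.
    m = np.zeros((n, d))
    deg = np.zeros((n, 1))
    for s, t in edges:
        m[t] += h[s]
        deg[t] += 1.0
    m /= np.maximum(deg, 1.0)  # mean of neighbour states (0 if isolated)

    # Standard LSTM gates, computed for every node in parallel.
    gates = {g: x @ W[g] + m @ U[g] + b[g] for g in ("i", "f", "o", "u")}
    i, f, o = sigmoid(gates["i"]), sigmoid(gates["f"]), sigmoid(gates["o"])
    u = np.tanh(gates["u"])
    c_new = f * c + i * u          # per-node cell update
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

All node states update simultaneously; repeating the step k times lets each node incorporate k-hop context, without ever splitting the graph into DAGs.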

Citations

Inter-sentence Relation Extraction with Document-level Graph Convolutional Neural Network
TLDR
A novel inter-sentence relation extraction model that builds a labelled edge graph convolutional neural network model on a document-level graph using multi-instance learning with bi-affine pairwise scoring to predict the relation of an entity pair.
An External Knowledge Enhanced Graph-based Neural Network for Sentence Ordering
TLDR
A novel and flexible external knowledge enhanced graph-based neural network for sentence ordering, where various kinds of relations are exploited to make the graph representation more expressive and less noisy.
Document-level Relation Extraction with Dual-tier Heterogeneous Graph
TLDR
A novel graph-based model with Dual-tier Heterogeneous Graph (DHG) for document-level RE composed of a structure modeling layer followed by a relation reasoning layer, capable of not only capturing both the sequential and structural information of documents but also mixing them together to benefit for multi-hop reasoning and final decision-making.
Double Graph Based Reasoning for Document-level Relation Extraction
TLDR
This paper proposes Graph Aggregation-and-Inference Network (GAIN) featuring double graphs, based on which GAIN first constructs a heterogeneous mention-level graph (hMG) to model complex interaction among different mentions across the document and proposes a novel path reasoning mechanism to infer relations between entities.
Modeling Multi-Granularity Hierarchical Features for Relation Extraction
TLDR
Experimental results show that the method significantly outperforms existing state-of-the-art models that even use external knowledge, and it is shown that effective structured features can be attained even without external knowledge.
BERT-GT: Cross-sentence n-ary relation extraction with BERT and Graph Transformer
MOTIVATION: A biomedical relation statement is commonly expressed in multiple sentences and consists of many concepts, including gene, disease, chemical, and mutation. To automatically extract…
Edge Features Enhanced Graph Attention Network for Relation Extraction
TLDR
This work proposes an extension of the graph attention network for relation extraction task, which makes use of the whole dependency tree and its edge features and can be viewed as a soft-pruning approach strategy that automatically learns the relationship between different nodes in the full dependency tree.
Graph Convolution over Multiple Dependency Sub-graphs for Relation Extraction
TLDR
A contextualised graph convolution network over multiple dependency-based sub-graphs for relation extraction achieves superior performance over the existing GCN-based models achieving state-of-the-art performance on cross-sentence n-ary relation extraction datasets and SemEval 2010 Task 8 sentence-level relation extraction dataset.
Cross-Sentence N-ary Relation Extraction using Lower-Arity Universal Schemas
TLDR
This paper proposes a novel approach to cross-sentence n-ary relation extraction based on universal schemas, learning relation representations of the lower-arity facts that result from decomposing higher-arity facts.
...

References (showing 1–10 of 33)

Cross-Sentence N-ary Relation Extraction with Graph LSTMs
TLDR
A general relation extraction framework based on graph long short-term memory networks (graph LSTMs) that can be easily extended to cross-sentence n-ary relation extraction is explored, demonstrating its effectiveness with both conventional supervised learning and distant supervision.
End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures
TLDR
A novel end-to-end neural model to extract entities and the relations between them is presented, which compares favorably to the state-of-the-art CNN-based model (in F1-score) on nominal relation classification (SemEval-2010 Task 8).
Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling
TLDR
A version of graph convolutional networks (GCNs), a recent class of neural networks operating on graphs, suited to modeling syntactic dependency graphs, is proposed; the authors observe that GCN layers are complementary to LSTM ones.
A Graph-to-Sequence Model for AMR-to-Text Generation
TLDR
This work introduces a neural graph-to-sequence model, using a novel LSTM structure for directly encoding graph-level semantics, and shows superior results to existing methods in the literature.
Incremental Joint Extraction of Entity Mentions and Relations
TLDR
An incremental joint framework to simultaneously extract entity mentions and relations using a structured perceptron with efficient beam search is presented; it significantly outperforms a strong pipelined baseline and attains better performance than the best-reported end-to-end system.
Distant Supervision for Relation Extraction beyond the Sentence Boundary
TLDR
This paper proposes the first approach for applying distant supervision to cross-sentence relation extraction with a graph representation that can incorporate both standard dependencies and discourse relations, thus providing a unifying way to model relations within and across sentences.
Exploiting Rich Syntactic Information for Semantic Parsing with Graph-to-Sequence Model
TLDR
This paper first proposes to use a syntactic graph to represent three types of syntactic information, i.e., word order, dependency, and constituency features; it then employs a graph-to-sequence model to encode the syntactic graph and decode a logical form.
Sentence-State LSTM for Text Representation
TLDR
This work investigates an alternative LSTM structure for encoding text, which consists of a parallel state for each word, and shows that the proposed model has strong representation power, giving highly competitive performances compared to stacked BiLSTM models with similar parameter numbers.
Improved Relation Extraction with Feature-Rich Compositional Embedding Models
TLDR
A Feature-rich Compositional Embedding Model (FCM) for relation extraction that is expressive, generalizes to new domains, and is easy to implement, outperforming both previous compositional models and traditional feature-rich models.
Extracting Relations Within and Across Sentences
TLDR
It was found that the structured features used for intra-sentential relation extraction can be easily adapted for the inter-sentential case and provide comparable performance.
...