Corpus ID: 29159360

Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions

@inproceedings{Ji2017DistantSF,
  title={Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions},
  author={Guoliang Ji and Kang Liu and Shizhu He and Jun Zhao},
  booktitle={AAAI},
  year={2017}
}
Distant supervision for relation extraction is an efficient method for scaling relation extraction to very large corpora containing thousands of relations. [...] Key Method The background knowledge not only provides more information for predicting relations, but also yields better entity representations for the attention module. We conduct three experiments on a widely used dataset, and the experimental results show that our approach significantly outperforms all the baseline systems.
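As a rough illustration of the key method summarized above, the following Python sketch (using PyTorch) shows sentence-level attention over a bag of sentences for one entity pair, plus an auxiliary term that pulls entity embeddings toward encodings of their descriptions. The query form e1 - e2, the squared-error description loss, and all tensor shapes are simplifying assumptions, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def bag_representation(sentence_reps, e1, e2):
    # Sentence-level attention: score each encoded sentence against a query
    # built from the entity pair (here simply e1 - e2, an assumption), then
    # return the attention-weighted sum as the bag representation.
    query = e1 - e2
    alpha = F.softmax(sentence_reps @ query, dim=0)
    return alpha @ sentence_reps

def description_loss(entity_emb, desc_encoding):
    # Auxiliary loss pulling an entity embedding toward the encoding of its
    # textual description, so background knowledge shapes the representations
    # that the attention module relies on.
    return torch.sum((entity_emb - desc_encoding) ** 2)

# Toy usage with random vectors standing in for real sentence/description encoders.
sents = torch.randn(4, 50)                     # 4 sentences mentioning the same pair
e1, e2 = torch.randn(50), torch.randn(50)      # entity embeddings
d1, d2 = torch.randn(50), torch.randn(50)      # description encodings
bag = bag_representation(sents, e1, e2)
aux = description_loss(e1, d1) + description_loss(e2, d2)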
Distant Supervision for Relation Extraction with Hierarchical Attention and Entity Descriptions
TLDR
A novel hierarchical attention model is proposed to select valid instances and capture vital semantic information in them to combat the noise introduced by distant supervision and adequately extract latent and helpful background information.
Distant supervision for relation extraction with hierarchical selective attention
TLDR
This work proposes a novel hierarchical selective attention network for relation extraction under distant supervision, which first selects the most relevant sentences by applying coarse sentence-level attention to all sentences of an entity pair, and then employs word-level attention to construct sentence representations and fine sentence-level attention to aggregate these sentence representations.
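A minimal sketch of how the word-level and sentence-level attention steps described in this entry might compose, assuming PyTorch, generic dot-product attention, and a learned relation embedding as the query; the coarse sentence pre-selection stage is omitted and all shapes are illustrative assumptions.

import torch
import torch.nn.functional as F

def attend(items, query):
    # Generic dot-product attention: softmax-weighted sum of `items`
    # (shape: n x dim) scored against `query` (shape: dim).
    return F.softmax(items @ query, dim=0) @ items

def hierarchical_bag_representation(sentences, relation_query):
    # Word-level attention builds one vector per sentence, then sentence-level
    # attention aggregates those vectors into a single bag representation.
    sent_reps = torch.stack([attend(words, relation_query) for words in sentences])
    return attend(sent_reps, relation_query)

# Toy usage: a bag of three sentences of different lengths, 50-dim embeddings.
bag = [torch.randn(n, 50) for n in (7, 12, 9)]
rep = hierarchical_bag_representation(bag, torch.randn(50))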
Knowledge-Aware and Retrieval-Based Models for Distantly Supervised Relation Extraction
TLDR
A knowledge-aware attention model is proposed, which can leverage the semantic knowledge in the KB to select the valid sentences, and distantly supervised RE is formalized as relation retrieval instead of relation classification to leverage the semantic knowledge further.
Global Relation Embedding for Relation Extraction
TLDR
It is shown that the learned textual relation embedding can be used to augment existing relation extraction models and significantly improve their performance, most remarkably for the top 1,000 relational facts discovered by the best existing model.
Distant supervision for neural relation extraction integrated with word attention and property features
TLDR
A word-level attention mechanism is developed to distinguish the importance of each individual word in a sentence, increasing the attention weights of critical words, and experimental results show that the model outperforms previous state-of-the-art baselines.
Relation Extraction Using Supervision from Topic Knowledge of Relation Labels
TLDR
This paper mines the topic knowledge of a relation to explicitly represent the semantics of that relation, and proposes a deep matching network to precisely model the semantic similarity between a sentence and a relation.
Improving Distantly-Supervised Relation Extraction with Joint Label Embedding
TLDR
A novel multi-layer attention-based model to improve relation extraction with joint label embedding that makes full use of both structural information from Knowledge Graphs and textual information from entity descriptions to learn label embeddings through gating integration while avoiding the imposed noise with an attention mechanism.
Distant Supervision for Relation Extraction via Retrieval-based Neural Networks
TLDR
A retrieval-based method based on piecewise convolutional neural networks (called RPCNN) is proposed for DS relation extraction, which calculates relation retrieval scores between input features and relation features before classification.
A Hybrid Graph Model for Distant Supervision Relation Extraction
TLDR
This work proposes a novel hybrid graph model, which can incorporate heterogeneous background information in a unified framework, such as entity types and human-constructed triples, which outperforms the state-of-the-art methods significantly in various evaluation metrics.
Distant Supervision for Relation Extraction with Neural Instance Selector
TLDR
A method called Neural Instance Selector (NIS) is proposed to solve the problems of distant supervision and noisy labels and can effectively filter noisy data and achieve better performance than several baseline methods.

References

Showing 1-10 of 28 references
Neural Relation Extraction with Selective Attention over Instances
TLDR
A sentence-level attention-based model for relation extraction that employs convolutional neural networks to embed the semantics of sentences and dynamically reduce the weights of those noisy instances.
End-to-End Relation Extraction Using Distant Supervision from External Semantic Repositories
In this paper, we extend distant supervision (DS) based on Wikipedia for Relation Extraction (RE) by considering (i) relations defined in external repositories, e.g. YAGO, and (ii) any subset of
Modeling Relations and Their Mentions without Labeled Text
TLDR
A novel approach to distant supervision that can alleviate the problem of noisy patterns that hurt precision by using a factor graph and applying constraint-driven semi-supervision to train this model without any knowledge about which sentences express the relations in the authors' training KB.
Multi-instance Multi-label Learning for Relation Extraction
TLDR
This work proposes a novel approach to multi-instance multi-label learning for RE, which jointly models all the instances of a pair of entities in text and all their labels using a graphical model with latent variables that performs competitively on two difficult domains.
Distant supervision for relation extraction without labeled data
TLDR
This work investigates an alternative paradigm that does not require labeled corpora, avoiding the domain dependence of ACE-style algorithms, and allowing the use of corpora of any size.
Knowledge-Based Weak Supervision for Information Extraction of Overlapping Relations
TLDR
A novel approach for multi-instance learning with overlapping relations that combines a sentence-level extraction model with a simple, corpus-level component for aggregating the individual facts is presented.
Connecting Language and Knowledge Bases with Embedding Models for Relation Extraction
This paper proposes a novel approach for relation extraction from free text which is trained to jointly use information from the text and from existing knowledge. Our model is based on scoring
Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks
TLDR
This paper proposes a novel model dubbed Piecewise Convolutional Neural Networks (PCNNs) with multi-instance learning to address the wrong label problem when using distant supervision for relation extraction, and adopts a convolutional architecture with piecewise max pooling to automatically learn relevant features.
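A small PyTorch sketch of the piecewise max pooling mentioned in this entry, assuming the convolution output is split into three segments at the two entity positions; position embeddings, the convolution itself, and the multi-instance learning objective are omitted, and all shapes are illustrative assumptions.

import torch

def piecewise_max_pool(conv_out, head_pos, tail_pos):
    # Split the convolution feature maps (seq_len x num_filters) into three
    # segments delimited by the two entity positions, max-pool each segment,
    # and concatenate the results into one (3 * num_filters) vector.
    # Assumes 0 <= head_pos < tail_pos < seq_len - 1 so no segment is empty.
    segments = [conv_out[:head_pos + 1],
                conv_out[head_pos + 1:tail_pos + 1],
                conv_out[tail_pos + 1:]]
    pooled = [segment.max(dim=0).values for segment in segments]
    return torch.cat(pooled)

# Toy usage: 12 tokens, 8 convolution filters, entities at positions 2 and 7.
features = torch.randn(12, 8)
vec = piecewise_max_pool(features, head_pos=2, tail_pos=7)   # shape: (24,)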
Exploring Various Knowledge in Relation Extraction
TLDR
This paper investigates the incorporation of diverse lexical, syntactic and semantic knowledge in feature-based relation extraction using SVM, and illustrates that base phrase chunking information is very effective for relation extraction and contributes most of the performance improvement from the syntactic aspect, while additional information from full parsing gives limited further enhancement.
A Review of Relation Extraction
TLDR
A comprehensive review of various aspects of the entity relation extraction task is presented, including some of the most important supervised and semi-supervised classification approaches to the relation extraction task.