Combining Distant and Direct Supervision for Neural Relation Extraction

@inproceedings{Beltagy2019CombiningDA,
  title={Combining Distant and Direct Supervision for Neural Relation Extraction},
  author={Iz Beltagy and Kyle Lo and Waleed Ammar},
  booktitle={NAACL},
  year={2019}
}
In relation extraction with distant supervision, noisy labels make it difficult to train quality models. [...] We improve such models by combining the distant supervision data with additional directly supervised data, which we use as supervision for the attention weights. We find that joint training on both types of supervision leads to a better model because it improves the model's ability to identify noisy sentences. In addition, we find that sigmoidal attention weights with max pooling achieves…
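
The key method (per-sentence sigmoidal attention weights combined with max pooling, with direct supervision applied to the weights themselves) can be sketched roughly as follows. This is a minimal PyTorch illustration under assumed dimensions and loss weighting, not the authors' implementation; `BagModel` and all names in it are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BagModel(nn.Module):
    """Sketch of bag-level relation extraction with sigmoidal attention.

    Each entity-pair "bag" holds several sentence encodings; a sigmoid
    (not a softmax) scores each sentence independently, and max pooling
    aggregates the weighted encodings into a bag representation.
    """

    def __init__(self, enc_dim: int, num_relations: int):
        super().__init__()
        self.att_scorer = nn.Linear(enc_dim, 1)        # per-sentence score in [0, 1]
        self.classifier = nn.Linear(enc_dim, num_relations)

    def forward(self, sent_encodings: torch.Tensor):
        # sent_encodings: (num_sents_in_bag, enc_dim), already encoded
        att = torch.sigmoid(self.att_scorer(sent_encodings))   # (n, 1)
        weighted = att * sent_encodings                        # down-weight noisy sentences
        bag_repr, _ = weighted.max(dim=0)                      # max pool over the bag
        return self.classifier(bag_repr), att.squeeze(-1)

# Joint loss: a distant-supervision label on the bag, plus direct supervision
# on the attention weights where sentence-level labels exist (hypothetical data).
model = BagModel(enc_dim=64, num_relations=5)
bag = torch.randn(3, 64)                 # 3 sentences for one entity pair
logits, att = model(bag)
bag_label = torch.tensor(2)              # distant-supervision relation label
sent_labels = torch.tensor([1., 0., 1.]) # 1 = sentence truly expresses the relation
loss = F.cross_entropy(logits.unsqueeze(0), bag_label.unsqueeze(0)) \
     + F.binary_cross_entropy(att, sent_labels)   # supervise the attention directly
loss.backward()
```

One appeal of sigmoidal weights over softmax is that each sentence is scored independently in [0, 1], so a bag where every sentence is noisy can receive uniformly low weights, something softmax weights (which must sum to one) cannot express.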

Citations

Dual Supervision Framework for Relation Extraction with Distant Supervision and Human Annotation
This work proposes a dual supervision framework that effectively utilizes both types of data, employing two separate prediction networks, HA-Net and DS-Net, to predict the labels given by human annotation and distant supervision respectively, which prevents incorrect distant-supervision labels from degrading accuracy.
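
A rough sketch of that separation, assuming a shared encoder with two classification heads (one per label source); the architecture details here are illustrative guesses, not the paper's actual HA-Net/DS-Net design:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualHeadRE(nn.Module):
    """Shared encoder with two label-source-specific heads (illustrative)."""

    def __init__(self, in_dim: int, num_relations: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU())
        self.ha_net = nn.Linear(128, num_relations)  # human-annotation head
        self.ds_net = nn.Linear(128, num_relations)  # distant-supervision head

    def forward(self, x: torch.Tensor, source: str) -> torch.Tensor:
        h = self.encoder(x)
        return self.ha_net(h) if source == "ha" else self.ds_net(h)

model = DualHeadRE(in_dim=64, num_relations=5)
x_ds, y_ds = torch.randn(8, 64), torch.randint(0, 5, (8,))  # noisy DS batch
x_ha, y_ha = torch.randn(2, 64), torch.randint(0, 5, (2,))  # clean HA batch
loss = F.cross_entropy(model(x_ds, "ds"), y_ds) + F.cross_entropy(model(x_ha, "ha"), y_ha)
loss.backward()  # DS label noise only reaches ds_net's parameters directly
```

Routing each batch to the head matching its label source keeps distant-supervision noise out of the human-annotation head's decision boundary while the shared encoder still benefits from both data sources.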
Improving Distant Supervised Relation Extraction via Jointly Training on Instances
In this paper, we present the winning model for the Bag Track in Inter-Personal Relation Extraction. We incorporate BERT, a large pre-trained language model, with multi-instance learning for bag-level…
Relation Extraction with Explanation
This work annotates a test set with ground-truth sentence-level explanations to evaluate the quality of the explanations afforded by relation extraction models, and demonstrates that replacing the entity mentions in sentences with their fine-grained entity types not only enhances extraction accuracy but also improves explanation quality.
Self-Attention Enhanced Selective Gate with Entity-Aware Embedding for Distantly Supervised Relation Extraction
A new lightweight neural framework is proposed to address distantly supervised relation extraction and alleviate defects of the earlier selective attention framework, achieving new state-of-the-art performance on both AUC and top-n precision metrics.
Improving Relation Extraction with Relational Paraphrase Sentences
This work proposes a novel model that learns diverse relation expressions by enriching the training data with relational paraphrase sentences in a joint learning framework, and shows that the approach improves relation extraction performance even against a strong baseline.
Improving Distant Supervised Relation Extraction with Noise Detection Strategy
Distantly supervised relation extraction (DSRE) is widely used to extract novel relational facts from plain text, so as to improve the knowledge graph. However, distant supervision inevitably suffers…
SENT: Sentence-level Distant Relation Extraction via Negative Training
This work proposes negative training (NT), in which a model is trained with complementary labels indicating that "the instance does not belong to this label"; since the probability of selecting a true label as a complementary label is low, this supervision is less noisy.
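
A minimal sketch of negative training with uniformly sampled complementary labels; the sampling scheme and model setup below are assumptions for illustration:

```python
import torch
import torch.nn.functional as F

def negative_training_loss(logits: torch.Tensor, observed: torch.Tensor) -> torch.Tensor:
    """NT loss: -log(1 - p[complementary]) for a randomly sampled complementary label.

    Because a random label other than the observed one is rarely the true
    label, "does not belong to this label" is usually correct even when the
    observed (distant-supervision) label is wrong.
    """
    num_classes = logits.size(-1)
    # Sample a complementary label uniformly from the labels != observed.
    offset = torch.randint(1, num_classes, observed.shape)
    complementary = (observed + offset) % num_classes
    p = F.softmax(logits, dim=-1)
    p_comp = p.gather(-1, complementary.unsqueeze(-1)).squeeze(-1)
    return -torch.log(1.0 - p_comp + 1e-12).mean()

logits = torch.randn(4, 5, requires_grad=True)   # batch of 4, 5 relation classes
observed = torch.tensor([2, 0, 4, 1])            # noisy distant-supervision labels
negative_training_loss(logits, observed).backward()
```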
Improving Reinforcement Learning for Neural Relation Extraction with Hierarchical Memory Extractor
This paper designs a novel reward function that obtains feedback from both correct and noisy data, and proposes the hierarchical memory extractor (HME), which uses a gating mechanism to share semantics from correlated instances between data-rich and data-poor classes.
Feature-Level Attention Based Sentence Encoding for Neural Relation Extraction
This paper proposes a feature-level attention model for encoding sentences that aims to reveal the differing effects of features on relation prediction, and demonstrates that scaled dot-product attention outperforms the alternatives.
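
For reference, scaled dot-product attention computes softmax(QK^T / sqrt(d)) V; a generic sketch applied over word-level features (the paper's exact encoder is not reproduced here):

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    """Attend over word features: softmax(QK^T / sqrt(d)) V."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d)  # scaling keeps softmax gradients healthy
    return F.softmax(scores, dim=-1) @ v

words = torch.randn(12, 32)   # 12 word features of dimension 32
out = scaled_dot_product_attention(words, words, words)  # self-attention over the sentence
print(out.shape)              # torch.Size([12, 32])
```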
RDSGAN: Rank-based Distant Supervision Relation Extraction with Generative Adversarial Framework
This framework combines soft attention and hard decisions to learn the distribution of true positive instances via adversarial training, and selects valid instances conforming to that distribution via rank-based distant supervision, addressing the false-positive problem.

References

Showing 1-10 of 36 references
Neural Relation Extraction with Selective Attention over Instances
A sentence-level attention-based model for relation extraction that employs convolutional neural networks to embed the semantics of sentences and dynamically reduces the weights of noisy instances.
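
Selective attention scores each sentence in a bag against a relation-specific query vector and aggregates with a softmax-weighted average, so likely-noisy sentences get small weights. A minimal sketch under assumed dimensions, not the paper's code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectiveAttention(nn.Module):
    """Softmax attention over the sentences of a bag (illustrative)."""

    def __init__(self, enc_dim: int, num_relations: int):
        super().__init__()
        self.rel_queries = nn.Embedding(num_relations, enc_dim)  # one query per relation
        self.classifier = nn.Linear(enc_dim, num_relations)

    def forward(self, sents: torch.Tensor, rel: torch.Tensor) -> torch.Tensor:
        # sents: (n, enc_dim) CNN sentence encodings; rel: scalar relation id
        scores = sents @ self.rel_queries(rel)        # (n,) relevance to the relation
        alpha = F.softmax(scores, dim=0)              # weights sum to 1 over the bag
        bag = alpha @ sents                           # weighted average, noise down-weighted
        return self.classifier(bag)

model = SelectiveAttention(enc_dim=64, num_relations=5)
logits = model(torch.randn(3, 64), torch.tensor(2))   # a 3-sentence bag
```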
Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks
This paper proposes a novel model, Piecewise Convolutional Neural Networks (PCNNs) with multi-instance learning, to address the wrong-label problem that arises when using distant supervision for relation extraction, adopting a convolutional architecture with piecewise max pooling to automatically learn relevant features.
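
Piecewise max pooling splits each convolutional feature map into three segments at the two entity positions and max-pools each segment separately, keeping coarse structural information that a single global max would lose. A rough sketch; the CNN encoder itself is omitted and the shapes are assumed:

```python
import torch

def piecewise_max_pool(conv_out: torch.Tensor, e1: int, e2: int) -> torch.Tensor:
    """Max-pool three segments of a conv feature map, split at entity positions.

    conv_out: (seq_len, num_filters); e1 < e2 are entity token positions.
    Returns a (3 * num_filters,) vector: one max per filter per segment.
    """
    segments = [conv_out[: e1 + 1], conv_out[e1 + 1 : e2 + 1], conv_out[e2 + 1 :]]
    return torch.cat([seg.max(dim=0).values for seg in segments])

conv_out = torch.randn(20, 8)                # 20 positions, 8 conv filters
vec = piecewise_max_pool(conv_out, e1=4, e2=12)
print(vec.shape)                             # torch.Size([24])
```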
Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions
This paper proposes a sentence-level attention model that selects valid instances, making full use of the supervision information from knowledge bases, and extracts entity descriptions from Freebase and Wikipedia pages to supply background knowledge for the task.
Combining Distant and Partial Supervision for Relation Extraction
This work presents an approach for providing partial supervision to a distantly supervised relation extractor using a small number of carefully selected examples, and proposes a novel criterion to sample examples that are both uncertain and representative.
Reducing Wrong Labels in Distant Supervision for Relation Extraction
A novel generative model is presented that directly models the heuristic labeling process of distant supervision and predicts, via its hidden variables, whether assigned labels are correct or wrong.
Relation Extraction with Multi-instance Multi-label Convolutional Neural Networks
This paper proposes a multi-instance multi-label convolutional neural network for distantly supervised RE, which relaxes the expressed-at-least-once assumption and employs cross-sentence max pooling to enable information sharing across different sentences.
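
Cross-sentence max pooling takes a per-dimension maximum over all sentence encodings in a bag, so the evidence for a relation can be assembled from different sentences; paired with per-relation sigmoids it also fits the multi-label setting. A minimal sketch with assumed dimensions:

```python
import torch

sents = torch.randn(4, 64)            # 4 sentence encodings for one entity pair
bag, _ = sents.max(dim=0)             # per-dimension max across sentences
# Multi-label scoring: a sigmoid per relation instead of a softmax, so the
# same bag can express several relations at once (illustrative head).
scores = torch.sigmoid(torch.nn.Linear(64, 5)(bag))
print(scores.shape)                   # torch.Size([5])
```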
Neural Relation Extraction via Inner-Sentence Noise Reduction and Transfer Learning
This work proposes a novel word-level distantly supervised approach for relation extraction that is effective, improving the area under the precision/recall (PR) curve from 0.35 to 0.39 over the state-of-the-art work.
Infusion of Labeled Data into Distant Supervision for Relation Extraction
This paper demonstrates how a state-of-the-art multi-instance multi-label model can be modified to make use of reliable sentence-level labels in addition to the relation-level distant supervision from a database.
Learning with Noise: Enhance Distantly Supervised Relation Extraction with Dynamic Transition Matrix
It is shown that the dynamic transition matrix can effectively characterize the noise in training data built by distant supervision, and that it can be trained effectively with a novel curriculum-learning-based method without any direct supervision about the noise.
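
The transition-matrix idea treats the observed noisy label distribution as the model's true-label distribution pushed through a confusion matrix, p_obs = T^T p_true. A toy sketch with a fixed matrix (the paper predicts T dynamically per instance):

```python
import torch
import torch.nn.functional as F

num_rel = 3
true_logits = torch.randn(num_rel, requires_grad=True)   # model's clean prediction
# T[i, j] = probability that true relation i is (mis)labeled as j by distant
# supervision; a fixed toy matrix here, dynamically predicted in the paper.
T = torch.tensor([[0.8, 0.1, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.1, 0.8]])
p_true = F.softmax(true_logits, dim=0)
p_observed = T.t() @ p_true                  # distribution over noisy labels
noisy_label = torch.tensor(1)
loss = -torch.log(p_observed[noisy_label])   # train against the noisy label
loss.backward()                              # gradients still reach true_logits
```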
Jointly Extracting Relations with Class Ties via Effective Deep Ranking
Experiments show that leveraging class ties enhances extraction and demonstrate the model's effectiveness in learning class ties; the model significantly outperforms the baselines, achieving state-of-the-art performance.