Combining Distant and Direct Supervision for Neural Relation Extraction

@inproceedings{Beltagy2019CombiningDA,
  title={Combining Distant and Direct Supervision for Neural Relation Extraction},
  author={Iz Beltagy and Kyle Lo and Waleed Ammar},
  booktitle={NAACL},
  year={2019}
}
In relation extraction with distant supervision, noisy labels make it difficult to train quality models. [...] We improve such models by combining the distant-supervision data with additional directly supervised data, which we use as supervision for the attention weights. We find that joint training on both types of supervision leads to a better model because it improves the model's ability to identify noisy sentences. In addition, we find that sigmoidal attention weights with max pooling achieve [...]
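The abstract's "sigmoidal attention weights with max pooling" can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the function name `bag_representation`, the dot-product scoring against a relation query vector, and the array shapes are all assumptions made for the example. The key property shown is that each sentence in a bag gets an independent weight in [0, 1] (unlike softmax attention, whose weights must sum to 1), so the model can down-weight several noisy sentences at once — or none.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bag_representation(sentence_embs, query):
    """Sigmoidal attention + max pooling over a bag of sentence embeddings.

    sentence_embs: (n, d) array, one row per sentence in the bag.
    query: (d,) relation query vector used to score sentences (assumed scoring scheme).
    """
    scores = sentence_embs @ query            # (n,) unnormalized per-sentence scores
    weights = sigmoid(scores)                 # independent [0, 1] weight per sentence
    weighted = sentence_embs * weights[:, None]  # scale each sentence embedding
    return weighted.max(axis=0)               # max pool across sentences -> (d,)
```

Under this reading, the directly supervised data would provide per-sentence labels for whether a sentence actually expresses the relation, which can serve as binary targets for the sigmoid weights during joint training — consistent with the abstract's claim that the extra supervision helps the model identify noisy sentences.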
Dual Supervision Framework for Relation Extraction with Distant Supervision and Human Annotation
Relation Extraction with Explanation
Self-Attention Enhanced Selective Gate with Entity-Aware Embedding for Distantly Supervised Relation Extraction
Improving Relation Extraction with Relational Paraphrase Sentences
Improving Reinforcement Learning for Neural Relation Extraction with Hierarchical Memory Extractor
