Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction

@inproceedings{Du2018MultiLevelSS,
  title={Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction},
  author={Jinhua Du and Jingguang Han and A. Way and Dadong Wan},
  booktitle={EMNLP},
  year={2018}
}
Attention mechanisms are often used in deep neural networks for distantly supervised relation extraction (DS-RE) to distinguish valid from noisy instances. However, traditional 1-D vector attention models are insufficient for learning the different contexts needed to select valid instances when predicting the relationship for an entity pair. To alleviate this issue, we propose a novel multi-level structured (2-D matrix) self-attention mechanism for DS-RE in a multi-instance learning framework.
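The "2-D matrix" self-attention the abstract contrasts with 1-D vector attention follows the structured self-attentive formulation of Lin et al. (2017), which the paper cites: instead of a single weight vector producing one attention distribution, two learned matrices produce several attention rows ("hops"), each summarizing the input from a different perspective. A minimal NumPy sketch, with all weight shapes and names chosen here for illustration:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structured_self_attention(H, W1, W2):
    """Structured (2-D matrix) self-attention:
        A = softmax(W2 · tanh(W1 · H^T))   -- (r, n) attention matrix
        M = A · H                          -- (r, d) sentence embedding
    H: (n, d) hidden states for n tokens; r rows of A each give a
    separate attention distribution over the tokens."""
    A = softmax(W2 @ np.tanh(W1 @ H.T), axis=-1)  # (r, n)
    M = A @ H                                     # (r, d)
    return M, A

# Toy usage with random weights (dimensions are illustrative).
rng = np.random.default_rng(0)
n, d, da, r = 6, 8, 4, 3          # tokens, hidden dim, attn dim, attn hops
H = rng.normal(size=(n, d))
W1 = rng.normal(size=(da, d))
W2 = rng.normal(size=(r, da))
M, A = structured_self_attention(H, W1, W2)
# Each of the r attention rows is a distribution over the n tokens.
assert np.allclose(A.sum(axis=1), 1.0)
```

The paper applies this idea at multiple levels (word-level within a sentence and sentence-level within an instance bag); the sketch above shows only the core single-level operation.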
