Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction

@article{Du2018MultiLevelSS,
  title={Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction},
  author={Jinhua Du and Jingguang Han and Andy Way and Dadong Wan},
  journal={ArXiv},
  year={2018},
  volume={abs/1809.00699}
}
Attention mechanisms are often used in deep neural networks for distantly supervised relation extraction (DS-RE) to distinguish valid from noisy instances. However, the traditional 1-D vector attention model is insufficient for learning the different contexts needed to select valid instances when predicting the relation for an entity pair. To alleviate this issue, we propose a novel multi-level structured (2-D matrix) self-attention mechanism for DS-RE in a multi-instance learning (MIL) framework…
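
The core idea the abstract describes, replacing a single 1-D attention vector with a 2-D attention matrix whose rows capture different contexts, can be sketched in numpy. This follows the standard structured self-attention formulation (A = softmax(W2 tanh(W1 Hᵀ))); all dimensions and variable names below are illustrative, not the paper's exact configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structured_self_attention(H, W1, W2):
    """2-D (matrix) self-attention: each of the r rows of A is a separate
    attention distribution over the n hidden states in H (shape n x d),
    so the model attends to r different contexts at once."""
    A = softmax(W2 @ np.tanh(W1 @ H.T), axis=-1)  # (r, n)
    M = A @ H                                     # (r, d): r weighted views
    return M, A

rng = np.random.default_rng(0)
n, d, da, r = 6, 8, 5, 3          # timesteps, hidden dim, attn dim, attn rows
H = rng.normal(size=(n, d))
W1 = rng.normal(size=(da, d))
W2 = rng.normal(size=(r, da))
M, A = structured_self_attention(H, W1, W2)
print(M.shape, A.shape)           # (3, 8) (3, 6)
```

Each row of A sums to 1, so M stacks r independently attended summaries of the sentence rather than the single weighted average a 1-D attention vector would give.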

Citations

Self-Attention Enhanced Selective Gate with Entity-Aware Embedding for Distantly Supervised Relation Extraction

A brand-new lightweight neural framework is proposed to address distantly supervised relation extraction and alleviate the defects of the previous selective attention framework; it achieves new state-of-the-art performance on both AUC and top-n precision metrics.

Self-Attention Enhanced CNNs and Collaborative Curriculum Learning for Distantly Supervised Relation Extraction

A novel model is proposed that employs a collaborative curriculum learning framework to reduce the effects of mislabelled data and significantly outperforms baselines including the state-of-the-art in terms of P@N and PR curve metrics, thus evidencing its capability of reducing noisy effects for DSRE.

Entity and Entity Type Enhanced Capsule Network for Distant Supervision Relation Extraction

  • Hongjun Heng, Renjie Li
  • 2021 International Joint Conference on Neural Networks (IJCNN)
  • 2021
A dynamic double multi-head attention to filter out noise, an entity supervisor to enhance potential relations, and a capsule network to solve the multi-label classification problem are proposed; the model achieves state-of-the-art performance on the precision-recall curve.

Incorporating Instance Correlations in Distantly Supervised Relation Extraction

This paper proposes a graph convolution network (GCN) model with an attention mechanism to improve relation extraction and demonstrates that the model significantly outperforms the compared baselines.

Distantly Supervised Relation Extraction using Multi-Layer Revision Network and Confidence-based Multi-Instance Learning

A novel Multi-Layer Revision Network (MLRN) is proposed which alleviates the effects of word-level noise by emphasizing inner-sentence correlations before extracting relevant information within sentences.

KGGCN: Knowledge-Guided Graph Convolutional Networks for Distantly Supervised Relation Extraction

Experimental results on two widely used datasets show that the proposed approach is able to efficiently use the prior knowledge from the external lexical resource and knowledge graph to enhance the performance of distantly supervised relation extraction.

Piecewise convolutional neural networks with position attention and similar bag attention for distant supervision relation extraction

A piecewise convolutional neural network with position attention and similar bag attention is proposed for distant supervision relation extraction; it achieves better relation extraction accuracy than state-of-the-art methods on the evaluated dataset.

ReadsRE: Retrieval-Augmented Distantly Supervised Relation Extraction

This work proposes a new paradigm named retrieval-augmented distantly supervised relation extraction (ReadsRE), which can incorporate large-scale open-domain knowledge (e.g., Wikipedia) into the retrieval step and seamlessly integrates a neural retriever and a relation predictor in an end-to-end framework.
...

References (showing 1-10 of 23)

Neural Relation Extraction with Selective Attention over Instances

A sentence-level attention-based model for relation extraction is proposed that employs convolutional neural networks to embed sentence semantics and dynamically down-weights noisy instances.
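
The selective-attention idea this reference introduces, scoring each sentence in an entity-pair bag against a relation query and taking a weighted average, can be sketched as follows. The bilinear scoring form and the diagonal choice of A are common conventions; the variable names are illustrative.

```python
import numpy as np

def selective_attention(S, A, r_q):
    """Bag-level selective attention: S is (m, d) sentence embeddings for
    one entity-pair bag, A a (d, d) weight matrix, r_q a (d,) relation
    query vector. Noisy sentences receive low weights alpha_i."""
    scores = S @ A @ r_q                 # (m,) per-instance relevance
    e = np.exp(scores - scores.max())
    alpha = e / e.sum()                  # attention over instances
    bag = alpha @ S                      # (d,) bag representation
    return bag, alpha

rng = np.random.default_rng(1)
m, d = 4, 6
S = rng.normal(size=(m, d))
A = np.eye(d)                            # diagonal A, a common simplification
r_q = rng.normal(size=d)
bag, alpha = selective_attention(S, A, r_q)
```

The bag vector, rather than any single sentence, is then fed to the relation classifier, which is what lets the model tolerate mislabelled instances.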

Multi-instance Multi-label Learning for Relation Extraction

This work proposes a novel approach to multi-instance multi-label learning for RE, which jointly models all the instances of a pair of entities in text and all their labels using a graphical model with latent variables that performs competitively on two difficult domains.

Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks

This paper proposes a novel model dubbed the Piecewise Convolutional Neural Networks (PCNNs) with multi-instance learning to address the problem of wrong label problem when using distant supervision for relation extraction and adopts convolutional architecture with piecewise max pooling to automatically learn relevant features.
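
The piecewise max pooling this reference describes splits the convolution feature map at the two entity positions and pools each of the three segments separately. A minimal numpy sketch, with illustrative shapes:

```python
import numpy as np

def piecewise_max_pool(C, p1, p2):
    """C: (n, f) convolution outputs over n token positions; p1 < p2 are
    the two entity positions. Max-pool the three segments (before/within/
    after the entity span) separately and concatenate into a (3*f,) vector,
    preserving coarse positional structure that plain max pooling loses."""
    segments = [C[:p1 + 1], C[p1 + 1:p2 + 1], C[p2 + 1:]]
    return np.concatenate([seg.max(axis=0) for seg in segments])

rng = np.random.default_rng(2)
C = rng.normal(size=(10, 4))     # 10 positions, 4 convolution filters
v = piecewise_max_pool(C, 2, 6)
print(v.shape)                   # (12,)
```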

Relation Classification via Multi-Level Attention CNNs

A novel convolutional neural network architecture is proposed that enables end-to-end learning from task-specific labeled data, forgoing the need for external knowledge such as explicit dependency structures, and outperforms previous state-of-the-art methods.

Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions

This paper proposes a sentence-level attention model to select the valid instances, which makes full use of the supervision information from knowledge bases, and extracts entity descriptions from Freebase and Wikipedia pages to supplement background knowledge for the authors' task.

Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification

The experimental results on the SemEval-2010 relation classification task show that the AttBLSTM method outperforms most of the existing methods, with only word vectors.
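
The word-level attention pooling used in Att-BLSTM-style models reduces the (n, d) matrix of BiLSTM outputs to a single sentence vector via a learned attention vector. A minimal sketch of that pooling step, with illustrative names and dimensions:

```python
import numpy as np

def word_attention_pool(H, w):
    """Attention pooling over BiLSTM outputs: H is (n, d) hidden states,
    w a (d,) learned attention vector. Returns a (d,) sentence
    representation and the (n,) word-level attention weights."""
    M = np.tanh(H)
    scores = M @ w
    e = np.exp(scores - scores.max())
    alpha = e / e.sum()                  # attention over words
    return np.tanh(H.T @ alpha), alpha

rng = np.random.default_rng(3)
H = rng.normal(size=(7, 5))              # 7 words, hidden dim 5
w = rng.normal(size=5)
sent, alpha = word_attention_pool(H, w)
```

Only word vectors are needed as input features, which is the point the summary above makes: the attention layer replaces hand-engineered lexical features.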

Modeling Relations and Their Mentions without Labeled Text

A novel approach to distant supervision is presented that alleviates the precision-hurting problem of noisy patterns by using a factor graph; constraint-driven semi-supervision trains the model without any knowledge of which sentences express the relations in the training KB.

Relation Classification via Convolutional Deep Neural Network

This paper exploits a convolutional deep neural network (DNN) to extract lexical and sentence level features from the output of pre-existing natural language processing systems and significantly outperforms the state-of-the-art methods.

Structured Attention Networks

This work shows that structured attention networks are simple extensions of the basic attention procedure, and that they allow for extending attention beyond the standard soft-selection approach, such as attending to partial segmentations or to subtrees.

Knowledge-Based Weak Supervision for Information Extraction of Overlapping Relations

A novel approach for multi-instance learning with overlapping relations that combines a sentence-level extraction model with a simple, corpus-level component for aggregating the individual facts is presented.