Corpus ID: 53107291

Improving Distant Supervision with Maxpooled Attention and Sentence-Level Supervision

@article{Beltagy2018ImprovingDS,
  title={Improving Distant Supervision with Maxpooled Attention and Sentence-Level Supervision},
  author={Iz Beltagy and Kyle Lo and Waleed Ammar},
  journal={ArXiv},
  year={2018},
  volume={abs/1810.12956}
}
We propose an effective multitask learning setup for reducing distant supervision noise by leveraging sentence-level supervision. We show how sentence-level supervision can be used to improve the encoding of individual sentences, and to learn which input sentences are more likely to express the relationship between a pair of entities. We also introduce a novel neural architecture for collecting signals from multiple input sentences, which combines the benefits of attention and maxpooling. The… 
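The abstract's "maxpooled attention" could be read as gating each sentence encoding with an attention score and then taking an elementwise max across sentences rather than a weighted sum. The sketch below is a hypothetical illustration of that reading, not the authors' exact architecture; the function name, the sigmoid gate, and the query vector are all assumptions.

```python
import numpy as np

def maxpooled_attention(sentence_encodings, query):
    """Hypothetical sketch of combining attention with maxpooling.

    sentence_encodings: (n_sentences, dim) array of encoded sentences
    query: (dim,) relation query vector used to score each sentence

    Each sentence gets a sigmoid gate from its query score (so several
    sentences can contribute at once), each encoding is scaled by its
    gate, and the bag representation is the elementwise max over the
    scaled encodings instead of a weighted sum.
    """
    scores = sentence_encodings @ query               # (n_sentences,)
    gates = 1.0 / (1.0 + np.exp(-scores))             # sigmoid gate per sentence
    gated = sentence_encodings * gates[:, None]       # scale each sentence
    return gated.max(axis=0)                          # elementwise maxpool
```

Unlike a softmax-weighted sum, the elementwise max lets each output dimension come from whichever sentence expresses that feature most strongly.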

Figures and Tables from this paper

Structured Minimally Supervised Learning for Neural Relation Extraction

We present an approach to minimally supervised relation extraction that combines the benefits of learned representations and structured learning, and accurately predicts sentence-level relation

Improving Relation Extraction by Leveraging Knowledge Graph Link Prediction

TLDR: A multi-task learning approach is proposed that improves the performance of RE models by jointly training on RE and KGLP tasks; the generality of the approach is illustrated by applying it to several existing RE models and empirically demonstrating how it helps them achieve consistent performance gains.

Re-TACRED: A New Relation Extraction Dataset

TLDR: The authors' results show significantly higher Fleiss' Kappa than the original dataset annotations, suggesting high annotation reliability; their revised labels yield an average model improvement of 13% F1-score and reveal new error types obscured by the original labels.

Knowledge Graph Enhanced Relation Extraction

TLDR: This paper proposes a multi-task learning framework that enhances RE models by jointly training on both RE and KGLP tasks, and illustrates the generality of this approach by applying it to three existing RE methods, achieving consistent improvements across the benchmark datasets.

Dual Supervision Framework for Relation Extraction with Distant Supervision and Human Annotation

TLDR: This work proposes a dual supervision framework that effectively utilizes both types of data. It employs two separate prediction networks, HA-Net and DS-Net, to predict the labels from human annotation and distant supervision respectively, preventing the degradation of accuracy caused by incorrect distant-supervision labels.

References

SHOWING 1-10 OF 29 REFERENCES

Neural Relation Extraction with Selective Attention over Instances

TLDR: A sentence-level attention-based model for relation extraction is proposed that employs convolutional neural networks to embed the semantics of sentences and dynamically reduces the weights of noisy instances.
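Selective attention over instances can be sketched as a softmax-weighted sum of sentence encodings, where each weight comes from the sentence's match with a relation query; this is a simplified illustration, not the cited paper's full model (the CNN encoder is omitted and the scoring function is an assumption).

```python
import numpy as np

def selective_attention(sentence_encodings, relation_query):
    """Simplified sketch of selective attention over a bag of sentences.

    sentence_encodings: (n_sentences, dim) per-sentence vectors
    relation_query: (dim,) query vector for the target relation

    Each sentence receives a softmax weight from its match score with
    the relation query; the bag representation is the weighted sum, so
    noisy sentences get small weights rather than equal ones.
    """
    scores = sentence_encodings @ relation_query     # (n_sentences,)
    scores -= scores.max()                           # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over the bag
    return weights @ sentence_encodings              # (dim,) bag vector
```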

Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions

TLDR: This paper proposes a sentence-level attention model to select valid instances, making full use of the supervision information from knowledge bases, and extracts entity descriptions from Freebase and Wikipedia pages to supplement background knowledge for the task.

Combining Distant and Partial Supervision for Relation Extraction

TLDR: This work presents an approach for providing partial supervision to a distantly supervised relation extractor using a small number of carefully selected examples, and proposes a novel criterion to sample examples which are both uncertain and representative.

Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks

TLDR: This paper proposes a novel model dubbed Piecewise Convolutional Neural Networks (PCNNs) with multi-instance learning to address the wrong-label problem that arises when using distant supervision for relation extraction, adopting a convolutional architecture with piecewise max pooling to automatically learn relevant features.
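The piecewise max pooling mentioned above splits a sentence's convolution output into three segments at the two entity positions and max-pools each segment separately. A minimal sketch of that pooling step, assuming a precomputed feature map and known entity positions:

```python
import numpy as np

def piecewise_max_pool(feature_map, e1_pos, e2_pos):
    """Sketch of piecewise max pooling (the PCNN pooling step).

    feature_map: (seq_len, n_filters) convolution output for one sentence
    e1_pos, e2_pos: token positions of the two entities (e1_pos < e2_pos)

    The sequence is split into three pieces -- up to the first entity,
    between the entities, and after the second -- and each piece is
    max-pooled separately, preserving coarse positional structure that
    a single global max pool would discard.
    """
    pieces = (feature_map[: e1_pos + 1],
              feature_map[e1_pos + 1 : e2_pos + 1],
              feature_map[e2_pos + 1 :])
    pooled = [p.max(axis=0) for p in pieces if len(p) > 0]
    return np.concatenate(pooled)       # up to (3 * n_filters,)
```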

Relation Extraction with Multi-instance Multi-label Convolutional Neural Networks

TLDR: This paper proposes a multi-instance multi-label convolutional neural network for distantly supervised RE, which first relaxes the expressed-at-least-once assumption, and employs cross-sentence max-pooling so as to enable information sharing across different sentences.
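Cross-sentence max-pooling with multi-label prediction might look like the sketch below: an elementwise max shares features across sentences, and independent per-relation sigmoids let one entity pair hold several relations at once. This is a hypothetical simplification, not the cited model; the linear classifier stand-in for the CNN layers is an assumption.

```python
import numpy as np

def cross_sentence_pool(sentence_features, relation_weights):
    """Sketch of cross-sentence max-pooling for multi-label RE.

    sentence_features: (n_sentences, dim) per-sentence feature vectors
    relation_weights: (n_relations, dim) one classifier row per relation

    An elementwise max across sentences lets each feature come from
    whichever sentence expresses it most strongly; per-relation sigmoids
    (instead of one softmax) allow multiple relations per entity pair.
    """
    bag = sentence_features.max(axis=0)       # (dim,) shared features
    logits = relation_weights @ bag           # (n_relations,)
    return 1.0 / (1.0 + np.exp(-logits))      # independent probabilities
```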

Joint Distant and Direct Supervision for Relation Extraction

TLDR: This paper proposes a joint model combining Web data derived with DS and manually annotated data from ACE, showing an improvement over the previous state of the art on ACE and rather good accuracy in extracting 52 types of relations from Web data, which suggests the applicability of DS to general RE.

Reducing Wrong Labels in Distant Supervision for Relation Extraction

TLDR: A novel generative model is presented that directly models the heuristic labeling process of distant supervision and predicts whether assigned labels are correct or wrong via its hidden variables.

A survey of noise reduction methods for distant supervision

We survey recent approaches to noise reduction in distant supervision learning for relation extraction. We group them according to the principles they are based on: at-least-one constraints,

Deep Residual Learning for Weakly-Supervised Relation Extraction

TLDR: Contrary to the popular belief that ResNet only works well for very deep networks, it is found that even with 9 layers of CNNs, using identity mapping can significantly improve performance for distantly supervised relation extraction.

Modeling Relations and Their Mentions without Labeled Text

TLDR: A novel approach to distant supervision that alleviates the precision-hurting problem of noisy patterns by using a factor graph and applying constraint-driven semi-supervision to train the model without any knowledge of which sentences express the relations in the training KB.