Neural Relation Extraction via Inner-Sentence Noise Reduction and Transfer Learning

@inproceedings{Liu2018NeuralRE,
  title={Neural Relation Extraction via Inner-Sentence Noise Reduction and Transfer Learning},
  author={Tianyi Liu and Xinsong Zhang and Wanhao Zhou and Weijia Jia},
  booktitle={EMNLP},
  year={2018}
}
Extracting relations is critical for knowledge base completion and construction, where distantly supervised methods are widely used to extract relational facts automatically from existing knowledge bases. However, the automatically constructed datasets contain many low-quality sentences with noisy words, a problem neglected by current distantly supervised methods that results in unacceptable precision. To mitigate this problem, we propose a novel word-level distant supervised…
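The noise problem the abstract describes stems from how distant supervision labels data. A minimal sketch (illustrative, not the paper's implementation; the KB fact and sentences are toy values): every sentence that mentions both entities of a knowledge-base fact is labeled with that fact's relation, which inevitably yields noisy training sentences.

```python
# Hypothetical knowledge base: one fact (head, tail) -> relation.
kb = {("Obama", "Honolulu"): "born_in"}

sentences = [
    "Obama was born in Honolulu .",          # correct supervision
    "Obama visited Honolulu last summer .",  # noisy: sentence does not express born_in
]

def distant_label(sentence, kb):
    """Label a sentence with every KB relation whose entity pair it mentions."""
    labels = []
    for (head, tail), rel in kb.items():
        if head in sentence and tail in sentence:
            labels.append((head, tail, rel))
    return labels

for s in sentences:
    print(s, "->", distant_label(s, kb))
```

Both sentences receive the `born_in` label, but only the first actually expresses it; the second is exactly the kind of noisy instance the surveyed denoising methods try to down-weight.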
Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
TLDR: A pre-trained language model, the OpenAI Generative Pre-trained Transformer (GPT), is extended to the distantly supervised setting and is shown to predict a larger set of distinct relation types with high confidence.
Attention-Based Gated Convolutional Neural Networks for Distant Supervised Relation Extraction
TLDR: Attention-based gated piecewise convolutional neural networks (AGPCNNs) are proposed for distantly supervised relation extraction, which can effectively reduce word-level noise by selecting inner-sentence features.
A Hybrid Model with Pre-trained Entity-Aware Transformer for Relation Extraction
TLDR: A novel hybrid model is proposed that combines a Piecewise Convolutional Neural Network (PCNN) and an Entity-Aware Transformer to extract local features and learn dependencies between distant positions jointly.
Improving Neural Relation Extraction with Implicit Mutual Relations
TLDR: The proposed neural RE framework provides a promising improvement for the RE task and significantly outperforms state-of-the-art methods; the component for mining implicit mutual relations is flexible enough to significantly improve the performance of both CNN-based and RNN-based RE models.
Enhanced Representations for Relations by Multitask Learning
A relation describes the relationship between a pair of entities. Relation extraction is the process of extracting relations from free text and converting them to a structured, machine-readable…
Beyond Word Attention: Using Segment Attention in Neural Relation Extraction
TLDR: Experimental results show that the method can attend to continuous relational expressions without explicit annotations and achieves state-of-the-art performance on the large-scale TACRED dataset.
Improving Distantly-Supervised Relation Extraction Through BERT-Based Label and Instance Embeddings
TLDR: RedSandT (Relation Extraction with Distant Supervision and Transformers) is proposed, a novel distantly supervised transformer-based RE method that captures a wider set of relations through highly informative instance and label embeddings, exploiting BERT's pre-trained model and the relationship between labels and entities, respectively.
Improving Distant Supervised Relation Extraction by Dynamic Neural Network
TLDR: The proposed Dynamic Neural Network for Relation Extraction adopts a novel dynamic parameter generator that generates network parameters according to the query entity types and relation classes, and can simultaneously handle the style-shift problem and improve prediction accuracy for long-tail classes.
Deep Ranking Based Cost-sensitive Multi-label Learning for Distant Supervision Relation Extraction
TLDR: A general ranking-based multi-label learning framework combined with convolutional neural networks, in which ranking-based loss functions with a regularization technique are introduced to learn latent connections between relations.
Towards Accurate and Consistent Evaluation: A Dataset for Distantly-Supervised Relation Extraction
TLDR: A new dataset, NYTH, is built, in which the DS-generated data is used as training data and annotators are hired to label test data; the experimental results show that human-annotated data is necessary for evaluating distantly supervised relation extraction.

References

Showing 1–10 of 28 references
Neural Relation Extraction with Selective Attention over Instances
TLDR: A sentence-level attention-based model for relation extraction that employs convolutional neural networks to embed sentence semantics and dynamically reduces the weights of noisy instances.
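The selective-attention idea above can be sketched in a few lines (a toy illustration under assumed inputs, not the paper's learned model): each sentence in a bag is scored against a relation query vector, the scores are softmax-normalized, and the bag representation is the weighted sum, so noisy sentences get small weights.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    z = sum(exps)
    return [e / z for e in exps]

def bag_representation(sentence_vecs, relation_query):
    """Weighted sum of sentence vectors, weighted by attention to the relation."""
    weights = softmax([dot(s, relation_query) for s in sentence_vecs])
    dim = len(sentence_vecs[0])
    return [sum(w * s[i] for w, s in zip(weights, sentence_vecs))
            for i in range(dim)]

# Two toy sentence vectors; the first matches the relation query far better,
# so the bag representation is dominated by it.
rep = bag_representation([[1.0, 0.0], [0.0, 1.0]], [10.0, 0.0])
print(rep)
```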
Knowledge-Based Weak Supervision for Information Extraction of Overlapping Relations
TLDR: A novel approach for multi-instance learning with overlapping relations is presented, combining a sentence-level extraction model with a simple corpus-level component for aggregating individual facts.
Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions
TLDR: This paper proposes a sentence-level attention model to select valid instances, making full use of supervision information from knowledge bases, and extracts entity descriptions from Freebase and Wikipedia pages to supplement background knowledge for the task.
Modeling Relations and Their Mentions without Labeled Text
TLDR: A novel approach to distant supervision that can alleviate the problem of noisy patterns that hurt precision, using a factor graph and applying constraint-driven semi-supervision to train the model without any knowledge of which sentences express the relations in the training KB.
Distant supervision for relation extraction without labeled data
TLDR: This work investigates an alternative paradigm that does not require labeled corpora, avoiding the domain dependence of ACE-style algorithms and allowing the use of corpora of any size.
Robust Distant Supervision Relation Extraction via Deep Reinforcement Learning
TLDR: A deep reinforcement learning strategy is explored to generate a false-positive indicator, arguing that incorrectly labeled candidate sentences must be treated with a hard decision rather than with soft attention weights.
Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks
TLDR: This paper proposes a novel model, Piecewise Convolutional Neural Networks (PCNNs) with multi-instance learning, to address the wrong-label problem in distant supervision for relation extraction, adopting a convolutional architecture with piecewise max pooling to automatically learn relevant features.
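Piecewise max pooling, the distinguishing step of PCNNs, can be sketched minimally (toy values, one filter; not the paper's implementation): the per-position feature map is split into three segments by the two entity positions and each segment is max-pooled separately, preserving coarse structural information that a single global max pool would discard.

```python
def piecewise_max_pool(feature_map, e1_pos, e2_pos):
    """Max-pool one filter's feature map in three segments split at the entities.

    feature_map: per-token feature values for one convolutional filter.
    e1_pos, e2_pos: token positions of the two entities (e1_pos < e2_pos).
    """
    seg1 = feature_map[: e1_pos + 1]           # before/through entity 1
    seg2 = feature_map[e1_pos + 1 : e2_pos + 1]  # between the entities
    seg3 = feature_map[e2_pos + 1 :]           # after entity 2
    return [max(seg) for seg in (seg1, seg2, seg3) if seg]

# Toy feature map for a 7-token sentence, entities at positions 1 and 4.
print(piecewise_max_pool([0.2, 0.9, 0.1, 0.5, 0.7, 0.3, 0.8], 1, 4))
# -> [0.9, 0.7, 0.8]
```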
A Soft-label Method for Noise-tolerant Distantly Supervised Relation Extraction
TLDR: This work introduces an entity-pair-level denoising method that exploits semantic information from correctly labeled entity pairs to correct wrong labels dynamically during training, and proposes a joint score function that combines relational scores based on the entity-pair representation with the confidence of the hard label to obtain a new label.
Global Distant Supervision for Relation Extraction
TLDR: A global distant supervision model for relation extraction is proposed, which can compensate for the lack of supervision with a wide variety of indirect supervision knowledge and reduce the uncertainty in DS by performing joint inference across relation instances.
Combining Distant and Partial Supervision for Relation Extraction
TLDR: This work presents an approach for providing partial supervision to a distantly supervised relation extractor using a small number of carefully selected examples, and proposes a novel criterion to sample examples that are both uncertain and representative.