Position-aware Attention and Supervised Data Improve Slot Filling

@inproceedings{Zhang2017PositionawareAA,
  title={Position-aware Attention and Supervised Data Improve Slot Filling},
  author={Yuhao Zhang and Victor Zhong and Danqi Chen and Gabor Angeli and Christopher D. Manning},
  booktitle={EMNLP},
  year={2017}
}
Organized relational knowledge in the form of "knowledge graphs" is important for many applications. [...] We first propose an effective new model, which combines an LSTM sequence model with a form of entity position-aware attention that is better suited to relation extraction. Then we build TACRED, a large (119,474 examples) supervised relation extraction dataset, obtained via crowdsourcing and targeted towards TAC KBP relations.
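The position-aware attention summarized above scores each token by combining its LSTM hidden state with embeddings of its relative position to the subject and object entities. A minimal NumPy sketch follows; the shapes, weight names, and function signature are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def position_aware_attention(H, q, subj_pos, obj_pos, P, Wh, Wq, Wp, v):
    """Attend over LSTM hidden states H using a summary vector q and
    position embeddings (rows of P) indexed by each token's relative
    distance to the subject and object entities.

    H: (T, d) hidden states; q: (d,) summary vector;
    subj_pos, obj_pos: (T,) indices into the embedding table P (n_pos, dp).
    """
    scores = np.empty(len(H))
    for i, h in enumerate(H):
        p = np.concatenate([P[subj_pos[i]], P[obj_pos[i]]])  # (2*dp,) positions
        u = np.tanh(Wh @ h + Wq @ q + Wp @ p)                # attention hidden layer
        scores[i] = v @ u                                    # scalar score per token
    a = softmax(scores)   # attention weights over tokens, sum to 1
    return a @ H, a       # sentence representation and the weights
```

The weighted sum `a @ H` would then be fed to a relation classifier; in the paper, positions are bucketed relative distances, which this sketch abstracts as precomputed indices.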
Distilling Knowledge from Well-Informed Soft Labels for Neural Relation Extraction
TLDR: A bipartite graph is first devised to discover type constraints between entities and relations based on the entire corpus and combined with neural networks to achieve a knowledgeable model, which makes it possible to integrate prior knowledge in relation extraction.
Deep learning methods for knowledge base population
TLDR: This thesis explores deep learning methods for automatically populating knowledge bases from text, addressing the following tasks: slot filling, uncertainty detection, and type-aware relation extraction; it is the first to integrate an uncertainty detection component into a slot filling pipeline.
From What to Why: Improving Relation Extraction with Rationale Graph
TLDR: This paper proposes a novel RAtionale Graph (RAG) to organize co-occurrence constraints among entity types, triggers, and relations in a holistic graph view, and introduces two subtasks of entity type prediction and trigger labeling to improve the performance of neural RE models.
Hierarchical Relation Extraction with Coarse-to-Fine Grained Attention
TLDR: The multiple layers of the hierarchical attention scheme provide coarse-to-fine granularity to better identify valid instances, which is especially effective for extracting long-tail relations.
Zero-shot Slot Filling with DPR and RAG
TLDR: Several strategies are described to improve the retriever and the generator of RAG in order to make it a better slot filler, which reached the top-1 position on the KILT leaderboard on both the T-REx and zsRE datasets.
MapRE: An Effective Semantic Mapping Approach for Low-resource Relation Extraction
  • Manqing Dong, Chunguang Pan, Zhipeng Luo
  • Computer Science
  • ArXiv
  • 2021
Neural relation extraction models have shown promising results in recent years; however, model performance drops dramatically given only a few training samples. Recent works try leveraging the [...]
Attention Guided Graph Convolutional Networks for Relation Extraction
TLDR: This work proposes Attention Guided Graph Convolutional Networks, a novel model that directly takes full dependency trees as input and can be understood as a soft-pruning approach that automatically learns to attend selectively to the sub-structures useful for the relation extraction task.
Learning Dual Retrieval Module for Semi-supervised Relation Extraction
TLDR: This paper proposes DualRE, a principled framework that introduces a retrieval module jointly trained with the original relation prediction module, building on the key insight that retrieving sentences expressing a relation is the dual task of predicting the relation label for a given sentence.
Matching the Blanks: Distributional Similarity for Relation Learning
TLDR: This paper builds on extensions of Harris' distributional hypothesis to relations, as well as recent advances in learning text representations (specifically, BERT), to build task-agnostic relation representations solely from entity-linked text.
Stanford at TAC KBP 2017: Building a Trilingual Relational Knowledge Graph
TLDR: This paper describes Stanford's entries in the TAC KBP 2017 Cold Start Knowledge Base Population and Slot Filling challenges, with further improvements to their systems for other languages, including improved named entity recognition, a new neural relation extractor, and better support for nested mentions and discussion forum documents.

References

Showing 1-10 of 38 references
Modeling Relations and Their Mentions without Labeled Text
TLDR: A novel approach to distant supervision that can alleviate the problem of noisy patterns that hurt precision, using a factor graph and applying constraint-driven semi-supervision to train the model without any knowledge about which sentences express the relations in the authors' training KB.
Overview of the TAC 2010 Knowledge Base Population Track
TLDR: An overview of the task definition and annotation challenges associated with KBP 2010, with a discussion of the evaluation results and lessons learned based on detailed analysis.
Multi-instance Multi-label Learning for Relation Extraction
TLDR: This work proposes a novel approach to multi-instance multi-label learning for RE, which jointly models all the instances of a pair of entities in text and all their labels using a graphical model with latent variables, and performs competitively on two difficult domains.
Relation Classification via Multi-Level Attention CNNs
TLDR: A novel convolutional neural network architecture is proposed that enables end-to-end learning from task-specific labeled data, forgoing the need for external knowledge such as explicit dependency structures, and outperforms previous state-of-the-art methods.
Bootstrapped Self Training for Knowledge Base Population
TLDR: This work proposes bootstrapped self-training to capture the benefits of both systems, namely the precision of patterns and the generalizability of trained models, and shows that training on the output of patterns drastically improves performance over the patterns alone.
Combining Distant and Partial Supervision for Relation Extraction
TLDR: This work presents an approach for providing partial supervision to a distantly supervised relation extractor using a small number of carefully selected examples, and proposes a novel criterion to sample examples that are both uncertain and representative.
Stanford at TAC KBP 2016: Sealing Pipeline Leaks and Understanding Chinese
TLDR: An entirely new Chinese entity detection and relation extraction system is described for the new Chinese and cross-lingual relation extraction tracks, consisting of several rule-based relation extractors and a distantly supervised extractor.
Distant supervision for relation extraction without labeled data
TLDR: This work investigates an alternative paradigm that does not require labeled corpora, avoiding the domain dependence of ACE-style algorithms and allowing the use of corpora of any size.
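The distant-supervision paradigm described in the entry above can be illustrated with a minimal sketch: every sentence mentioning both entities of a known KB fact is heuristically labeled with that relation, so no hand-labeled corpus is needed. The KB triples and sentences here are invented examples.

```python
# Minimal distant-supervision labeling sketch: align KB facts with raw text.
KB = {
    ("Barack Obama", "Hawaii"): "born_in",
    ("Steve Jobs", "Apple"): "founder_of",
}

def distant_label(sentences):
    """Label each sentence with every KB relation whose entity pair it mentions."""
    examples = []
    for s in sentences:
        for (e1, e2), rel in KB.items():
            if e1 in s and e2 in s:
                # Heuristic, noisy positive: the sentence may mention both
                # entities without actually expressing the relation.
                examples.append((s, e1, e2, rel))
    return examples
```

That noise is exactly what motivates the multi-instance and factor-graph models listed above, which model which of a pair's mentions actually express the relation.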
Relation Extraction: Perspective from Convolutional Neural Networks
TLDR: This work introduces a convolutional neural network for relation extraction that automatically learns features from sentences and minimizes dependence on external toolkits and resources.
Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification
TLDR: Experimental results on the SemEval-2010 relation classification task show that the AttBLSTM method outperforms most existing methods, using only word vectors.