Modeling Relations and Their Mentions without Labeled Text

@inproceedings{Riedel2010ModelingRA,
  title={Modeling Relations and Their Mentions without Labeled Text},
  author={Sebastian Riedel and Limin Yao and Andrew McCallum},
  booktitle={ECML/PKDD},
  year={2010}
}
Several recent works on relation extraction have been applying the distant supervision paradigm: instead of relying on annotated text to learn how to predict relations, they employ existing knowledge bases (KBs) as a source of supervision. [...] We present a novel approach to distant supervision that can alleviate this problem based on the following two ideas: first, we use a factor graph to explicitly model the decision whether two entities are related, and the decision whether this relation is …
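
To make the two modeling decisions concrete, here is a minimal Python sketch (an illustration under assumed toy data structures, not the authors' implementation) contrasting the naive distant-supervision assumption, in which every co-occurring sentence is treated as a positive example, with the expressed-at-least-once assumption that the paper's factor graph encodes through latent per-mention variables:

# Minimal sketch, not the authors' code: contrasts naive distant supervision
# with the "expressed-at-least-once" assumption modeled via latent mention variables.
# The Mention class and the example sentences are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Mention:
    sentence: str              # sentence containing both entities
    expresses_relation: bool   # latent per-mention decision the factor graph reasons about

def naive_labels(mentions: List[Mention]) -> List[bool]:
    """Naive distant supervision: every co-occurring sentence is a positive example."""
    return [True for _ in mentions]

def at_least_once(mentions: List[Mention]) -> bool:
    """Relaxed assumption: the pair is related iff at least one mention expresses it."""
    return any(m.expresses_relation for m in mentions)

pair = [  # hypothetical mentions of (Barack Obama, United States)
    Mention("Obama flew back to the United States on Tuesday.", False),
    Mention("Obama is the 44th president of the United States.", True),
]
print(naive_labels(pair))   # [True, True] -- the first label is noise
print(at_least_once(pair))  # True -- relation holds without forcing every mention

In the paper these per-mention decisions and the pair-level relation decision are joint variables in a factor graph rather than fixed booleans; the sketch only illustrates why relaxing the assumption avoids forcing noisy sentences to carry the relation label.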
Relation Extraction with Temporal Reasoning Based on Memory Augmented Distant Supervision
A novel neural model is proposed that incorporates both temporal information encoding and sequential reasoning, and it achieves better performance on both the WIKI-TIME dataset and the well-studied NYT-10 dataset.
Indirect Supervision for Relation Extraction using Question-Answer Pairs
A novel framework is proposed that leverages question-answer pairs as an indirect source of supervision for relation extraction, adopting a margin-based QA loss to reduce noise in distant supervision by exploiting semantic evidence from the QA dataset.
Relation Extraction Using Supervision from Topic Knowledge of Relation Labels
This paper mines the topic knowledge of a relation to explicitly represent its semantics, and proposes a deep matching network to precisely model the semantic similarity between a sentence and a relation.
Relation Extraction via Domain-aware Transfer Learning
A novel approach, Relation Extraction via Domain-aware Transfer Learning (ReTrans), is proposed to extract relation mentions from a given text corpus by transferring experience from a large number of existing KBs that may not be closely related to the target relation.
Reducing Wrong Labels in Distant Supervision for Relation Extraction
A novel generative model is presented that directly models the heuristic labeling process of distant supervision and predicts whether assigned labels are correct or wrong via its hidden variables.
Distant-Supervised Relation Extraction with Hierarchical Attention Based on Knowledge Graph
This study proposes a novel hierarchical attention model, the Bi-GRU-based Knowledge Graph Attention Model (BG2KGA), which outperforms current methods and improves the Precision/Recall (PR) curve area by 8% to 16% over state-of-the-art models.
Label-Free Distant Supervision for Relation Extraction via Knowledge Graph Embedding
A label-free distant supervision method is proposed that makes no use of the relation labels generated under this inadequate assumption, and instead uses only the prior knowledge derived from the KG to supervise the learning of the classifier directly and softly.
Improving Relation Extraction with Relational Paraphrase Sentences
This work proposes a novel model that learns diverse relation expressions by enriching them with relational paraphrase sentences in a joint learning framework, and shows that the approach improves relation extraction performance even against a strong baseline.
Noise Reduction in Distant Supervision for Relation Extraction Using Probabilistic Soft Logic
Experimental results show that the proposed approach not only improves the quality of the generated training data compared to the original distantly supervised set, but also the performance of the final relation extraction model.
Knowledge Base Population through Distant Supervision: Analysis and Improvements
Information extraction consists of automatically deriving structured information from texts. Information extraction systems try to find relevant information in corpora and return a representation of …

References

Showing 1-10 of 32 references.
Distant supervision for relation extraction without labeled data
This work investigates an alternative paradigm that does not require labeled corpora, avoiding the domain dependence of ACE-style algorithms, and allowing the use of corpora of any size.
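
The distant-supervision heuristic this paper introduced can be sketched in a few lines. The snippet below is an illustrative assumption (toy KB facts and plain string matching, not the paper's pipeline) showing how KB triples are aligned with sentences to generate training labels, and why noisy matches arise:

# Illustrative sketch (assumed toy data, not the paper's pipeline) of the distant
# supervision heuristic: any sentence mentioning both entities of a KB triple is
# taken as a training example for that triple's relation.
from typing import List, Tuple

kb: List[Tuple[str, str, str]] = [
    ("Barack Obama", "president_of", "United States"),  # hypothetical KB fact
]
corpus = [
    "Barack Obama is the 44th president of the United States.",
    "Barack Obama flew back to the United States on Tuesday.",
]

def distant_labels(facts, sentences) -> List[Tuple[str, str]]:
    """Label every sentence containing both entities with the KB relation."""
    examples = []
    for head, relation, tail in facts:
        for sent in sentences:
            if head in sent and tail in sent:
                examples.append((sent, relation))
    return examples

print(distant_labels(kb, corpus))
# Both sentences are labeled 'president_of'; the second is the kind of noisy
# match that motivates modeling whether each mention actually expresses the relation.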
Learning to Extract Relations from the Web using Minimal Supervision
An existing relation extraction method is extended to handle this weaker form of supervision, and experimental results demonstrate that the approach can reliably extract relations from web documents.
Generalized Expectation Criteria for Bootstrapping Extractors using Record-Text Alignment
A conditional random field (CRF) is presented that aligns the tokens of a given DB record with their realization in text, demonstrating an error reduction of 35% over a previous state-of-the-art method that uses heuristic alignments.
Guiding Semi-Supervision with Constraint-Driven Learning
Experimental results in the information extraction domain demonstrate that applying constraints helps the model generate better feedback during learning, and hence the framework allows high-performance learning with significantly less training data than was previously possible on these tasks.
Bi-directional Joint Inference for Entity Resolution and Segmentation Using Imperatively-Defined Factor Graphs
A highly coupled, bi-directional approach to joint inference is presented, based on efficient Markov chain Monte Carlo sampling in a relational conditional random field, along with a new probabilistic programming language that leverages imperative constructs to define factor graph structure and operation.
Contrastive Estimation: Training Log-Linear Models on Unlabeled Data
A novel approach, contrastive estimation, is described; it outperforms EM, is more robust to degradations of the dictionary, and can largely recover by modeling additional features.
Constructing Biological Knowledge Bases by Extracting Information from Text Sources
A research effort is described that aims to automatically map information from text sources into structured representations, such as knowledge bases, using machine-learning methods to induce routines for extracting facts from text.
Unsupervised Constraint Driven Learning For Transliteration Discovery
A novel unsupervised constraint-driven learning algorithm for identifying named-entity (NE) transliterations in bilingual corpora is presented, bootstrapped using a romanization table; this resource, when used in conjunction with constraints, can efficiently identify transliteration pairs.
Gene name identification and normalization using a model organism database
These experiments indicate that the lexical resources provided by FlyBase are complete enough to achieve high recall on the gene list task, and that normalization requires accurate disambiguation; different strategies for tagging and normalization trade off recall for precision.
Incorporating Non-local Information into Information Extraction Systems by Gibbs Sampling
By using simulated annealing in place of Viterbi decoding in sequence models such as HMMs, CMMs, and CRFs, it is possible to incorporate non-local structure while preserving tractable inference.