Relation Discovery with Out-of-Relation Knowledge Base as Supervision

Yan-jun Liang, Xin Liu, Jianwen Zhang, Yangqiu Song. In Proceedings of the North American Chapter of the Association for Computational Linguistics (NAACL).
Unsupervised relation discovery aims to discover new relations from a given text corpus without annotated data. However, it does not consider existing human-annotated knowledge bases, even when they are relevant to the relations to be discovered. In this paper, we study how to use out-of-relation knowledge bases to supervise the discovery of unseen relations, where out-of-relation means that the relations to be discovered from the text corpus and those in the knowledge bases do not overlap…

Weakly Supervised Text Classification using Supervision Signals from a Language Model

A latent variable model is proposed that simultaneously learns a word-distribution learner, which associates generated words with pre-defined categories, and a document classifier, without using any annotated data.

An Adversarial Transfer Network for Knowledge Representation Learning

An adversarial embedding transfer network ATransN is proposed, which transfers knowledge from one or more teacher knowledge graphs to a target one through an aligned entity set without explicit data leakage.

Semantic Relations and Deep Learning

A new Chapter 5 of the book discusses relation classification/extraction in the deep-learning paradigm which arose after the first edition appeared.

Modeling Relations and Their Mentions without Labeled Text

A novel approach to distant supervision that alleviates the problem of noisy patterns hurting precision: a factor graph is trained with constraint-driven semi-supervision, without any knowledge of which sentences express the relations in the authors' training KB.

Filling Knowledge Base Gaps for Distant Supervision of Relation Extraction

This work proposes a simple yet novel framework that combines a passage-retrieval model using coarse features with a state-of-the-art relation extractor using multi-instance learning with fine features, and adapts the information-retrieval technique of pseudo-relevance feedback to expand knowledge bases.

Knowledge-Based Weak Supervision for Information Extraction of Overlapping Relations

A novel approach for multi-instance learning with overlapping relations that combines a sentence-level extraction model with a simple, corpus-level component for aggregating the individual facts is presented.

In-domain Relation Discovery with Meta-constraints via Posterior Regularization

Across two domains, the approach successfully recovers hidden relation structure, matching or outperforming previous state-of-the-art approaches; a small set of constraints is found to apply across both domains, and domain-specific constraints can further improve performance.

Neural Relation Extraction with Selective Attention over Instances

A sentence-level attention-based model for relation extraction that employs convolutional neural networks to embed the semantics of sentences and dynamically reduces the weights of noisy instances.
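The selective-attention idea above can be sketched in a few lines: given the embeddings of all sentences mentioning one entity pair, a relation query vector scores each sentence, and a softmax over those scores down-weights noisy mentions in the bag representation. This is a minimal NumPy illustration of the mechanism, not the paper's implementation; the function name and toy vectors are assumptions.

```python
import numpy as np

def selective_attention(sentence_embs, relation_query):
    """Weight sentence embeddings of one entity pair by their
    relevance to a relation query vector (illustrative sketch
    of selective attention over instances)."""
    # Unnormalized relevance scores: dot product with the query.
    scores = sentence_embs @ relation_query
    # Softmax turns scores into attention weights; sentences
    # with low scores (likely noisy mentions) get small weights.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Bag representation: weighted sum of the sentence embeddings.
    return weights @ sentence_embs

# Toy bag: two sentences aligned with the query, one off-relation.
embs = np.array([[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]])
query = np.array([1.0, 0.0])
bag = selective_attention(embs, query)
```

The bag vector ends up dominated by the sentences that score highly against the query, which is what lets the model train on distantly supervised bags without trusting every mention equally.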

Incorporating Relation Paths in Neural Relation Extraction

A path-based neural relation extraction model is proposed to encode relational semantics from both direct sentences and inference chains; it achieves significant and consistent improvements on relation extraction over strong baselines.

Structured Relation Discovery using Generative Models

A series of generative probabilistic models is proposed, broadly similar to topic models, each of which generates a corpus of observed triples of entity mention pairs and the surface syntactic dependency path between them.

Multi-instance Multi-label Learning for Relation Extraction

This work proposes a novel approach to multi-instance multi-label learning for RE, which jointly models all the instances of a pair of entities in text and all their labels using a graphical model with latent variables that performs competitively on two difficult domains.

Exploiting Background Knowledge for Relation Extraction

This paper proposes methods for using knowledge and resources external to the target sentence to improve relation extraction, exploiting background knowledge such as relationships among the target relations and how the target relations relate to existing knowledge resources.

Modeling Relation Paths for Representation Learning of Knowledge Bases

This model considers relation paths as translations between entities for representation learning and addresses two key challenges: (1) since not all relation paths are reliable, it designs a path-constraint resource allocation algorithm to measure the reliability of relation paths, and (2) it represents relation paths via semantic composition of relation embeddings.
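The "semantic composition" piece of the summary above has a simple additive form in the translation-based family: if a relation is modeled as a translation h + r ≈ t, then a two-step path can be composed by summing its component relation embeddings. The sketch below illustrates that additive-composition variant with toy, hand-picked vectors (not learned embeddings); the relation names are hypothetical.

```python
import numpy as np

# Translation-based intuition: a relation is a vector translation,
# so for a true triple (h, r, t) we expect h + r ≈ t.
# A two-step path r1 -> r2 is composed by adding its relation
# embeddings (the additive composition variant).
born_in = np.array([0.5, -0.2])      # hypothetical relation embedding
capital_of = np.array([0.1, 0.9])    # hypothetical relation embedding

# Composed path embedding approximating a direct relation
# between the path's head and tail entities.
path_emb = born_in + capital_of

head = np.array([1.0, 1.0])
tail_pred = head + path_emb  # predicted tail embedding via the path

# A reliability-weighted model would further scale each path's
# contribution by its resource-allocation weight (not shown here).
```

With learned embeddings, the distance between `tail_pred` and the true tail embedding serves as the path's score, and unreliable paths are discounted by the resource-allocation weight mentioned in the summary.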