An Open-World Extension to Knowledge Graph Completion Models

@article{Shah2019AnOE,
  title={An Open-World Extension to Knowledge Graph Completion Models},
  author={Haseeb Shah and Johannes Villmow and Adrian Ulges and Ulrich Schwanecke and Faisal Shafait},
  journal={ArXiv},
  year={2019},
  volume={abs/1906.08382}
}
We present a novel extension to embedding-based knowledge graph completion models which enables them to perform open-world link prediction, i.e. to predict facts for entities unseen in training based on their textual description. Key Method: After training both independently, we learn a transformation to map the embeddings of an entity's name and description to the graph-based embedding space. In experiments on several datasets including FB20k, DBPedia50k and our new dataset FB15k-237-OWE, we demonstrate…
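The core mechanism described above can be sketched in a few lines: text-based embeddings (e.g. averaged word vectors of an entity's name and description) are mapped into a pretrained graph embedding space via a learned transformation. The sketch below uses synthetic data and a simple linear map fit by least squares; the actual paper evaluates several transformation variants, so treat this as an illustration of the idea, not the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d_text, d_graph, n_entities = 300, 100, 500

# Text-based embeddings: stand-ins for averaged word vectors of each
# entity's name and description (synthetic here).
X_text = rng.normal(size=(n_entities, d_text))

# Graph-based embeddings from an independently pretrained KGC model
# (also synthetic stand-ins).
X_graph = rng.normal(size=(n_entities, d_graph))

# Learn a linear map W carrying text embeddings into the graph
# embedding space: minimize ||X_text @ W - X_graph||_F.
W, *_ = np.linalg.lstsq(X_text, X_graph, rcond=None)

# An unseen entity, known only by its textual description, can now be
# projected into the graph space and scored by the closed-world model.
unseen_text = rng.normal(size=(d_text,))
unseen_graph = unseen_text @ W
print(unseen_graph.shape)  # (100,)
```

Because the graph model and the text encoder are trained independently, only the (comparatively small) transformation has to be fit on entities that have both a graph embedding and a description.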

Citations

A Joint Training Framework for Open-World Knowledge Graph Embeddings
TLDR
FOlK (Framework for Open-World KG embeddings) is proposed, a technique that jointly learns embeddings for KG entities from descriptions and KG structure for open-world knowledge graph completion, and its effectiveness in improving upon state-of-the-art baselines on several tasks is demonstrated.
Caps-OWKG: a capsule network model for open-world knowledge graph
TLDR
A knowledge graph representation learning model, called Caps-OWKG, which leverages the capsule network to capture the features of both known and unknown triplets in open-world knowledge graphs and achieves state-of-the-art performance.
Relation Specific Transformations for Open World Knowledge Graph Completion
TLDR
An open-world knowledge graph completion model that can be combined with common closed-world approaches and enhance them to exploit text-based representations for entities unseen in training, giving substantial improvements over a relation-agnostic approach.
Extracting Short Entity Descriptions for Open-World Extension to Knowledge Graph Completion Models
TLDR
An extension to OWE is proposed, which is named OWE-MRC, to extract short expressions for entities from long descriptions by using a Machine Reading Comprehension (MRC) model, and it is indicated that the MRC model can effectively extract meaningful short descriptions.
Weighted Aggregator for the Open-World Knowledge Graph Completion
TLDR
An aggregator is proposed that adopts an attention network to obtain the weights of words in the entity description; it does not disturb the information in the word embeddings and makes the aggregated single embedding more effective.
A Survey on Knowledge Graph Representation Learning with Text Information
TLDR
This paper summarizes existing methods in detail and looks forward to possible future research directions for models using text as auxiliary information, dividing all text-combined models into two categories: closed-world assumption models and open-world assumption models.
Neighborhood aggregation based graph attention networks for open-world knowledge graph reasoning
TLDR
This work presents an attention-based method named NAKGR, which leverages neighborhood information to generate entity and relation representations and performs well on the closed-world reasoning tasks.
Open-World Relationship Prediction
TLDR
A novel and unified model named Structured Attention Graph Neural Network (SAGNN) is proposed which can easily mine new relationships outside KGs without relying heavily on external resources; it performs well on the open-world relationship prediction task and even outperforms existing KGC models on the triple classification task.
Pairwise Link Prediction Model for Out of Vocabulary Knowledge Base Entities
TLDR
This article captures the interactions that exist between pairwise embeddings by means of a Pairwise Factorization Model that employs a factorization machine with relation attention, and exploits a neural bag-of-words model as the encoder, which effectively encodes word-based entities into distributed vector representations for the decoder.
Utilizing Textual Information in Knowledge Graph Embedding: A Survey of Methods and Applications
TLDR
A survey of techniques for textual-information-based KG embedding is presented, introducing techniques for encoding textual information to represent entities and relations from the perspectives of encoding models and scoring functions.

References

SHOWING 1-10 OF 30 REFERENCES
Open-World Knowledge Graph Completion
TLDR
This work introduces an open-world KGC model called ConMask, which learns embeddings of the entity's name and parts of its text-description to connect unseen entities to the KG and uses a relationship-dependent content masking to extract relevant snippets and then trains a fully convolutional neural network to fuse the extracted snippets with entities in the KGs.
Text-Enhanced Representation Learning for Knowledge Graph
TLDR
The rich textual context information in a text corpus is incorporated to expand the semantic structure of the knowledge graph, and each relation is enabled to own different representations for different head and tail entities to better handle 1-to-N, N-to-1 and N-to-N relations.
Representation Learning of Knowledge Graphs with Entity Descriptions
TLDR
Experimental results on real-world datasets show that the proposed novel representation learning method for knowledge graphs outperforms other baselines on the two tasks, especially under the zero-shot setting, which indicates that the method is capable of building representations for novel entities according to their descriptions.
Knowledge Graph Representation with Jointly Structural and Textual Encoding
TLDR
This paper introduces three neural models to encode the valuable information from the text description of an entity, among which an attentive model can select related information as needed, and proposes a novel deep architecture to utilize both structural and textual information of entities.
ProjE: Embedding Projection for Knowledge Graph Completion
TLDR
This work presents a shared variable neural network model called ProjE that fills in missing information in a knowledge graph by learning joint embeddings of the knowledge graph's entities and edges, and through subtle, but important, changes to the standard loss function.
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
TLDR
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
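The bilinear objective mentioned here scores a triple (h, r, t) as h^T M_r t, and the cited finding that relation composition "is characterized by matrix multiplication" can be illustrated with toy random embeddings (all values below are hypothetical stand-ins, not trained vectors):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
h, t = rng.normal(size=d), rng.normal(size=d)            # entity vectors
M_r1, M_r2 = rng.normal(size=(d, d)), rng.normal(size=(d, d))  # relation matrices

def bilinear_score(h, M_r, t):
    # Bilinear scoring: s(h, r, t) = h^T M_r t
    return h @ M_r @ t

# Composing two relations (e.g. "born_in_city" followed by
# "city_in_country") is modeled by multiplying their matrices:
M_composed = M_r1 @ M_r2
print(bilinear_score(h, M_composed, t))
```

By associativity, scoring with the composed matrix agrees with first translating h through M_r1 and t through M_r2, which is what makes matrix multiplication a natural model of relation paths.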
Joint Learning of the Embedding of Words and Entities for Named Entity Disambiguation
TLDR
A novel embedding method specifically designed for NED that jointly maps words and entities into the same continuous vector space and extends the skip-gram model by using two models.
Translating Embeddings for Modeling Multi-relational Data
TLDR
TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities, which proves to be powerful since extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
RC-NET: A General Framework for Incorporating Knowledge into Word Representations
TLDR
This paper builds the relational knowledge and the categorical knowledge into two separate regularization functions, and combines both of them with the original objective function of the skip-gram model to obtain word representations enhanced by the knowledge graph.
Reasoning With Neural Tensor Networks for Knowledge Base Completion
TLDR
An expressive neural tensor network suitable for reasoning over relationships between two entities given a subset of the knowledge base is introduced, and performance can be improved when entities are represented as an average of their constituent word vectors.