Corpus ID: 62841637

SCEF: A Support-Confidence-aware Embedding Framework for Knowledge Graph Refinement

@article{Zhao2019SCEFAS,
  title={SCEF: A Support-Confidence-aware Embedding Framework for Knowledge Graph Refinement},
  author={Yu Zhao and Ji Liu},
  journal={ArXiv},
  year={2019},
  volume={abs/1902.06377}
}
Knowledge graph (KG) refinement mainly aims at KG completion and correction (i.e., error detection). However, most conventional KG embedding models focus only on KG completion, under the unreasonable assumption that all facts in the KG hold without noise, and ignore error detection, which is also significant and essential for KG refinement. In this paper, we propose a novel support-confidence-aware KG embedding framework (SCEF), which performs KG completion and correction simultaneously by… 
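The abstract does not spell out how support and confidence enter the objective, so the following is only a minimal sketch, assuming a TransE-style translation energy whose margin loss is scaled by a per-triple confidence; the function names (transe_energy, update_confidence) and the confidence rule are illustrative assumptions, not the authors' definitions.

# Minimal sketch of a confidence-weighted translational embedding loss.
# Assumptions (not from the paper): a TransE-style scorer and a per-triple
# confidence in [0, 1] that down-weights likely-noisy facts.
import numpy as np

def transe_energy(h, r, t):
    """Translation energy ||h + r - t||_1; lower means more plausible."""
    return np.abs(h + r - t).sum()

def weighted_margin_loss(pos, neg, confidence, margin=1.0):
    """Margin ranking loss scaled by the triple's confidence."""
    h, r, t = pos
    h_n, r_n, t_n = neg
    raw = max(0.0, margin + transe_energy(h, r, t) - transe_energy(h_n, r_n, t_n))
    return confidence * raw

def update_confidence(h, r, t, alpha=1.0):
    """Illustrative confidence: high when the translation energy is small."""
    return 1.0 / (1.0 + alpha * transe_energy(h, r, t))

# Toy usage with random 50-d embeddings.
rng = np.random.default_rng(0)
h, r, t = rng.normal(size=(3, 50))
h_n = rng.normal(size=50)                    # corrupted head for the negative
conf = update_confidence(h, r, t)
loss = weighted_margin_loss((h, r, t), (h_n, r, t), conf)
print(f"confidence={conf:.3f}  loss={loss:.3f}")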

Citations of this paper

Contrastive Knowledge Graph Error Detection

This work proposes ContrAstive knowledge Graph Error Detection (CAGED), a framework that introduces contrastive learning into KG embedding and offers a new way of modeling KGs.
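As a rough illustration of what contrastive learning over triples can look like, here is a generic InfoNCE-style loss on two embedded views of the same triple against corrupted negatives; this is an assumed setup for illustration, not CAGED's actual architecture.

# Generic InfoNCE-style contrastive loss over triple embeddings.
import torch
import torch.nn.functional as F

def info_nce(anchor, positive, negatives, temperature=0.1):
    """anchor, positive: (d,) views of the same triple; negatives: (k, d)."""
    pos_sim = F.cosine_similarity(anchor, positive, dim=0) / temperature
    neg_sim = F.cosine_similarity(anchor.unsqueeze(0), negatives, dim=1) / temperature
    logits = torch.cat([pos_sim.unsqueeze(0), neg_sim])
    return -F.log_softmax(logits, dim=0)[0]   # anchor should match its positive

anchor, positive = torch.randn(2, 64)         # two views of one triple (toy)
negatives = torch.randn(8, 64)                # embeddings of corrupted triples
print(info_nce(anchor, positive, negatives).item())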

Efficient Knowledge Graph Validation via Cross-Graph Representation Learning

A cross-graph representation learning framework, CrossVal, is proposed; it leverages an external KG to validate facts in the target KG efficiently and achieves the best performance among state-of-the-art methods on large-scale KGs.

Pattern-Aware and Noise-Resilient Embedding Models

By introducing a new loss function that is both pattern-aware and noise-resilient, significant performance issues can be resolved; the proposed loss function is model-independent and can be applied in combination with different embedding models.

Structured query construction via knowledge graph embedding

This paper proposes a novel framework that first encodes the underlying knowledge graph into a low-dimensional embedding space by leveraging generalized local knowledge graphs, and then uses the learned embeddings to compute the query structure and assemble vertices and edges into the target query.

References

Showing 1-10 of 48 references

Does William Shakespeare REALLY Write Hamlet? Knowledge Representation Learning with Confidence

A novel confidence-aware knowledge representation learning framework (CKRL) is presented, which detects possible noise in KGs while simultaneously learning knowledge representations with confidence, and proposes three kinds of triple confidence that consider both local and global structural information.
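To make the idea of a dynamically adjusted triple confidence concrete, the toy update below lowers a triple's confidence whenever a corrupted counterpart scores better during training; the decay/recovery rule is an assumption chosen for illustration, not CKRL's exact formulation.

# Illustrative dynamic confidence update for an observed triple.
import numpy as np

def energy(h, r, t):
    """TransE-style translation energy ||h + r - t||_1."""
    return np.abs(h + r - t).sum()

def update_confidence(conf, pos, neg, decay=0.9, recover=1.02):
    """Lower confidence when the negative looks better than the observed triple."""
    worse_than_negative = energy(*pos) > energy(*neg)
    return conf * decay if worse_than_negative else min(1.0, conf * recover)

rng = np.random.default_rng(5)
h, r, t = rng.normal(size=(3, 50))
h_neg = rng.normal(size=50)                  # corrupted head
conf = 1.0
for _ in range(10):                          # pretend training steps
    conf = update_confidence(conf, (h, r, t), (h_neg, r, t))
print(f"final confidence: {conf:.3f}")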

Knowledge Graph Embedding with Iterative Guidance from Soft Rules

Experimental results show that with rule knowledge injected iteratively, RUGE achieves significant and consistent improvements over state-of-the-art baselines, and that automatically extracted soft rules, even those with moderate confidence levels, are highly beneficial to KG embedding despite their uncertainty.
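A hedged sketch of one iteration of soft-rule guidance follows: the embedding model's score for a conclusion triple is nudged toward the score implied by a grounded rule under a product t-norm, and the blended value is used as a soft training label. The blending and the cross-entropy loss here are simplifications, not RUGE's closed-form label derivation.

# One simplified iteration of soft-rule injection.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rule_implied_score(premise_scores):
    """Product t-norm: a conclusion is as true as the product of its premises."""
    return float(np.prod(premise_scores))

def soft_label(model_score, rule_score, lam=0.5):
    """Blend the model's belief with the rule's belief; lam controls rule strength."""
    return (1.0 - lam) * model_score + lam * rule_score

# Toy example: rule  bornIn(x, y) AND cityOf(y, z)  =>  nationality(x, z)
premises = [0.9, 0.8]                        # model scores of the grounded premises
model_score = sigmoid(0.2)                   # current score of the conclusion triple
target = soft_label(model_score, rule_implied_score(premises))
loss = -(target * np.log(model_score) + (1 - target) * np.log(1 - model_score))
print(f"soft label={target:.3f}  cross-entropy={loss:.3f}")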

Knowledge Graph Embedding: A Survey of Approaches and Applications

This article provides a systematic review of existing knowledge graph embedding techniques, covering not only the state of the art but also the latest trends, organized by the type of information used in the embedding task.

Improving Knowledge Graph Embedding Using Simple Constraints

The potential of using very simple constraints to improve KG embedding is investigated, examining non-negativity constraints on entity representations and approximate entailment constraints on relation representations to improve model interpretability.
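As a minimal illustration of one such constraint, the snippet below applies a projection step that keeps entity embeddings non-negative after an SGD update; the approximate entailment constraint on relations is omitted, and all parameters are toy values.

# Projected SGD step enforcing non-negativity on entity embeddings.
import numpy as np

rng = np.random.default_rng(1)
entity_emb = rng.normal(size=(5, 8))         # 5 toy entities, 8 dimensions
grad = rng.normal(size=entity_emb.shape)     # pretend gradient from some KG loss
lr = 0.01

entity_emb -= lr * grad                      # ordinary SGD step
entity_emb = np.clip(entity_emb, 0.0, 1.0)   # projection enforcing e >= 0
print((entity_emb >= 0).all())               # True: constraint holds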

Knowledge base completion by learning pairwise-interaction differentiated embeddings

A Pairwise-interaction Differentiated Embeddings model is proposed to embed entities and relations of the knowledge base into low-dimensional vector representations and then predict the likely truth of additional facts to extend the knowledge base.

SSP: Semantic Space Projection for Knowledge Graph Embedding with Text Descriptions

This paper proposes the semantic space projection (SSP) model, which jointly learns from symbolic triples and textual descriptions to discover semantic relevance and provide precise semantic embeddings.

Modeling Relation Paths for Representation Learning of Knowledge Bases

This model treats relation paths as translations between entities for representation learning and addresses two key challenges: (1) since not all relation paths are reliable, it designs a path-constraint resource allocation algorithm to measure the reliability of relation paths; and (2) it represents relation paths via the semantic composition of relation embeddings.
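The composition step can be sketched as follows, assuming additive composition of relation embeddings along the path and a precomputed reliability weight; the resource-allocation procedure itself is stubbed out as a plain number here.

# Scoring a relation path as the sum of its relation embeddings,
# weighted by a path-reliability value.
import numpy as np

def path_energy(h, relations, t):
    """Compose the path by adding relation vectors, then measure ||h + p - t||_1."""
    p = np.sum(relations, axis=0)
    return np.abs(h + p - t).sum()

def weighted_path_energy(h, relations, t, reliability):
    """Down-weight unreliable paths (reliability in (0, 1])."""
    return reliability * path_energy(h, relations, t)

rng = np.random.default_rng(2)
h, t = rng.normal(size=(2, 50))
r1, r2 = rng.normal(size=(2, 50))            # e.g. bornIn followed by cityOf
print(weighted_path_energy(h, [r1, r2], t, reliability=0.7))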

Representation Learning of Knowledge Graphs with Hierarchical Types

Experimental results show that the proposed Type-embodied Knowledge Representation Learning models significantly outperform all baselines on both tasks, especially under long-tail distributions, indicating that the models are capable of capturing hierarchical type information, which is significant when constructing knowledge graph representations.

Learning Entity and Relation Embeddings for Knowledge Graph Completion

TransR is proposed to build entity and relation embeddings in separate entity and relation spaces by first projecting entities from the entity space to the corresponding relation space and then building translations between the projected entities.
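A minimal sketch of the TransR scoring idea, with random toy parameters: entities are projected into a relation-specific space by a matrix M_r before the translation h + r ≈ t is checked.

# TransR-style energy: project entities into relation space, then translate.
import numpy as np

def transr_energy(h, t, r, M_r):
    """||M_r h + r - M_r t||_2^2 in the relation space."""
    h_r = M_r @ h
    t_r = M_r @ t
    return float(np.sum((h_r + r - t_r) ** 2))

rng = np.random.default_rng(3)
d_e, d_r = 50, 30                            # entity / relation space dimensions
h, t = rng.normal(size=(2, d_e))
r = rng.normal(size=d_r)
M_r = rng.normal(size=(d_r, d_e))            # projection matrix for this relation
print(transr_energy(h, t, r, M_r))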

Knowledge Graph Embedding via Dynamic Mapping Matrix

TransD, a more fine-grained model that improves upon TransR/CTransR, is proposed; it considers the diversity of not only relations but also entities, which allows it to be applied to large-scale graphs.
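A minimal sketch of the dynamic mapping idea, assuming the commonly cited construction M_rh = r_p h_p^T + I, where each entity and relation carries an extra projection vector; parameters are random toy values.

# TransD-style dynamic mapping: the projection matrix depends on both the
# relation and the entity being projected.
import numpy as np

def mapping_matrix(r_p, e_p):
    """M = r_p e_p^T + I (rectangular identity if dimensions differ)."""
    m, n = r_p.shape[0], e_p.shape[0]
    return np.outer(r_p, e_p) + np.eye(m, n)

def transd_energy(h, h_p, t, t_p, r, r_p):
    """||M_rh h + r - M_rt t||_2^2 with entity-relation-specific projections."""
    h_proj = mapping_matrix(r_p, h_p) @ h
    t_proj = mapping_matrix(r_p, t_p) @ t
    return float(np.sum((h_proj + r - t_proj) ** 2))

rng = np.random.default_rng(4)
n, m = 50, 30                                # entity / relation dimensions
h, h_p, t, t_p = rng.normal(size=(4, n))
r, r_p = rng.normal(size=(2, m))
print(transd_energy(h, h_p, t, t_p, r, r_p))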