Corpus ID: 240070872

Modeling Heterogeneous Hierarchies with Relation-specific Hyperbolic Cones

@inproceedings{Bai2021ModelingHH,
  title={Modeling Heterogeneous Hierarchies with Relation-specific Hyperbolic Cones},
  author={Yushi Bai and Rex Ying and Hongyu Ren and Jure Leskovec},
  booktitle={NeurIPS},
  year={2021}
}
Hierarchical relations are prevalent and indispensable for organizing human knowledge captured by a knowledge graph (KG). The key property of hierarchical relations is that they induce a partial ordering over the entities, which needs to be modeled in order to allow for hierarchical reasoning. However, current KG embeddings can model only a single global hierarchy (single global partial ordering) and fail to model multiple heterogeneous hierarchies that exist in a single KG. Here we present… 
HAKG: Hierarchy-Aware Knowledge Gated Network for Recommendation
TLDR
This paper proposes a new model, called Hierarchy-Aware Knowledge Gated Network (HAKG), to tackle the aforementioned problems of existing propagation-based methods, and proposes a dual item embeddings design to represent and propagate collaborative signals and knowledge associations separately.
LP-BERT: Multi-task Pre-training Knowledge Graph BERT for Link Prediction
TLDR
This paper proposes knowledge graph BERT for link prediction, named LP-BERT, which contains two training stages: multi-task pre-training and knowledge graph fine-tuning. It also performs triple-style negative sampling within each sample batch, which greatly increases the proportion of negative samples while keeping the training time almost unchanged.
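In-batch corruption-style negative sampling along these lines can be sketched as follows; the function name, the uniform head/tail corruption scheme, and the toy triples are illustrative assumptions, not LP-BERT's exact procedure:

```python
import random

def corrupt_triples(batch, entities, n_neg=4, seed=0):
    """For each (head, relation, tail) triple, create negatives by replacing
    the head or the tail with a random entity. Reusing entities already in
    the batch keeps the extra training cost small."""
    rng = random.Random(seed)
    negatives = []
    for h, r, t in batch:
        for _ in range(n_neg):
            if rng.random() < 0.5:
                negatives.append((rng.choice(entities), r, t))  # corrupt head
            else:
                negatives.append((h, r, rng.choice(entities)))  # corrupt tail
    return negatives

batch = [("paris", "capital_of", "france"), ("berlin", "capital_of", "germany")]
entities = ["paris", "france", "berlin", "germany", "rome"]
negs = corrupt_triples(batch, entities)  # 4 negatives per positive triple
```

Each positive triple yields `n_neg` corrupted ones, so the negative-to-positive ratio is tunable without changing the batch of positives.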
SQUIRE: A Sequence-to-sequence Framework for Multi-hop Knowledge Graph Reasoning
TLDR
SQUIRE is presented, the first Sequence-to-sequence based multi-hop reasoning framework, which utilizes an encoder-decoder structure to translate the query to a path, and has the flexibility to complete missing edges along the path, especially in sparse KGs.
Multi-task Pre-training Language Model for Semantic Network Completion
TLDR
Triple-style negative sampling within a batch of data can significantly increase the proportion of negative samples while keeping the training time almost unchanged, and a new data augmentation method exploiting the inverse relationship of triples is proposed to improve the performance and robustness of the model.

References

Low-Dimensional Hyperbolic Knowledge Graph Embeddings
TLDR
This work introduces a class of hyperbolic KG embedding models that simultaneously capture hierarchical and logical patterns in KGs and observes that different geometric transformations capture different types of relations while attention-based transformations generalize to multiple relations.
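Hyperbolic models of this kind score triples by distances in the Poincaré ball, where distances blow up near the boundary and so leave exponentially growing room for tree branches. A minimal sketch of the standard Poincaré distance (the textbook formula, not this paper's full model):

```python
import math

def poincare_dist(x, y):
    """Geodesic distance between two points inside the unit Poincare ball:
    d(x, y) = arcosh(1 + 2 * ||x - y||^2 / ((1 - ||x||^2) * (1 - ||y||^2)))."""
    sq = lambda v: sum(c * c for c in v)
    diff = sq([a - b for a, b in zip(x, y)])
    denom = (1 - sq(x)) * (1 - sq(y))
    return math.acosh(1 + 2 * diff / denom)

# The same Euclidean separation counts for more hyperbolic distance
# the closer the points sit to the boundary of the ball:
near_origin = poincare_dist([0.0, 0.0], [0.1, 0.0])
near_boundary = poincare_dist([0.8, 0.0], [0.9, 0.0])
```

This boundary behavior is what lets low-dimensional hyperbolic embeddings place deep hierarchies that would need many more Euclidean dimensions.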
Box-To-Box Transformations for Modeling Joint Hierarchies
TLDR
A learned box-to-box transformation that respects the structure of each hierarchy is introduced and is demonstrated to improve the capability of modeling cross-hierarchy compositional edges but is also capable of generalizing from a subset of the transitive reduction.
Multi-relational Poincaré Graph Embeddings
TLDR
The Multi-Relational Poincaré model (MuRP) learns relation-specific parameters that transform entity embeddings by Möbius matrix-vector multiplication and Möbius addition; it outperforms its Euclidean counterpart and existing embedding methods on the link prediction task, particularly at lower dimensionality.
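The Möbius addition mentioned here follows the standard gyrovector-space formula for the unit Poincaré ball with curvature −1; a minimal pure-Python sketch (not MuRP's training code):

```python
def mobius_add(x, y):
    """Mobius addition in the unit Poincare ball (curvature -1):
    x (+) y = ((1 + 2<x,y> + ||y||^2) x + (1 - ||x||^2) y)
              / (1 + 2<x,y> + ||x||^2 ||y||^2).
    Near the origin this reduces to ordinary vector addition."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = sum(a * a for a in x)
    ny = sum(b * b for b in y)
    denom = 1 + 2 * dot + nx * ny
    return [((1 + 2 * dot + ny) * a + (1 - nx) * b) / denom
            for a, b in zip(x, y)]
```

Useful sanity checks: the origin is the identity element, and (−x) ⊕ x returns the origin, mirroring the role of vector negation in Euclidean space.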
Hyperbolic Entailment Cones for Learning Hierarchical Embeddings
TLDR
This work presents a novel method to embed directed acyclic graphs through hierarchical relations as partial orders defined using a family of nested geodesically convex cones and proves that these entailment cones admit an optimal shape with a closed form expression both in the Euclidean and hyperbolic spaces.
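The entailment-cone idea can be illustrated with its Euclidean variant: a point y is entailed by x if y lies inside a cone whose apex is x, whose axis points away from the origin, and whose half-aperture shrinks with ‖x‖. The closed-form aperture ψ(x) = arcsin(K/‖x‖) used below is taken as an assumption from the Euclidean case; K is a fixed constant.

```python
import math

def in_cone(x, y, K=0.1):
    """Check whether y lies in the Euclidean entailment cone at apex x.
    The half-aperture arcsin(K / ||x||) narrows away from the origin,
    which is what makes cone containment behave like a partial order."""
    nx = math.sqrt(sum(a * a for a in x))
    d = [b - a for a, b in zip(x, y)]
    nd = math.sqrt(sum(c * c for c in d))
    if nd == 0:
        return True  # a point trivially entails itself
    # angle between the cone axis (direction of x) and the vector y - x
    cos_ang = sum(a * c for a, c in zip(x, d)) / (nx * nd)
    angle = math.acos(max(-1.0, min(1.0, cos_ang)))
    return angle <= math.asin(min(1.0, K / nx))

# A point farther out along the same ray is inside the cone;
# a point in an orthogonal direction is not.
print(in_cone([0.5, 0.0], [0.9, 0.0]), in_cone([0.5, 0.0], [0.0, 0.9]))
```

The hyperbolic construction in the paper replaces straight rays with geodesics but keeps this nested-cone picture.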
Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry
TLDR
It is shown that an embedding in hyperbolic space can reveal important aspects of a company's organizational structure as well as reveal historical relationships between language families.
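The Lorentz (hyperboloid) model referred to here measures distance through the Minkowski inner product ⟨x, y⟩_L = −x₀y₀ + Σᵢ xᵢyᵢ, with d(x, y) = arcosh(−⟨x, y⟩_L). A minimal sketch of the lift onto the hyperboloid and the distance (standard formulas, not the paper's optimization code):

```python
import math

def lift(v):
    """Lift a Euclidean point v onto the hyperboloid <x, x>_L = -1
    by setting the time coordinate x0 = sqrt(1 + ||v||^2)."""
    return [math.sqrt(1 + sum(c * c for c in v))] + list(v)

def lorentz_dist(x, y):
    """Geodesic distance in the Lorentz model: arcosh(-<x, y>_L).
    The max(1, .) clamp guards against tiny floating-point undershoot."""
    inner = -x[0] * y[0] + sum(a * b for a, b in zip(x[1:], y[1:]))
    return math.acosh(max(1.0, -inner))

origin = lift([0.0, 0.0])
p = lift([0.3, 0.4])  # distance from the origin equals arcsinh(||v||)
```

The paper's appeal to this model over the Poincaré ball is numerical: the hyperboloid avoids the ill-conditioned boundary of the ball during Riemannian optimization.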
Probabilistic Embedding of Knowledge Graphs with Box Lattice Measures
TLDR
It is shown that a broad class of models that assign probability measures to order embeddings (OE) can never capture negative correlation, which motivates the construction of a novel box lattice and accompanying probability measure to capture anticorrelation and even disjoint concepts.
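Box-lattice models represent each concept as an axis-aligned box and read joint probabilities off intersection volumes; disjoint boxes give exactly the zero joint probability (anticorrelation) that the paper argues order embeddings cannot express. A minimal sketch, assuming a unit-cube universe for normalization:

```python
def box_volume(box):
    """box is a list of (lo, hi) intervals, one per dimension."""
    vol = 1.0
    for lo, hi in box:
        vol *= max(0.0, hi - lo)  # empty interval contributes zero volume
    return vol

def box_intersection(a, b):
    """Intersect per dimension; an inverted interval means no overlap."""
    return [(max(la, lb), min(ha, hb)) for (la, ha), (lb, hb) in zip(a, b)]

def joint_prob(a, b):
    """P(A and B) as the volume of the box intersection (unit-cube universe)."""
    return box_volume(box_intersection(a, b))

animal = [(0.0, 0.8), (0.0, 0.8)]
plant = [(0.8, 1.0), (0.0, 0.5)]   # disjoint from `animal` on the first axis
dog = [(0.1, 0.3), (0.1, 0.3)]     # fully contained in `animal`
```

Here `joint_prob(animal, plant)` is 0 (disjoint concepts), while a contained box like `dog` has joint probability equal to its own volume, recovering P(dog | animal-ness holds).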
Entity Context and Relational Paths for Knowledge Graph Completion
TLDR
PathCon is developed, a knowledge graph completion method that harnesses four novel insights to outperform existing methods and provide interpretable explanations by identifying the relational context and the paths that are important for a given predicted relation.
RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space
TLDR
Experimental results show that the proposed RotatE model is not only scalable, but also able to infer and model various relation patterns and significantly outperform existing state-of-the-art models for link prediction.
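RotatE's "relational rotation" models each relation as an element-wise rotation in complex space: a triple (h, r, t) is plausible when rotating h by r lands near t. A minimal sketch with illustrative two-dimensional embeddings:

```python
import cmath
import math

def rotate_score(head, rel_phases, tail):
    """RotatE distance: || h o r - t ||_1 with r_i = exp(i * theta_i),
    so every relation coordinate has modulus 1 (a pure rotation).
    Lower is better; an exact match scores (numerically) zero."""
    rotated = [h * cmath.exp(1j * th) for h, th in zip(head, rel_phases)]
    return sum(abs(hr - t) for hr, t in zip(rotated, tail))

head = [1 + 0j, 0 + 1j]
rel = [math.pi / 2, math.pi / 2]  # rotate each coordinate by 90 degrees
tail = [0 + 1j, -1 + 0j]          # exactly the rotated head
```

Because rotations compose, invert, and commute element-wise, this single scoring function can represent symmetry (θ = π), antisymmetry, inversion (−θ), and composition of relations.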
Learning Entity and Relation Embeddings for Knowledge Graph Completion
TLDR
TransR is proposed to build entity and relation embeddings in separate entity and relation spaces, learning translations between projected entities; the models are evaluated on three tasks: link prediction, triple classification, and relational fact extraction.
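The separate-spaces idea can be sketched directly: TransR projects both entities into a relation-specific space with a matrix M_r, then checks the translation M_r·h + r ≈ M_r·t. A minimal pure-Python sketch (the identity projection and toy vectors are illustrative):

```python
def mat_vec(M, v):
    """Multiply matrix M (list of rows) by vector v."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def transr_score(h, t, M_r, r):
    """TransR plausibility: || M_r h + r - M_r t ||_2.
    Entities live in entity space; M_r maps them into the space of
    relation r before the translation is applied. Lower is better."""
    hp, tp = mat_vec(M_r, h), mat_vec(M_r, t)
    return sum((a + b - c) ** 2 for a, b, c in zip(hp, r, tp)) ** 0.5

M_r = [[1.0, 0.0], [0.0, 1.0]]  # identity projection, for illustration only
h, r, t = [0.2, 0.1], [0.3, 0.4], [0.5, 0.5]
```

With a non-identity M_r per relation, the same entity can project to different points for different relations, which is the paper's advantage over single-space translation models like TransE.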
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
TLDR
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
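The bilinear objective referred to here scores a triple with a bilinear form hᵀ·diag(r)·t (the DistMult-style diagonal special case is used in this sketch), and composing two relations then corresponds to multiplying their diagonal matrices, i.e. an element-wise product of relation vectors:

```python
def bilinear_score(h, r_diag, t):
    """Diagonal bilinear score: h^T diag(r) t = sum_i h_i * r_i * t_i.
    Higher means the triple (h, r, t) is considered more plausible."""
    return sum(a * b * c for a, b, c in zip(h, r_diag, t))

def compose(r1, r2):
    """Composition of two diagonal relation matrices is their
    element-wise product: diag(r1) @ diag(r2) = diag(r1 * r2)."""
    return [a * b for a, b in zip(r1, r2)]
```

So scoring h against t under `compose(r1, r2)` is the claimed matrix-multiplication characterization of relation composition, restricted to the diagonal case.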