ER: Equivariance Regularizer for Knowledge Graph Completion
@inproceedings{Cao2022ERER,
  title={ER: Equivariance Regularizer for Knowledge Graph Completion},
  author={Zongsheng Cao and Qianqian Xu and Zhiyong Yang and Qingming Huang},
  booktitle={AAAI Conference on Artificial Intelligence},
  year={2022}
}
Tensor factorization and distance-based models play important roles in knowledge graph completion (KGC). However, the relational matrices in KGC methods often induce high model complexity, bearing a high risk of overfitting. As a remedy, researchers have proposed a variety of regularizers, such as the tensor nuclear norm regularizer. Our motivation is based on the observation that previous work focuses only on the “size” of the parametric space, while leaving the implicit semantic…
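As a point of reference for the regularizers the abstract contrasts with, below is a minimal sketch (not the paper's ER method) of a CP/DistMult-style trilinear score combined with an N3 (nuclear 3-norm) penalty of the kind proposed in the "Canonical Tensor Decomposition" reference; all names and the weight value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8
h = rng.normal(size=dim)  # head entity embedding (illustrative)
r = rng.normal(size=dim)  # relation embedding
t = rng.normal(size=dim)  # tail entity embedding

def cp_score(h, r, t):
    # CP/DistMult-style trilinear score: sum_i h_i * r_i * t_i
    return float(np.sum(h * r * t))

def n3_penalty(*embeddings, weight=1e-2):
    # Nuclear 3-norm surrogate: weighted sum of |x|^3 over all factors,
    # a "size of the parametric space" penalty as described in the abstract
    return weight * sum(float(np.sum(np.abs(e) ** 3)) for e in embeddings)

# Training would minimize a ranking/likelihood loss plus the penalty
loss = -cp_score(h, r, t) + n3_penalty(h, r, t)
```

The point of the sketch is only that such penalties constrain embedding magnitudes, not the semantic structure the paper's equivariance regularizer targets.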
One Citation
Temporal-structural importance weighted graph convolutional network for temporal knowledge graph completion
- Computer Science · Future Generation Computer Systems, 2023
References
Showing 1–10 of 44 references
Duality-Induced Regularizer for Tensor Factorization Based Knowledge Graph Completion
- Computer Science · NeurIPS, 2020
A novel regularizer, the DUality-induced RegulArizer (DURA), is proposed, which is not only effective in improving the performance of existing models but also widely applicable to various methods.
Canonical Tensor Decomposition for Knowledge Base Completion
- Computer Science · ICML, 2018
This work motivates and tests a novel regularizer based on tensor nuclear p-norms, and presents a reformulation of the problem that makes it invariant to arbitrary choices in the inclusion of predicates or their reciprocals in the dataset.
Reflections on: Knowledge Graph Fact Prediction via Knowledge-Enriched Tensor Factorization
- Computer Science · JT@ISWC, 2019
Regularizing Knowledge Graph Embeddings via Equivalence and Inversion Axioms
- Computer Science · ECML/PKDD, 2017
A principled and scalable method is proposed for leveraging equivalence and inversion axioms during the learning process by imposing a set of model-dependent soft constraints on the predicate embeddings, which consistently improves the predictive accuracy of several neural knowledge graph embedding models without compromising their scalability properties.
TuckER: Tensor Factorization for Knowledge Graph Completion
- Computer Science · EMNLP, 2019
This work proposes TuckER, a relatively straightforward but powerful linear model based on Tucker decomposition of the binary tensor representation of knowledge graph triples that outperforms previous state-of-the-art models across standard link prediction datasets, acting as a strong baseline for more elaborate models.
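The Tucker-decomposition scoring described above can be sketched as follows; this is a hedged illustration assumed from the summary (hypothetical names and dimensions), not TuckER's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
de, dr = 6, 4                      # entity / relation embedding dims (illustrative)
W = rng.normal(size=(de, dr, de))  # shared core tensor of the Tucker decomposition
h = rng.normal(size=de)            # head entity embedding
r = rng.normal(size=dr)            # relation embedding
t = rng.normal(size=de)            # tail entity embedding

def tucker_score(W, h, r, t):
    # Contract the core tensor W with h (mode 1), r (mode 2), t (mode 3):
    # score = sum_{i,j,k} W[i,j,k] * h[i] * r[j] * t[k]
    return float(np.einsum('ijk,i,j,k->', W, h, r, t))

s = tucker_score(W, h, r, t)
```

The shared core tensor is what makes the model linear yet expressive: all relation-specific interactions are encoded through one trainable tensor rather than per-relation matrices.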
You CAN Teach an Old Dog New Tricks! On Training Knowledge Graph Embeddings
- Computer Science · ICLR, 2020
It is found that, when trained appropriately, the relative performance differences between various model architectures often shrink and sometimes even reverse compared to prior results, and that many of the more advanced architectures and techniques proposed in the literature should be revisited to reassess their individual benefits.
Dual Quaternion Knowledge Graph Embeddings
- Computer Science · AAAI, 2021
At the core of DualE lies a specific design of dual-quaternion-based multiplication, which universally models relations as compositions of a series of translation and rotation operations.
Semantically Smooth Knowledge Graph Embedding
- Computer Science · ACL, 2015
The key idea of SSE is to take full advantage of additional semantic information and enforce the embedding space to be semantically smooth, i.e., entities belonging to the same semantic category will lie close to each other in the embedding space.
Convolutional 2D Knowledge Graph Embeddings
- Computer Science · AAAI, 2018
ConvE, a multi-layer convolutional network model for link prediction, is introduced, and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across all datasets.
Learning Attention-based Embeddings for Relation Prediction in Knowledge Graphs
- Computer Science · ACL, 2019
This paper proposes a novel attention-based feature embedding that captures both entity and relation features in any given entity’s neighborhood, and encapsulates relation clusters and multi-hop relations in the model.