Corpus ID: 214802223

Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders

@article{Kotnis2021AnsweringCQ,
  title={Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders},
  author={Bhushan Kotnis and Carolin Lawrence and Mathias Niepert},
  journal={ArXiv},
  year={2021},
  volume={abs/2004.02596}
}
Representation learning for knowledge graphs (KGs) has focused on the problem of answering simple link prediction queries. In this work we address the more ambitious challenge of predicting the answers of conjunctive queries with multiple missing entities. We propose Bi-Directional Query Embedding (BIQE), a method that embeds conjunctive queries with models based on bi-directional attention mechanisms. Contrary to prior work, bidirectional self-attention can capture interactions among all the elements of a query graph.
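To make the mechanism concrete, here is a minimal sketch of the idea described in the abstract: a conjunctive query is linearized into a sequence of entity, relation, and placeholder tokens and encoded with a non-causal (bidirectional) Transformer, so every query element can attend to every other. The vocabulary, the [TGT] placeholder convention, and all dimensions are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: linearize a conjunctive query into tokens and
# encode it with a bidirectional (non-causal) Transformer encoder, so all
# query elements interact via self-attention. Names and sizes are made up.
VOCAB = {"[PAD]": 0, "[TGT]": 1, "paris": 2, "capital_of": 3,
         "official_language": 4, "spoken_in": 5}

class BiDirectionalQueryEncoder(nn.Module):
    def __init__(self, vocab_size, dim=64, heads=4, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)
        self.out = nn.Linear(dim, vocab_size)  # score candidates per position

    def forward(self, token_ids):
        h = self.encoder(self.embed(token_ids))  # full bidirectional attention
        return self.out(h)                       # logits at every position

# A query with two missing entities, e.g.
# capital_of(paris, ?x) AND official_language(?x, ?y):
query = [VOCAB["paris"], VOCAB["capital_of"], VOCAB["[TGT]"],
         VOCAB["official_language"], VOCAB["[TGT]"]]
model = BiDirectionalQueryEncoder(len(VOCAB))
logits = model(torch.tensor([query]))
print(logits.shape)  # (1, 5, vocab) -- read off predictions at [TGT] slots
```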
3 Citations


Benchmarking the Combinatorial Generalizability of Complex Query Answering on Knowledge Graphs
  • Zihao Wang, Hang Yin, Yangqiu Song
  • 2021
Complex Query Answering (CQA) is an important reasoning task on knowledge graphs. Current CQA learning models have been shown to be able to generalize from atomic operators to more complex formulas.
Query Embedding on Hyper-relational Knowledge Graphs
This work proposes a method to answer hyper-relational conjunctive queries and demonstrates experimentally that qualifiers improve query answering on a diverse set of query patterns.
Regex Queries over Incomplete Knowledge Bases
We propose the novel task of answering regular expression queries (containing disjunction (∨) and Kleene plus (+) operators) over incomplete KBs. The answer set of these queries potentially has a large number of entities.

References

Showing 1-10 of 40 references
Contextual Graph Attention for Answering Logical Queries over Incomplete Knowledge Graphs
Proposes a multi-head attention-based end-to-end logical query answering model, called Contextual Graph Attention (CGA), which uses an initial neighborhood aggregation layer to generate the center embedding; the whole model is trained jointly on the original KG structure as well as on sampled query-answer pairs.
Traversing Knowledge Graphs in Vector Space
Demonstrates that compositional training acts as a novel form of structural regularization, reliably improving performance across all base models (reducing errors by up to 43%) and achieving new state-of-the-art results.
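As a toy illustration of the path-query composition this work trains on: with a TransE-style base model, traversing a path reduces to adding relation vectors to the source entity embedding. The embeddings below are random stand-ins for trained ones, and the additive composition is only one of the base models considered.

```python
import numpy as np

# Minimal sketch of path-query traversal in vector space, assuming a
# TransE-style additive base model. Embeddings are random placeholders.
rng = np.random.default_rng(0)
dim = 16
entities = {"paris": rng.normal(size=dim), "france": rng.normal(size=dim)}
relations = {"capital_of": rng.normal(size=dim),
             "official_language": rng.normal(size=dim)}

def traverse(source, path):
    """Compose a path query by adding relation vectors to the source."""
    q = entities[source].copy()
    for r in path:
        q += relations[r]
    return q

def score(query_vec, target):
    """Higher is better: negative Euclidean distance to the candidate."""
    return -np.linalg.norm(query_vec - entities[target])

q = traverse("paris", ["capital_of", "official_language"])
print(score(q, "france"))
```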
A2N: Attending to Neighbors for Knowledge Graph Inference
Proposes a novel attention-based method that learns query-dependent representations of entities by adaptively combining the relevant graph neighborhood of an entity, leading to more accurate KG completion.
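A rough sketch of query-dependent neighborhood attention in the spirit of this summary: attention weights over an entity's neighbors are computed from the query relation, and the entity representation is their weighted combination. The scoring form, shapes, and data are illustrative assumptions rather than the A2N architecture itself.

```python
import numpy as np

# Toy query-dependent attention over graph neighbors (all values random).
rng = np.random.default_rng(1)
dim = 8
query_relation = rng.normal(size=dim)
neighbor_embs = rng.normal(size=(5, dim))  # encodings of 5 neighbors

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

attn = softmax(neighbor_embs @ query_relation)  # relevance to this query
entity_repr = attn @ neighbor_embs              # adaptive combination
print(attn.round(3), entity_repr.shape)
```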
Modeling Relational Data with Graph Convolutional Networks
Shows that factorization models for link prediction such as DistMult can be significantly improved by using an R-GCN encoder to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
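The following is a minimal sketch of one relational graph convolution layer: messages are transformed by relation-specific weight matrices, averaged per relation, and combined with a self-loop term. The toy graph, sizes, and the choice of mean normalization and ReLU are assumptions for illustration.

```python
import numpy as np

# Sketch of a single R-GCN-style layer over a tiny made-up graph.
rng = np.random.default_rng(2)
dim = 4
num_entities, num_relations = 3, 2
H = rng.normal(size=(num_entities, dim))        # current node features
W = rng.normal(size=(num_relations, dim, dim))  # one matrix per relation
W0 = rng.normal(size=(dim, dim))                # self-loop weights
edges = [(0, 0, 1), (2, 0, 1), (0, 1, 2)]       # (source, relation, target)

def rgcn_layer(H):
    out = H @ W0                                 # self-loop term
    for i in range(num_entities):
        for r in range(num_relations):
            neigh = [s for (s, rel, t) in edges if t == i and rel == r]
            if neigh:                            # mean over relation-r neighbors
                out[i] += np.mean([H[s] @ W[r] for s in neigh], axis=0)
    return np.maximum(out, 0.0)                  # ReLU nonlinearity

print(rgcn_layer(H))
```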
KG-BERT: BERT for Knowledge Graph Completion
This work treats triples in knowledge graphs as textual sequences and proposes a novel framework named Knowledge Graph Bidirectional Encoder Representations from Transformer (KG-BERT) to model these triples.
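A brief sketch of this recipe: serialize a triple as a text sequence and score its plausibility with a BERT sequence classifier. It assumes the Hugging Face transformers package; the checkpoint name and the exact serialization format are illustrative, and the classification head below is untrained.

```python
# Sketch: serialize a KG triple as text and score it with a BERT classifier.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # plausible vs. implausible

head, relation, tail = "Steve Jobs", "founded", "Apple Inc."
text = f"{head} {tokenizer.sep_token} {relation} {tokenizer.sep_token} {tail}"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # untrained head: scores are random here
print(logits)
```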
Context-Dependent Knowledge Graph Embedding
Proposes context-dependent KG embedding, a two-stage scheme that takes into account both types of connectivity patterns to obtain more accurate embeddings, achieving significant and consistent improvements over state-of-the-art methods.
Embedding Logical Queries on Knowledge Graphs
This work introduces a framework to efficiently make predictions about conjunctive logical queries -- a flexible but tractable subset of first-order logic -- on incomplete knowledge graphs and demonstrates the utility of this framework in two application studies on real-world datasets with millions of relations.
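A toy sketch of the query-embedding idea: relation traversal is a learned projection and conjunction is an intersection of branch embeddings. The elementwise-min intersection and dot-product scoring below are stand-ins for the learned operators of the actual framework; all values are random placeholders.

```python
import numpy as np

# Toy conjunctive-query embedding: project along relations, intersect branches.
rng = np.random.default_rng(3)
dim = 8
E = {"paris": rng.normal(size=dim), "berlin": rng.normal(size=dim)}
R = {"born_in": rng.normal(size=(dim, dim)),
     "works_in": rng.normal(size=(dim, dim))}

def project(vec, rel):
    return R[rel] @ vec              # traverse one relation

def intersect(vecs):
    return np.minimum.reduce(vecs)   # stand-in for a learned intersection

# Query: ?p such that born_in(?p, paris) AND works_in(?p, berlin)
branch1 = project(E["paris"], "born_in")
branch2 = project(E["berlin"], "works_in")
q = intersect([branch1, branch2])

def score(q, candidate):
    return q @ E[candidate]          # dot-product similarity to candidates
print(score(q, "paris"))
```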
Translating Embeddings for Modeling Multi-relational Data
Proposes TransE, a method that models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities; extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
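The translation interpretation fits in a few lines: a triple (h, r, t) is plausible when h + r lands near t. The embeddings below are random stand-ins for trained ones, and L2 distance is one of the dissimilarity choices used with TransE.

```python
import numpy as np

# TransE scoring sketch: relations act as translations in embedding space.
rng = np.random.default_rng(4)
dim = 16
h, r, t = (rng.normal(size=dim) for _ in range(3))

def transe_score(h, r, t):
    """Negative L2 distance; higher means a more plausible triple."""
    return -np.linalg.norm(h + r - t)

print(transe_score(h, r, t))
```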
Representing Text for Joint Embedding of Text and Knowledge Bases
Proposes a model that captures the compositional structure of textual relations and jointly optimizes entity, knowledge base, and textual relation representations, significantly improving performance over a model that does not share parameters among textual relations with common sub-structure.
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
Finds that embeddings learned from the bilinear objective are particularly good at capturing relational semantics, and that the composition of relations is characterized by matrix multiplication.
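Both findings are easy to state concretely for the bilinear-diagonal model (DistMult): a triple is scored by a bilinear form with a diagonal relation matrix, and composing two relations corresponds to multiplying their diagonal matrices, i.e., elementwise multiplication of relation vectors. The snippet below uses random vectors as stand-ins for learned embeddings.

```python
import numpy as np

# DistMult sketch: bilinear score with a diagonal relation matrix, and
# relation composition via elementwise product of relation vectors.
rng = np.random.default_rng(5)
dim = 8
h, t = rng.normal(size=dim), rng.normal(size=dim)
r1, r2 = rng.normal(size=dim), rng.normal(size=dim)

def distmult_score(h, r, t):
    return h @ (r * t)               # h^T diag(r) t

r_composed = r1 * r2                 # diag(r1) @ diag(r2) == diag(r1 * r2)
print(distmult_score(h, r_composed, t))
```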