Contextual Graph Attention for Answering Logical Queries over Incomplete Knowledge Graphs

@article{Mai2019ContextualGA,
  title={Contextual Graph Attention for Answering Logical Queries over Incomplete Knowledge Graphs},
  author={Gengchen Mai and Krzysztof Janowicz and Bo Yan and Rui Zhu and Ling Cai and N. Lao},
  journal={Proceedings of the 10th International Conference on Knowledge Capture},
  year={2019}
}
  • Gengchen Mai, K. Janowicz, N. Lao
  • Published 23 September 2019
  • Computer Science
  • Proceedings of the 10th International Conference on Knowledge Capture
Recently, several studies have explored methods for using KG embedding to answer logical queries. These approaches either treat embedding learning and query answering as two separate learning tasks, or fail to deal with the variability of contributions from different query paths. We propose to leverage a graph attention mechanism to handle the unequal contribution of different query paths. However, commonly used graph attention assumes that the center node embedding is provided, which is…
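
A minimal NumPy sketch of the general idea stated above: aggregate the embeddings predicted by different query paths with attention weights, using the mean of those embeddings as a stand-in for the center-node embedding that standard graph attention would require. The function names, the mean-as-context choice, and the parameters W and a are illustrative assumptions, not the paper's exact architecture.

    import numpy as np

    def softmax(x):
        x = x - x.max()
        e = np.exp(x)
        return e / e.sum()

    def attention_aggregate(path_embs, W, a):
        """Combine per-path predictions p_1..p_k into one answer embedding.

        path_embs: (k, d) array, one predicted embedding per query path.
        W: (d, d) transformation; a: (2d,) attention vector (learned in practice).
        The mean of the path embeddings stands in for the center-node embedding.
        """
        context = path_embs.mean(axis=0)
        h = path_embs @ W
        c = context @ W
        scores = np.array([np.concatenate([c, hi]) @ a for hi in h])
        alpha = softmax(scores)               # unequal contributions of query paths
        return alpha @ path_embs              # attention-weighted answer embedding

    rng = np.random.default_rng(0)
    k, d = 3, 8
    answer = attention_aggregate(rng.standard_normal((k, d)),
                                 rng.standard_normal((d, d)),
                                 rng.standard_normal(2 * d))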

Citations

Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders

This work proposes Bidirectional Query Embedding (BiQE), a method that embeds conjunctive queries with models based on bi-directional attention mechanisms, and shows that bidirectional self-attention can capture interactions among all the elements of a query graph.

Message Passing for Query Answering over Knowledge Graphs

The method can generalize from training for the single-hop, link prediction task, to answering queries with more complex structures, and a qualitative analysis reveals that the learned embeddings successfully capture the notion of different entity types.

Message Passing Query Embedding

This work proposes a more general architecture that employs a graph neural network to encode a graph representation of the query, where nodes correspond to entities and variables, and shows competitive performance against previous models for complex queries.

SE‐KGE: A location‐aware Knowledge Graph Embedding model for Geographic Question Answering and Spatial Semantic Lifting

This work proposes a location-aware KG embedding model called SE-KGE which directly encodes spatial information such as point coordinates or bounding boxes of geographic entities into the KG embedding space and is capable of handling different types of spatial reasoning.

Time in a Box: Advancing Knowledge Graph Completion with Temporal Scopes

A new knowledge base embedding framework called TIME2BOX is established that can deal with atemporal and temporal statements of different types simultaneously and outperforms state-of-the-art (SOTA) methods on both link prediction and time prediction.

HyperQuaternionE: A hyperbolic embedding model for qualitative spatial and temporal reasoning

A hyperbolic embedding model is proposed, called HyperQuaternionE, to capture varying properties of relations (such as symmetry and anti-symmetry), to learn inversion relations and relation compositions, and to model hierarchical structures over entities induced by transitive relations.

Multi-Scale Representation Learning for Spatial Feature Distributions using Grid Cells

A representation learning model called Space2Vec is proposed to encode the absolute positions and spatial relationships of places. Results show that, because of its multi-scale representations, Space2Vec outperforms well-established ML approaches such as RBF kernels, multi-layer feed-forward nets, and tile embedding approaches on location modeling and image classification tasks.
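
A rough Python sketch of the multi-scale idea: encode a 2D location as sinusoidal features at geometrically spaced wavelengths, which a downstream feed-forward net would then mix. The wavelength range and the per-axis decomposition are simplifying assumptions for illustration; the actual Space2Vec encoder is more involved (e.g., it uses several grid-cell-inspired direction vectors).

    import numpy as np

    def multiscale_encode(xy, n_scales=16, lam_min=1.0, lam_max=10000.0):
        """Sinusoidal features of a 2D location at geometrically spaced scales."""
        ratios = np.arange(n_scales) / max(n_scales - 1, 1)
        wavelengths = lam_min * (lam_max / lam_min) ** ratios
        feats = []
        for lam in wavelengths:
            for coord in xy:
                feats += [np.sin(coord / lam), np.cos(coord / lam)]
        return np.array(feats)                # length 4 * n_scales

    vec = multiscale_encode(np.array([-119.85, 34.41]))   # e.g., a lon/lat pair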

A review of location encoding for GeoAI: methods and applications

A formal definition of location encoding is provided, its necessity for GeoAI research is discussed, and it is demonstrated that existing location encoders can be unified under one formulation framework.

Narrative Cartography with Knowledge Graphs

It is demonstrated that, by representing both the map content and the geovisualization process in KGs (an ontology), one can achieve both data reusability and map reproducibility for narrative cartography.

References

SHOWING 1-10 OF 26 REFERENCES

Relaxing Unanswerable Geographic Questions Using A Spatially Explicit Knowledge Graph Embedding Model

A spatially explicit translational knowledge graph embedding model called TransGeo is presented, which utilizes an edge-weighted PageRank and sampling strategy to encode distance decay into the embedding model training process; it is applied to relax and rewrite unanswerable geographic questions.

Modeling Relational Data with Graph Convolutional Networks

It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
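
For reference, the R-GCN layer update summarized above accumulates neighbor messages with relation-specific weight matrices (notation as in the R-GCN paper):

    h_i^{(l+1)} = \sigma\Big( \sum_{r \in \mathcal{R}} \sum_{j \in \mathcal{N}_i^r} \tfrac{1}{c_{i,r}} W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)} \Big)

where \mathcal{N}_i^r is the set of neighbors of node i under relation r and c_{i,r} is a normalization constant; a DistMult decoder then scores candidate triples for link prediction.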

Embedding Logical Queries on Knowledge Graphs

This work introduces a framework to efficiently make predictions about conjunctive logical queries -- a flexible but tractable subset of first-order logic -- on incomplete knowledge graphs and demonstrates the utility of this framework in two application studies on real-world datasets with millions of relations.
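
A small, self-contained Python sketch of the query-embedding idea: per-relation projection operators and an intersection operator are composed to embed a conjunctive query, and entities are ranked by similarity to the resulting query embedding. The random matrices stand in for trained parameters, and the mean-based intersection is a simplification for illustration rather than the paper's exact operator.

    import numpy as np

    rng = np.random.default_rng(0)
    d = 8
    entities = {"a": rng.standard_normal(d), "b": rng.standard_normal(d),
                "x1": rng.standard_normal(d), "x2": rng.standard_normal(d)}
    R = {r: rng.standard_normal((d, d)) for r in ("r1", "r2", "r3")}  # projections
    W_inter = rng.standard_normal((d, d))                             # intersection

    def project(q, r):
        return R[r] @ q                       # follow relation r in embedding space

    def intersect(qs):
        return W_inter @ np.mean(qs, axis=0)  # simplified symmetric combination

    # Conjunctive query:  ?x . r1(a, v) AND r2(b, v) AND r3(v, x)
    v = intersect([project(entities["a"], "r1"), project(entities["b"], "r2")])
    q = project(v, "r3")
    ranked = sorted(entities, key=lambda e: float(entities[e] @ q), reverse=True)
    print(ranked[0])                          # top-ranked candidate answer for x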

Knowledge Graph Embedding via Dynamic Mapping Matrix

A more fine-grained model named TransD is proposed as an improvement of TransR/CTransR; it considers the diversity of not only relations but also entities, which makes it applicable to large-scale graphs.
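
For reference, TransD builds entity- and relation-specific projection matrices from projection vectors and scores a triple by a translation in the projected space (notation as in the TransD paper):

    M_{rh} = \mathbf{r}_p \mathbf{h}_p^{\top} + I^{m \times n}, \qquad M_{rt} = \mathbf{r}_p \mathbf{t}_p^{\top} + I^{m \times n}
    \mathbf{h}_{\perp} = M_{rh}\mathbf{h}, \qquad \mathbf{t}_{\perp} = M_{rt}\mathbf{t}, \qquad f_r(\mathbf{h}, \mathbf{t}) = -\lVert \mathbf{h}_{\perp} + \mathbf{r} - \mathbf{t}_{\perp} \rVert_2^2

Because the projection matrices are built from vectors, TransD needs fewer parameters than TransR and can replace matrix-vector products with vector operations, which is what makes it suitable for large-scale graphs.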

Learning Entity and Relation Embeddings for Knowledge Graph Completion

TransR is proposed to build entity and relation embeddings in distinct entity and relation spaces, by first projecting entities from the entity space into the corresponding relation space and then building translations between the projected entities.
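
For reference, TransR's projection and score function (notation as in the TransR paper); entities are mapped into the relation-specific space by M_r before the translation r is applied:

    \mathbf{h}_r = \mathbf{h} M_r, \qquad \mathbf{t}_r = \mathbf{t} M_r, \qquad f_r(\mathbf{h}, \mathbf{t}) = \lVert \mathbf{h}_r + \mathbf{r} - \mathbf{t}_r \rVert_2^2

with lower scores indicating more plausible triples.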

Knowledge Graph Embedding: A Survey of Approaches and Applications

This article provides a systematic review of existing knowledge graph embedding techniques, covering not only the state of the art but also the latest trends, organized by the type of information used in the embedding task.

Towards Empty Answers in SPARQL: Approximating Querying with RDF Embedding

An RDF graph embedding based framework is proposed to solve the SPARQL empty-answer problem in a continuous vector space; it can significantly improve the quality of approximate answers and speed up the generation of alternative queries.

Knowledge Graph Representation with Jointly Structural and Textual Encoding

This paper introduces three neural models to encode the valuable information from the text description of an entity, among which an attentive model can select related information as needed, and proposes a novel deep architecture that utilizes both structural and textual information of entities.

Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
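
For reference, the GAT attention coefficients and node update (notation as in the GAT paper). Note that computing e_{ij} requires the center-node embedding h_i, which is precisely the assumption the abstract above points out does not hold when predicting answer embeddings for logical queries:

    e_{ij} = \mathrm{LeakyReLU}\big(\mathbf{a}^{\top} [\, W\mathbf{h}_i \,\Vert\, W\mathbf{h}_j \,]\big), \qquad
    \alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k \in \mathcal{N}_i} \exp(e_{ik})}, \qquad
    \mathbf{h}_i' = \sigma\Big( \sum_{j \in \mathcal{N}_i} \alpha_{ij} W \mathbf{h}_j \Big)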

Semantic Parsing on Freebase from Question-Answer Pairs

This paper trains a semantic parser that scales up to Freebase and outperforms their state-of-the-art parser on the dataset of Cai and Yates (2013), despite not having annotated logical forms.