A Review of Relational Machine Learning for Knowledge Graphs

@article{Nickel2016ARO,
  title={A Review of Relational Machine Learning for Knowledge Graphs},
  author={Maximilian Nickel and Kevin P. Murphy and Volker Tresp and Evgeniy Gabrilovich},
  journal={Proceedings of the IEEE},
  year={2016},
  volume={104},
  pages={11-33}
}
Relational machine learning studies methods for the statistical analysis of relational, or graph-structured, data. […] The first is based on latent feature models such as tensor factorization and multiway neural networks. The second is based on mining observable patterns in the graph. We also show how to combine these latent and observable models to get improved modeling power at decreased computational cost. Finally, we discuss how such statistical models of graphs can be combined with text-based…
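
A loose illustration of the two model families the review contrasts: the sketch below scores a triple with a RESCAL-style bilinear latent feature model and with a simple observable graph feature (a common-neighbour count), then combines the two. All names, dimensions, and the fixed combination weights are assumptions made for this toy example, not the review's implementation.

# Toy combined model: latent feature score plus observable graph feature.
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, rank = 50, 4, 8

E = rng.normal(size=(n_entities, rank))             # latent entity embeddings
W = rng.normal(size=(n_relations, rank, rank))      # one mixing matrix per relation
adj = rng.random((n_relations, n_entities, n_entities)) < 0.05   # toy observed graph

def latent_score(s, r, o):
    """RESCAL-style bilinear score e_s^T W_r e_o."""
    return float(E[s] @ W[r] @ E[o])

def observable_score(s, o):
    """Observable pattern feature: number of shared neighbours under any relation."""
    out_of_s = adj[:, s, :].any(axis=0)
    into_o = adj[:, :, o].any(axis=0)
    return float(np.sum(out_of_s & into_o))

def combined_score(s, r, o, alpha=1.0, beta=0.5):
    # alpha and beta would normally be learned jointly; they are fixed here for brevity.
    return alpha * latent_score(s, r, o) + beta * observable_score(s, o)

print(combined_score(0, 1, 2))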

Generalized Embedding Model for Knowledge Graph Mining

This paper conjectures that the one-shot supervised learning mechanism is a bottleneck in improving the performance of graph embedding learning, and proposes to extend it with a multi-shot "unsupervised" learning framework in which a 2-layer MLP network is used for every shot.
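
The sketch below is only my reading of that summary: each "shot" (for example, predicting the tail from a head and relation, or the head from a relation and tail) gets its own 2-layer MLP over shared embeddings. The toy dimensions and the nearest-neighbour decoding step are assumptions, not the paper's architecture.

# Illustrative multi-shot setup: one small 2-layer MLP per prediction "shot".
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, dim, hidden = 100, 10, 16, 32

ent = rng.normal(size=(n_entities, dim))
rel = rng.normal(size=(n_relations, dim))

def make_mlp(in_dim, out_dim):
    """Weights of one 2-layer MLP (training loop omitted)."""
    return {"W1": rng.normal(size=(in_dim, hidden)) * 0.1,
            "W2": rng.normal(size=(hidden, out_dim)) * 0.1}

def mlp_forward(params, x):
    return np.tanh(x @ params["W1"]) @ params["W2"]

# One MLP per shot: tail prediction and head prediction.
shots = {"tail_from_head_rel": make_mlp(2 * dim, dim),
         "head_from_rel_tail": make_mlp(2 * dim, dim)}

def predict_tail(h, r):
    """Predict a tail embedding and return the nearest entity as the guess."""
    query = np.concatenate([ent[h], rel[r]])
    target = mlp_forward(shots["tail_from_head_rel"], query)
    return int(np.argmin(np.linalg.norm(ent - target, axis=1)))

print(predict_tail(3, 1))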

Determining the Number of Latent Factors in Statistical Multi-Relational Learning

The focus of this paper is to determine the number of latent factors in the RESCAL model; it designs a specific pseudometric, proves the consistency of the MLEs under this pseudometric, and establishes its rate of convergence.
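
The paper's estimator rests on a pseudometric and a consistency analysis that is not reproduced here; the toy heuristic below only illustrates the underlying question (how many latent factors does a RESCAL-style relation slice need?) by thresholding the singular-value spectrum of a noisy slice, with all constants assumed.

# Naive rank heuristic on a toy relation slice X_k ~ A R_k A^T + noise.
import numpy as np

rng = np.random.default_rng(0)
n, true_rank = 30, 3
A = rng.normal(size=(n, true_rank))
R = rng.normal(size=(true_rank, true_rank))
X = A @ R @ A.T + 0.1 * rng.normal(size=(n, n))

s = np.linalg.svd(X, compute_uv=False)
estimated_rank = int(np.sum(s > 0.05 * s[0]))   # keep components above 5% of the largest
print(estimated_rank)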

Bridging Weighted Rules and Graph Random Walks for Statistical Relational Models

This article provides a simple way to normalize relations and proves that relational logistic regression using normalized relations generalizes the path ranking algorithm, yielding a better understanding of relational learning.
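
A small sketch of the connection as I understand it, with my own toy notation: each relation matrix is row-normalized so that an entity's outgoing weights sum to one, and a path-ranking-style feature is then a product of normalized relation matrices along a relation path.

# Normalized relations and a path-ranking-style feature on toy relation matrices.
import numpy as np

R1 = np.random.default_rng(0).random((5, 5)) < 0.4   # toy relation, e.g. "friendOf"
R2 = np.random.default_rng(1).random((5, 5)) < 0.4   # toy relation, e.g. "worksAt"

def normalize(R):
    """Row-normalize a relation matrix; rows with no outgoing edges stay zero."""
    R = R.astype(float)
    row_sums = R.sum(axis=1, keepdims=True)
    return np.divide(R, row_sums, out=np.zeros_like(R), where=row_sums > 0)

# Feature for the path (friendOf, worksAt): the probability that a random walk
# following the path reaches y from x, as in the path ranking algorithm.
path_feature = normalize(R1) @ normalize(R2)
print(path_feature[0])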

Large-scale Machine Learning over Graphs

An online algorithm for multi-task learning with provable sublinear regret bound is developed, where a latent graph of task interdependencies is dynamically inferred on-the-fly, and a new approach to impose analogical structures among heterogeneous nodes is proposed.

RA-GCN: Relational Aggregation Graph Convolutional Network for Knowledge Graph Completion

This paper observes that a subset of entities may be directly connected to a central entity, and that such similar attributes and relationships can be abstracted into virtual entities and virtual relationships, respectively, to better extract topological relationship features.
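
My loose reading of that idea, sketched with assumed names and mean pooling (the paper's actual aggregation scheme may differ): neighbours attached to a central entity through the same relation are pooled into one virtual entity, so that message passing sees one summarized neighbour per relation.

# Pool same-relation neighbours of a central entity into virtual entities.
import numpy as np

rng = np.random.default_rng(0)
n_entities, dim = 30, 8
E = rng.normal(size=(n_entities, dim))
neighbours = [(0, 3), (0, 7), (0, 12), (1, 5), (1, 9)]   # (relation_id, neighbour_id) pairs

def virtual_neighbours(neigh):
    """Mean-pool neighbours that share a relation into one virtual entity embedding."""
    by_rel = {}
    for rel_id, ent_id in neigh:
        by_rel.setdefault(rel_id, []).append(E[ent_id])
    return {rel_id: np.mean(vecs, axis=0) for rel_id, vecs in by_rel.items()}

virtual = virtual_neighbours(neighbours)
central_update = np.mean(list(virtual.values()), axis=0)   # simple GCN-style aggregation
print(central_update.shape)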

Type-Constrained Representation Learning in Knowledge Graphs

This work integrates prior knowledge in the form of type constraints into various state-of-the-art latent variable approaches and shows that prior knowledge on relation types improves these models by up to 77% in link-prediction tasks.
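
A minimal sketch of how such type constraints act at prediction time, assuming a DistMult-style scorer and toy entity types (neither is taken from the paper): candidate objects outside a relation's admissible range type are excluded before scoring.

# Type-constrained link prediction: score only type-admissible candidate objects.
import numpy as np

rng = np.random.default_rng(0)
n_entities, dim = 20, 8
E = rng.normal(size=(n_entities, dim))
entity_type = rng.integers(0, 3, size=n_entities)   # toy types 0..2
w_r = rng.normal(size=dim)                           # toy DistMult-style relation vector
RELATION_RANGE_TYPE = 2                              # assumed range constraint for this relation

def score(s, o):
    """DistMult score e_s^T diag(w_r) e_o."""
    return float(np.sum(E[s] * w_r * E[o]))

def rank_objects(s):
    # Only type-admissible candidates are scored; all others are excluded outright.
    candidates = np.where(entity_type == RELATION_RANGE_TYPE)[0]
    return candidates[np.argsort([-score(s, o) for o in candidates])]

print(rank_objects(0)[:5])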

Predicting the co-evolution of event and Knowledge Graphs

This paper introduces an additional set of tensors containing temporal information, used to predict the events that will happen in future time steps from both dynamic information in the previous event tensors and static information stored in the knowledge graph.
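
A rough sketch under my own simplifications (the window size, blending weight, and stand-in static score are all assumptions): events at each time step form a tensor slice, and the next step is predicted from a short window of previous event slices together with a static knowledge-graph score.

# Predict the next event slice from recent event tensors plus a static KG score.
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_event_types, T = 20, 3, 6

events = (rng.random((T, n_event_types, n_entities, n_entities)) < 0.05).astype(float)
static_score = rng.random((n_event_types, n_entities, n_entities))   # stand-in for a trained KG model

def predict_next(events, window=2, alpha=0.7):
    """Blend recent event frequencies (dynamic) with the static KG score."""
    dynamic = events[-window:].mean(axis=0)
    return alpha * dynamic + (1 - alpha) * static_score

pred = predict_next(events)
print(pred.shape)   # scores for (event type, subject, object) at the next step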

Analysis of the Impact of Negative Sampling on Link Prediction in Knowledge Graphs

This paper uses state-of-the-art knowledge graph embeddings (RESCAL, TransE, DistMult, and ComplEx), evaluates them on the benchmark datasets FB15k and WN18, and proposes embedding-based negative sampling methods.
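
A hedged sketch of the two sampling styles such a study contrasts, on assumed pretrained embeddings: uniform random corruption of the object versus corruption with nearest neighbours in embedding space, which yields harder negatives.

# Uniform versus embedding-based negative sampling for a true triple (s, r, o).
import numpy as np

rng = np.random.default_rng(0)
n_entities, dim = 100, 16
E = rng.normal(size=(n_entities, dim))   # assumed pretrained entity embeddings

def uniform_negatives(o, k=5):
    """Replace the true object o with up to k uniformly sampled entities."""
    neg = rng.choice(n_entities, size=k + 1, replace=False)
    return neg[neg != o][:k]

def embedding_based_negatives(o, k=5):
    """Replace o with its k nearest neighbours in embedding space (harder negatives)."""
    dist = np.linalg.norm(E - E[o], axis=1)
    dist[o] = np.inf                      # never return the true object itself
    return np.argsort(dist)[:k]

print(uniform_negatives(7), embedding_based_negatives(7))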

Complex-Valued Embedding Models for Knowledge Graphs

An experimental survey of state-of-the-art factorization models, conducted not towards a purely comparative end but as a means to gain insight into their inductive abilities; it also proposes new research directions to improve on existing models, including ComplEx.
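
For concreteness, a minimal sketch of the ComplEx scoring function Re(<e_s, w_r, conj(e_o)>) on random toy embeddings; the dimensions and data are assumptions, and the survey's experimental setup is of course much broader.

# ComplEx triple score on complex-valued toy embeddings.
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, dim = 50, 5, 10

E = rng.normal(size=(n_entities, dim)) + 1j * rng.normal(size=(n_entities, dim))
W = rng.normal(size=(n_relations, dim)) + 1j * rng.normal(size=(n_relations, dim))

def complex_score(s, r, o):
    """phi(s, r, o) = Re(sum_k e_s[k] * w_r[k] * conj(e_o[k]))."""
    return float(np.real(np.sum(E[s] * W[r] * np.conj(E[o]))))

# Swapping subject and object generally changes the score, so ComplEx can model
# antisymmetric relations, unlike a purely real bilinear diagonal model.
print(complex_score(0, 1, 2), complex_score(2, 1, 0))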
...

References


Relational Dependency Networks

This paper presents relational dependency networks (RDNs), graphical models that are capable of expressing and reasoning with such dependencies in a relational setting and outlines the relative strengths of RDNs---namely, the ability to represent cyclic dependencies, simple methods for parameter estimation, and efficient structure learning techniques.

Tensor factorization for relational learning

It is proposed that tensor factorization can be the basis for scalable solutions for learning from relational data, and novel tensor factorization algorithms that are particularly suited for this task are presented.

Modelling Relational Data using Bayesian Clustered Tensor Factorization

The Bayesian Clustered Tensor Factorization (BCTF) model is introduced, which embeds a factorized representation of relations in a nonparametric Bayesian clustering framework that is fully Bayesian but scales well to large data sets.

Link Prediction in Multi-relational Graphs using Additive Models

It is shown that efficient learning can be achieved using an alternating least squares approach exploiting sparse matrix algebra and low-rank approximations, along with a kernel solution that is of interest when it is easy to define sensible kernels.

Scaling Factorization Machines to Relational Data

This work addresses the issue that standard learning algorithms based on the design matrix representation cannot scale to relational predictor variables, by exploiting repeating patterns in the design matrix that stem from the underlying relational structure of the data.
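
A brief sketch of a second-order factorization machine score on a dense toy feature vector; the paper's actual contribution, exploiting the repeated blocks that relational structure induces in the design matrix, is only hinted at in the comments, and all sizes here are assumptions.

# Second-order factorization machine score via the O(k n) reformulation.
import numpy as np

rng = np.random.default_rng(0)
n_features, rank = 12, 4

w0 = 0.0
w = rng.normal(size=n_features)
V = rng.normal(size=(n_features, rank))   # one latent factor vector per feature

def fm_score(x):
    """y(x) = w0 + w.x + sum_{i<j} <v_i, v_j> x_i x_j."""
    linear = w0 + w @ x
    pairwise = 0.5 * np.sum((x @ V) ** 2 - (x ** 2) @ (V ** 2))
    return float(linear + pairwise)

# In relational data many rows of the design matrix repeat block-wise (for
# example, the same user features reappear in every rating by that user); the
# paper exploits exactly this repetition instead of materializing the matrix.
x = rng.random(n_features)
print(fm_score(x))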

A scalable approach for statistical learning in semantic graphs

This paper applies machine learning to semantic graph data, argues that scalability and robustness can be achieved via an urn-based statistical sampling scheme, and applies the urn model to the SUNS framework, which is based on multivariate prediction.

A latent factor model for highly multi-relational data

This paper proposes a method for modeling large multi-relational datasets, with possibly thousands of relations, based on a bilinear structure, which captures various orders of interaction of the data and also shares sparse latent factors across different relations.

Link Prediction in Relational Data

It is shown that the collective classification approach of RMNs, and the introduction of subgraph patterns over link labels, provide significant improvements in accuracy over flat classification, which attempts to predict each link in isolation.

Factorizing YAGO: scalable machine learning for linked data

This work presents an efficient approach to relational learning on LOD data, based on the factorization of a sparse tensor that scales to data consisting of millions of entities, hundreds of relations and billions of known facts, and shows how ontological knowledge can be incorporated in the factorizations to improve learning results and how computation can be distributed across multiple nodes.

Large-scale factorization of type-constrained multi-relational data

This paper extends the recently proposed state-of-the-art RESCAL tensor factorization to consider relational type constraints, and significantly outperforms RESCAL without type constraints in both runtime and prediction quality.
...