A Three-Way Model for Collective Learning on Multi-Relational Data
@inproceedings{Nickel2011ATM,
  title     = {A Three-Way Model for Collective Learning on Multi-Relational Data},
  author    = {Maximilian Nickel and Volker Tresp and Hans-Peter Kriegel},
  booktitle = {International Conference on Machine Learning},
  year      = {2011}
}
Relational learning is becoming increasingly important in many areas of application. Here, we present a novel approach to relational learning based on the factorization of a three-way tensor. We show that unlike other tensor approaches, our method is able to perform collective learning via the latent components of the model and provide an efficient algorithm to compute the factorization. We substantiate our theoretical considerations regarding the collective learning capabilities of our model…
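The abstract describes a bilinear tensor factorization in which each frontal slice of the adjacency tensor is reconstructed as X_k ≈ A R_k A^T, with the entity factor matrix A shared across all relations. The sketch below is a rough illustration of that reconstruction, not the paper's implementation; the entity count, rank, and values are arbitrary assumptions.

```python
import numpy as np

# Illustrative dimensions: 4 entities, rank-2 latent factors, one relation slice.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 2))    # entity factor matrix, shared across all relation slices
R_k = rng.normal(size=(2, 2))  # interaction matrix for relation k (need not be symmetric)

# Reconstruction of one slice of the adjacency tensor: X_k ≈ A @ R_k @ A.T
X_k_hat = A @ R_k @ A.T
print(X_k_hat.shape)  # (4, 4): one score per (subject, object) pair under relation k
```

Because the same A appears on both sides of every slice, information propagates between relations through the shared latent components, which is the mechanism the abstract refers to as collective learning.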
1,647 Citations
A latent factor model for highly multi-relational data
- Computer Science, NIPS
- 2012
This paper proposes a method for modeling large multi-relational datasets, with possibly thousands of relations, based on a bilinear structure, which captures various orders of interaction of the data and also shares sparse latent factors across different relations.
Reducing the Rank in Relational Factorization Models by Including Observable Patterns
- Computer Science, NIPS
- 2014
This work proposes a novel additive tensor factorization model to learn from latent and observable patterns on multi-relational data and presents a scalable algorithm for computing the factorization.
Translating Embeddings for Modeling Multi-relational Data
- Computer Science, NIPS
- 2013
TransE is proposed, a method which models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities, which proves to be powerful since extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
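To make the translation interpretation concrete, here is a minimal scoring sketch; the embedding dimension, values, and the choice of the L1 norm are illustrative assumptions rather than the paper's experimental setup. A triple (h, r, t) is scored by the distance between h + r and t, so plausible triples receive small scores.

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE dissimilarity d(h + r, t); lower means the triple is more plausible."""
    return np.linalg.norm(h + r - t, ord=norm)

# Arbitrary 3-dimensional embeddings, for illustration only.
head = np.array([0.1, 0.4, -0.2])
relation = np.array([0.3, -0.1, 0.5])
tail = np.array([0.4, 0.3, 0.3])
print(transe_score(head, relation, tail))
```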
Tensor factorization for relational learning
- Computer Science
- 2013
It is proposed that tensor factorization can be the basis for scalable solutions for learning from relational data, and novel tensor factorization algorithms that are particularly suited for this task are presented.
Logistic Tensor Factorization for Multi-Relational Data
- Computer Science, ArXiv
- 2013
This work extends the RESCAL tensor factorization, which has shown state-of-the-art results for multi-relational learning, to account for the binary nature of adjacency tensors and shows that the logistic extension can improve the prediction results significantly.
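A minimal sketch of the idea, assuming the same bilinear RESCAL score as in the abstract above: the raw scores A R_k A^T are passed through a sigmoid so every predicted entry lies in (0, 1) and can be read as a link probability, matching the binary adjacency tensor. Dimensions and values below are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative factors; in practice A and R_k would be learned from data.
rng = np.random.default_rng(1)
A = rng.normal(size=(4, 2))
R_k = rng.normal(size=(2, 2))

P_k = sigmoid(A @ R_k @ A.T)  # predicted link probabilities for relation k
print(P_k)
```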
Multi-relational Weighted Tensor Decomposition
- Computer Science
- 2012
This paper examines a multi-relational learning scenario in which the learner is given a small training set, sampled from the set of all potential pairwise relationships, and aims to perform transductive inference on the remaining, unknown relationships.
An Analysis of Tensor Models for Learning on Structured Data
- Computer Science, ECML/PKDD
- 2013
The tensor product is discussed as a principled way to represent structured data in vector spaces for machine learning tasks, and known bounds for matrix factorizations are extended to derive generalization error bounds for the tensor case.
Large-scale factorization of type-constrained multi-relational data
- Computer Science, 2014 International Conference on Data Science and Advanced Analytics (DSAA)
- 2014
This paper extends the recently proposed state-of-the-art RESCAL tensor factorization to consider relational type constraints; the extended model significantly outperforms RESCAL without type constraints in both runtime and prediction quality.
Multi-Relational Learning at Scale with ADMM
- Computer Science, ArXiv
- 2016
This model, called ConsMRF, is a novel and scalable approach to multi-relational factorization based on consensus optimization within the Alternating Direction Method of Multipliers (ADMM) framework, which enables it to optimize each target relation with fewer parameters than state-of-the-art competitors on this task.
Analogical Inference for Multi-relational Embeddings
- Computer Science, ICML
- 2017
This paper proposes a novel framework for optimizing the latent representations with respect to the analogical properties of the embedded entities and relations, and offers an elegant unification of several well-known multi-relational embedding methods.
References
Showing 1-10 of 21 references
Relational learning via collective matrix factorization
- Computer Science, KDD
- 2008
This model generalizes several existing matrix factorization methods, and therefore yields new large-scale optimization algorithms for these problems, which can handle any pairwise relational schema and a wide variety of error models.
Modelling Relational Data using Bayesian Clustered Tensor Factorization
- Computer Science, NIPS
- 2009
The Bayesian Clustered Tensor Factorization (BCTF) model is introduced, which embeds a factorized representation of relations in a nonparametric Bayesian clustering framework that is fully Bayesian but scales well to large data sets.
Beyond streams and graphs: dynamic tensor analysis
- Computer Science, KDD '06
- 2006
The dynamic tensor analysis (DTA) method and its variants are introduced; DTA provides a compact summary for high-order and high-dimensional data and reveals hidden correlations.
Learning Probabilistic Relational Models
- Computer Science, IJCAI
- 1999
This paper describes both parameter estimation and structure learning (the automatic induction of the dependency structure in a model) and shows how the learning procedure can exploit standard database retrieval techniques for efficient learning from large datasets.
Multivariate Prediction for Learning on the Semantic Web
- Computer Science, ILP
- 2010
It is argued that multivariate prediction approaches are most suitable for dealing with the resulting high-dimensional sparse data matrix; within this statistical framework, the approach scales up to large domains and is able to deal with highly sparse relationship data.
Collective Classification in Network Data
- Computer Science, AI Mag.
- 2008
This article introduces four of the most widely used inference algorithms for classifying networked data and empirically compares them on both synthetic and real-world data.
TripleRank: Ranking Semantic Web Data by Tensor Decomposition
- Computer Science, SEMWEB
- 2009
This paper presents TripleRank, a novel approach for faceted authority ranking in the context of RDF knowledge bases that captures the additional latent semantics of Semantic Web data by means of statistical methods in order to produce richer descriptions of the available data.
Temporal Analysis of Semantic Graphs Using ASALSAN
- Computer Science, Seventh IEEE International Conference on Data Mining (ICDM 2007)
- 2007
The mixture of roles assigned to individuals by ASALSAN showed strong correspondence with known job classifications, and the patterns of communication between these roles, e.g., between top executives and the legal department, were also apparent in the solutions.
Statistical predicate invention
- Computer Science, ICML '07
- 2007
This work proposes an initial model for SPI based on second-order Markov logic, in which predicates as well as arguments can be variables, and the domain of discourse is not fully known in advance.
Tensor Decompositions and Applications
- Computer Science, SIAM Rev.
- 2009
This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order…