Corpus ID: 9293483

Multi-relational Learning Using Weighted Tensor Decomposition with Modular Loss

@article{London2013MultirelationalLU,
  title={Multi-relational Learning Using Weighted Tensor Decomposition with Modular Loss},
  author={Ben London and Theodoros Rekatsinas and Bert Huang and L. Getoor},
  journal={ArXiv},
  year={2013},
  volume={abs/1303.1733}
}
We propose a modular framework for multi-relational learning via tensor decomposition. In our learning setting, the training data contains multiple types of relationships among a set of objects, which we represent by a sparse three-mode tensor. The goal is to predict the values of the missing entries. To do so, we model each relationship as a function of a linear combination of latent factors. We learn this latent representation by computing a low-rank tensor decomposition, using quasi-Newton…
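The abstract outlines the general recipe: represent the observed relations as a sparse three-mode tensor, model each entry as a multilinear combination of latent factors, and fit those factors by minimizing a per-entry loss over the observed entries with a quasi-Newton method. Below is a minimal sketch of that recipe in Python, assuming a CP-style factorization, squared loss as one interchangeable choice of per-entry loss, and SciPy's L-BFGS; it is illustrative only, not the authors' implementation, and names such as `fit_weighted_cp` and `squared_loss` are ours.

```python
# Minimal sketch: weighted CP-style decomposition of a sparse 3-mode tensor,
# fit with a quasi-Newton method (L-BFGS) and a swappable per-entry loss.
# Illustrative only -- not the authors' implementation.
import numpy as np
from scipy.optimize import minimize

def squared_loss(pred, target):
    """One interchangeable choice of per-entry loss."""
    return 0.5 * (pred - target) ** 2

def fit_weighted_cp(shape, idx, vals, rank, loss=squared_loss, lam=1e-3, seed=0):
    """Fit rank-`rank` factors to a sparse tensor given by coordinate arrays
    `idx = (i, j, k)` and values `vals`; only observed entries enter the loss."""
    I, J, K = shape
    rng = np.random.default_rng(seed)
    x0 = 0.1 * rng.standard_normal((I + J + K) * rank)
    ii, jj, kk = idx

    def unpack(x):
        A = x[:I * rank].reshape(I, rank)
        B = x[I * rank:(I + J) * rank].reshape(J, rank)
        C = x[(I + J) * rank:].reshape(K, rank)
        return A, B, C

    def objective(x):
        A, B, C = unpack(x)
        # Predicted value of each observed entry: sum_r A[i,r] * B[j,r] * C[k,r]
        pred = np.einsum("nr,nr,nr->n", A[ii], B[jj], C[kk])
        return np.sum(loss(pred, vals)) + 0.5 * lam * np.sum(x ** 2)

    # Gradients are approximated by finite differences here; adequate for a toy example.
    result = minimize(objective, x0, method="L-BFGS-B")
    return unpack(result.x)

# Toy usage: a 4 x 4 x 2 tensor with four observed entries.
idx = (np.array([0, 1, 2, 3]), np.array([1, 2, 3, 0]), np.array([0, 0, 1, 1]))
vals = np.array([1.0, 1.0, 0.0, 1.0])
A, B, C = fit_weighted_cp((4, 4, 2), idx, vals, rank=2)
score = A[0] @ (B[3] * C[1])  # predicted value of an unobserved entry
```

Swapping `squared_loss` for, say, a logistic loss changes only the `loss` argument, which is the sense in which the per-entry loss is treated as a pluggable component in this sketch.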

Citations

Tensor factorization for relational learning
It is proposed that tensor factorization can be the basis for scalable solutions for learning from relational data, and novel tensor factorization algorithms particularly suited to this task are presented.
Iterative Splits of Quadratic Bounds for Scalable Binary Tensor Factorization
This work shows that an alternative approach is to minimize the quadratic loss (root mean square error), which leads to algorithms whose training time complexity is reduced from O(n) to O(m), as proposed earlier in the restricted case of alternating least-squares algorithms.
Multi-tensor Completion with Common Structures
A novel common structure for multi-data learning is proposed, which assumes that datasets share a Common Adjacency Graph (CAG) structure and is more robust to heterogeneity and imbalance across datasets.
Large-scale factorization of type-constrained multi-relational data
This paper extends the recently proposed state-of-the-art RESCAL tensor factorization to consider relational type constraints, and significantly outperforms RESCAL without type constraints in both runtime and prediction quality.
Zero-Truncated Poisson Tensor Factorization for Massive Binary Tensors
A scalable Bayesian model for low-rank factorization of massive tensors with binary observations is presented, using a zero-truncated Poisson likelihood for the binary data, achieving excellent computational scalability, and demonstrating its usefulness in leveraging side information provided in the form of mode networks.
Logistic Tensor Factorization for Multi-Relational Data
This work extends the RESCAL tensor factorization, which has shown state-of-the-art results for multi-relational learning, to account for the binary nature of adjacency tensors, and shows that the logistic extension can improve the prediction results significantly.
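For intuition, the logistic extension described above amounts to passing a RESCAL-style bilinear score a_i^T R_k a_j through a sigmoid so that predicted entries of the binary adjacency tensor lie in [0, 1]. A tiny illustrative snippet, with factor matrices assumed given and all names ours:

```python
# Illustrative snippet: logistic link over a RESCAL-style bilinear score.
# A holds one latent vector per entity; R[k] is the interaction matrix of relation k.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def link_probability(A, R, i, k, j):
    """Predicted probability that entity i relates to entity j under relation k."""
    return sigmoid(A[i] @ R[k] @ A[j])

# Toy usage: 5 entities, 2 relation types, rank-3 factors.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
R = rng.standard_normal((2, 3, 3))
p = link_probability(A, R, i=0, k=1, j=4)
```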
Multi-Task Metric Learning on Network Data
A multi-task version of SPML, abbreviated MT-SPML, is proposed, which learns across multiple related tasks on multiple networks via a shared intermediate parametrization; it works on general networks and is thus suitable for a wide variety of problems.
Using Joint Tensor Decomposition on RDF Graphs
The goal of this thesis is to develop and evaluate models for joint tensor factorization on RDF graphs that arise from parameter estimation in statistical models or from algebraic approaches, and that yield promising results in spite of being derived from an ad-hoc approach.
Complex-Valued Embedding Models for Knowledge Graphs
An experimental survey of state-of-the-art factorization models is conducted, not toward a purely comparative end but as a means to gain insight into their inductive abilities, and new research directions to improve on existing models, including ComplEx, are proposed.
Probabilistic Latent-Factor Database Models
We describe a general framework for modelling probabilistic databases using factorization approaches. The framework includes tensor-based approaches which have been very successful in modelling…

References

Showing 1-10 of 25 references
A Three-Way Model for Collective Learning on Multi-Relational Data
This work presents a novel approach to relational learning based on the factorization of a three-way tensor, which is able to perform collective learning via the latent components of the model, and provides an efficient algorithm to compute the factorization.
Link Pattern Prediction with tensor decomposition in multi-relational networks
A tensor decomposition model is proposed to solve the LPP problem, which makes it possible to capture the correlations among different relation types and to reveal the impact of various relations on prediction performance.
Scalable Tensor Factorizations with Missing Data
An algorithm called CP-WOPT (CP Weighted OPTimization) is developed using a first-order optimization approach to solve the weighted least squares problem of CANDECOMP/PARAFAC, and is shown to successfully factor tensors with noise and up to 70% missing data.
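For reference, the weighted least squares problem mentioned above can be written as minimizing f(A, B, C) = 0.5 * || W * (Z - [[A, B, C]]) ||^2, where W is a binary tensor marking the observed entries of Z, and the gradients with respect to the factor matrices have closed forms in terms of mode-n unfoldings and Khatri-Rao products. A sketch of that objective/gradient pair (illustrative, not the reference CP-WOPT code):

```python
# Sketch of the CP-WOPT objective and its gradients for a 3-mode tensor with a
# binary weight tensor W marking observed entries (illustrative only).
import numpy as np
from scipy.linalg import khatri_rao

def unfold(T, mode):
    """Mode-n unfolding in the Kolda/Bader convention."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1, order="F")

def cp_wopt_value_and_grads(Z, W, A, B, C):
    """Return f = 0.5 * ||W * (Z - [[A,B,C]])||^2 and its gradients w.r.t. A, B, C."""
    Zhat = np.einsum("ir,jr,kr->ijk", A, B, C)  # low-rank reconstruction
    resid = W * (Z - Zhat)                      # residual on observed entries only
    f = 0.5 * np.sum(resid ** 2)
    gA = -unfold(resid, 0) @ khatri_rao(C, B)
    gB = -unfold(resid, 1) @ khatri_rao(C, A)
    gC = -unfold(resid, 2) @ khatri_rao(B, A)
    return f, gA, gB, gC
```

These value/gradient pairs can be handed to any first-order or quasi-Newton optimizer.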
Modelling Relational Data using Bayesian Clustered Tensor Factorization
The Bayesian Clustered Tensor Factorization (BCTF) model is introduced, which embeds a factorized representation of relations in a nonparametric Bayesian clustering framework that is fully Bayesian but scales well to large data sets.
Temporal Collaborative Filtering with Bayesian Probabilistic Tensor Factorization
This work proposes a factor-based algorithm that is able to take time into account, and provides a fully Bayesian treatment to avoid tuning parameters and achieve automatic model complexity control.
Temporal Link Prediction Using Matrix and Tensor Factorizations
This article considers bipartite graphs that evolve over time, examines matrix- and tensor-based methods for predicting future links, and shows that tensor-based techniques are particularly effective for temporal data with varying periodic patterns.
Link Propagation: A Fast Semi-supervised Learning Algorithm for Link Prediction
We propose Link Propagation as a new semi-supervised learning method for link prediction problems, where the task is to predict unknown parts of the network structure by using auxiliary information…
Fast maximum margin matrix factorization for collaborative prediction
This work investigates a direct gradient-based optimization method for MMMF, finds that MMMF substantially outperforms all nine methods tested, and demonstrates it on large collaborative prediction problems.
Multilinear algebra for analyzing data with multiple linkages
It is shown that multilinear algebra provides a tool for multi-link analysis, and how the PARAFAC decomposition can be used to understand the structure of the document space and to define paper-paper similarities based on multiple linkages.
Collaborative Filtering on a Budget
This paper proposes a new model for representing and compressing matrix factors via hashing, which allows essentially unbounded numbers of users and items to be represented in a pre-defined memory footprint, at a graceful storage/performance trade-off.