• Publications
A Three-Way Model for Collective Learning on Multi-Relational Data
TLDR
We present a novel approach to relational learning based on the factorization of a three-way tensor.
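As a rough sketch (not the paper's code), the three-way factorization scores a triple (subject, relation, object) bilinearly: each entity gets an embedding vector and each relation a small core matrix, and the full tensor of facts is reconstructed from them. Dimensions and variable names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, d = 5, 2, 4

# Entity embeddings A (one d-vector per entity) and one d x d core matrix per relation.
A = rng.normal(size=(n_entities, d))
R = rng.normal(size=(n_relations, d, d))

def score(s, r, o):
    """Bilinear score of triple (s, r, o): a_s^T R_r a_o."""
    return A[s] @ R[r] @ A[o]

# Reconstruct the whole (relation, subject, object) tensor at once.
X_hat = np.einsum('sd,rde,oe->rso', A, R, A)
```

Training then amounts to fitting `A` and `R` so that `X_hat` matches the observed facts; a high score for an unobserved triple suggests a new edge in the graph.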
Holographic Embeddings of Knowledge Graphs
TLDR
We propose holographic embeddings (HOLE) to learn compositional vector space representations of entire knowledge graphs.
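The compositional operator in holographic embeddings is circular correlation of the subject and object vectors, which can be computed in O(d log d) with the FFT. A minimal sketch (illustrative function names, not the authors' code):

```python
import numpy as np

def circular_correlation(a, b):
    """[a ⋆ b]_k = sum_i a_i * b_{(i + k) mod d}, via the cross-correlation theorem."""
    return np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)).real

def hole_score(e_s, e_o, r):
    """Score a triple by matching the composed pair against the relation vector."""
    return r @ circular_correlation(e_s, e_o)
```

Unlike the tensor-product composition it approximates, circular correlation keeps the composed representation in the same d-dimensional space as the inputs.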
A Review of Relational Machine Learning for Knowledge Graphs
TLDR
We provide a review of how such statistical models can be trained on large knowledge graphs, and then used to predict new facts about the world (which is equivalent to predicting new edges in the graph).
Poincaré Embeddings for Learning Hierarchical Representations
TLDR
We introduce a new approach for learning hierarchical representations of symbolic data by embedding them into hyperbolic space, or more precisely into an n-dimensional Poincaré ball.
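The key quantity is the geodesic distance inside the unit Poincaré ball, which grows rapidly as points approach the boundary and so naturally encodes tree-like hierarchy. A small sketch of that distance (assuming inputs strictly inside the unit ball):

```python
import numpy as np

def poincare_distance(u, v):
    """Geodesic distance between points u, v in the open unit Poincaré ball:
    d(u, v) = arccosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))."""
    sq_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq_diff / denom)
```

Embeddings are then optimized so that hierarchically related symbols end up close under this metric, with general terms near the origin and specific ones near the boundary.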
Factorizing YAGO: scalable machine learning for linked data
TLDR
We present an efficient approach to relational learning on LOD data, based on the factorization of a sparse tensor that scales to data consisting of millions of entities, hundreds of relations and billions of known facts.
Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry
TLDR
We study different models of hyperbolic space and find that learning embeddings in the Lorentz model is substantially more efficient than in the Poincaré-ball model.
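In the Lorentz (hyperboloid) model, distances come from the Lorentzian scalar product rather than a fraction of norms, which avoids the numerical issues near the Poincaré ball's boundary. A minimal sketch (illustrative, assuming points already lie on the hyperboloid):

```python
import numpy as np

def lorentz_inner(x, y):
    """Lorentzian scalar product: <x, y>_L = -x_0 * y_0 + sum_i x_i * y_i."""
    return -x[0] * y[0] + x[1:] @ y[1:]

def lorentz_distance(x, y):
    """Geodesic distance on the hyperboloid: d(x, y) = arccosh(-<x, y>_L)."""
    return np.arccosh(-lorentz_inner(x, y))
```

For a point at hyperbolic radius t from the "origin" (1, 0, ...), i.e. (cosh t, sinh t, 0, ...), this distance is exactly t, which makes the formula easy to sanity-check.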
Hearst Patterns Revisited: Automatic Hypernym Detection from Large Text Corpora
TLDR
We find that simple pattern-based methods consistently outperform distributional methods on several hypernymy tasks, including detection, direction prediction, and graded entailment ranking.
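Hearst patterns are lexico-syntactic templates such as "X such as Y" that signal a hypernym–hyponym pair. A toy sketch of one such pattern (the paper uses a larger, POS-aware pattern set; the regex here is a deliberate simplification):

```python
import re

# One classic Hearst pattern: "<hypernym> such as <hyponym>(, <hyponym>)*".
PATTERN = re.compile(r"(\w+) such as ((?:\w+, )*\w+)")

def extract_pairs(text):
    """Return (hyponym, hypernym) pairs matched by the toy pattern."""
    pairs = []
    for m in PATTERN.finditer(text):
        hypernym = m.group(1)
        for hyponym in m.group(2).split(", "):
            pairs.append((hyponym, hypernym))
    return pairs
```

Counting such matches over a large corpus yields the sparse pattern-based features that the paper compares against distributional similarity scores.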
Hyperbolic Graph Neural Networks
TLDR
We propose a novel GNN architecture for learning representations on Riemannian manifolds with differentiable exponential and logarithmic maps, comparing Euclidean and hyperbolic geometry.
Learning Visually Grounded Sentence Representations
TLDR
We train a grounded sentence encoder that achieves good performance on COCO caption and image retrieval and subsequently show that this encoder can be transferred to various NLP tasks, with improved performance over text-only models.
Task-Driven Modular Networks for Zero-Shot Compositional Learning
TLDR
We propose a task-driven modular architecture for compositional reasoning and sample efficient learning.