• Publications
A Three-Way Model for Collective Learning on Multi-Relational Data
TLDR
This work presents a novel approach to relational learning based on the factorization of a three-way tensor, which is able to perform collective learning via the latent components of the model, and provides an efficient algorithm to compute the factorization.
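The factorization sketched below illustrates the idea: every frontal slice X_k of the adjacency tensor is approximated as A R_k Aᵀ with a single entity matrix A shared across relations, and that sharing is what enables collective learning. This is a minimal sketch with illustrative names and random data, not the paper's training algorithm.

```python
import numpy as np

# Three-way factorization sketch: slice X_k ≈ A @ R[k] @ A.T,
# with one latent matrix A shared by all relations.
rng = np.random.default_rng(0)
n_entities, n_relations, rank = 5, 2, 3

A = rng.normal(size=(n_entities, rank))          # one latent vector per entity
R = rng.normal(size=(n_relations, rank, rank))   # one core matrix per relation

def score(i, k, j):
    """Predicted strength of the triple (entity i, relation k, entity j)."""
    return A[i] @ R[k] @ A[j]

# Reconstruct all slices at once: X_hat[k, i, j] = A[i] @ R[k] @ A[j]
X_hat = np.einsum('ir,krs,js->kij', A, R, A)
```

Because A appears on both sides of each slice, evidence about an entity in one relation propagates to its predictions in every other relation.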
Holographic Embeddings of Knowledge Graphs
TLDR
Holographic embeddings are proposed to learn compositional vector space representations of entire knowledge graphs; they outperform state-of-the-art methods for link prediction on knowledge graphs and on relational learning benchmark datasets.
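The compositional operator behind holographic embeddings is circular correlation, which can be evaluated in O(d log d) via the FFT correlation theorem. A minimal sketch (variable names illustrative, random vectors only):

```python
import numpy as np

def circular_correlation(h, t):
    """[h * t]_k = sum_i h[i] * t[(k + i) % d], computed via the FFT."""
    return np.fft.ifft(np.conj(np.fft.fft(h)) * np.fft.fft(t)).real

def hole_score(h, r, t):
    """Triple score: relation vector r matched against the correlation of h and t."""
    return r @ circular_correlation(h, t)

rng = np.random.default_rng(0)
h, r, t = rng.normal(size=(3, 8))

# Direct O(d^2) definition, for comparison with the FFT version
d = len(h)
direct = np.array([sum(h[i] * t[(k + i) % d] for i in range(d)) for k in range(d)])
```

Unlike a full tensor product, the correlation compresses the pairwise interactions into a single d-dimensional vector, which keeps the parameter count linear in d.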
Poincaré Embeddings for Learning Hierarchical Representations
TLDR
This work introduces a new approach for learning hierarchical representations of symbolic data by embedding them into hyperbolic space -- or more precisely, into an n-dimensional Poincaré ball -- and presents an efficient algorithm to learn the embeddings based on Riemannian optimization.
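The metric these embeddings are trained under is the Poincaré ball distance; a small sketch of it (assuming curvature -1 and points strictly inside the unit ball):

```python
import numpy as np

def poincare_distance(u, v):
    """Geodesic distance in the Poincaré ball; assumes ||u||, ||v|| < 1."""
    sq_dist = np.sum((u - v) ** 2)
    denom = (1 - np.sum(u ** 2)) * (1 - np.sum(v ** 2))
    return np.arccosh(1 + 2 * sq_dist / denom)

origin = np.zeros(2)
u = np.array([0.5, 0.0])
# Distances grow without bound near the boundary, which is what gives
# the ball room to embed tree-like hierarchies with low distortion.
near_boundary = np.array([0.999, 0.0])
```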
A Review of Relational Machine Learning for Knowledge Graphs
TLDR
This paper reviews how statistical models can be “trained” on large knowledge graphs and then used to predict new facts about the world (equivalent to predicting new edges in the graph), and how such statistical models of graphs can be combined with text-based information extraction methods to automatically construct knowledge graphs from the Web.
Factorizing YAGO: scalable machine learning for linked data
TLDR
This work presents an efficient approach to relational learning on Linked Open Data, based on the factorization of a sparse tensor, that scales to data consisting of millions of entities, hundreds of relations, and billions of known facts; it shows how ontological knowledge can be incorporated into the factorization to improve learning results, and how the computation can be distributed across multiple nodes.
Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry
TLDR
It is shown that an embedding in hyperbolic space can reveal important aspects of a company's organizational structure as well as historical relationships between language families.
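The Lorentz model used here places points on a hyperboloid rather than in a ball; its inner product and distance can be sketched as follows (curvature -1 and illustrative names assumed):

```python
import numpy as np

def lorentz_inner(u, v):
    """Lorentzian inner product: -u[0]*v[0] + <u[1:], v[1:]>."""
    return -u[0] * v[0] + u[1:] @ v[1:]

def lorentz_distance(u, v):
    """Geodesic distance on the hyperboloid <x, x>_L = -1, x[0] > 0."""
    return np.arccosh(np.clip(-lorentz_inner(u, v), 1.0, None))

def lift(x):
    """Lift spatial coordinates x onto the hyperboloid by solving <p, p>_L = -1."""
    return np.concatenate(([np.sqrt(1 + x @ x)], x))

u = lift(np.array([0.3, 0.1]))
v = lift(np.array([-0.2, 0.4]))
```

Compared with the Poincaré ball, the hyperboloid avoids the numerical instability of points crowding against the unit-ball boundary, which is one motivation the paper gives for this model.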
Task-Driven Modular Networks for Zero-Shot Compositional Learning
TLDR
This study focuses on the problem of compositional zero-shot classification of object-attribute categories and shows that current evaluation metrics are flawed, as they only consider unseen object-attribute pairs.
Hyperbolic Graph Neural Networks
TLDR
A novel GNN architecture is proposed for learning representations on Riemannian manifolds with differentiable exponential and logarithmic maps, and a scalable algorithm is developed for modeling the structural properties of graphs, comparing Euclidean and hyperbolic geometry.
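The differentiable exponential and logarithmic maps mentioned above move vectors between a tangent space and the manifold; for the Poincaré ball at the origin (curvature -1, conventions assumed for illustration) they reduce to:

```python
import numpy as np

def exp0(v):
    """Exponential map at the origin: tangent vector -> point inside the unit ball."""
    n = np.linalg.norm(v)
    return np.tanh(n) * v / n if n > 0 else v

def log0(y):
    """Logarithmic map at the origin: inverse of exp0."""
    n = np.linalg.norm(y)
    return np.arctanh(n) * y / n if n > 0 else y

v = np.array([0.7, -0.4])
```

In a hyperbolic GNN, aggregation can be done on log-mapped (tangent-space) vectors with ordinary linear algebra and the result mapped back with exp0, which is what makes the architecture differentiable end to end.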
Learning Visually Grounded Sentence Representations
TLDR
In this work, grounded sentence representations are investigated: a sentence encoder is trained to predict the image features of a given caption, and the resultant features are used as sentence representations.
Hearst Patterns Revisited: Automatic Hypernym Detection from Large Text Corpora
TLDR
It is found that simple pattern-based methods consistently outperform distributional methods on common benchmark datasets, and that pattern-based models provide important contextual constraints which are not yet captured in distributional methods.
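The pattern-based methods in question match Hearst-style lexico-syntactic templates such as "X such as Y". A toy sketch is below; these regexes and names are illustrative only, while the paper's pattern set is richer and applied to parsed text.

```python
import re

# (pattern, whether the SECOND capture group is the hyponym)
PATTERNS = [
    (re.compile(r'(\w+?)s? such as (\w+)'), True),      # "animals such as cats"
    (re.compile(r'(\w+?)s? including (\w+)'), True),    # "metals including copper"
    (re.compile(r'(\w+) and other (\w+?)s?\b'), False), # "cats and other animals"
]

def extract_hypernym_pairs(text):
    """Return (hyponym, hypernym) pairs matched by the toy patterns."""
    pairs = []
    for pattern, second_is_hyponym in PATTERNS:
        for m in pattern.finditer(text.lower()):
            if second_is_hyponym:
                pairs.append((m.group(2), m.group(1)))
            else:
                pairs.append((m.group(1), m.group(2)))
    return pairs
```

High precision comes cheaply here; the contextual constraint is that the template itself asserts the is-a relation, which purely distributional co-occurrence statistics do not.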
...