Publications
Supervised Learning of Universal Sentence Representations from Natural Language Inference Data
TLDR: In this paper, we show how universal sentence representations trained using the supervised data of the Stanford Natural Language Inference datasets can consistently outperform unsupervised methods like SkipThought vectors on a wide range of transfer tasks.
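The encoder behind these representations is a bidirectional LSTM with max pooling over time. A minimal PyTorch sketch, with illustrative class name and dimensions (not the released code):

```python
import torch
import torch.nn as nn

class BiLSTMMaxEncoder(nn.Module):
    def __init__(self, emb_dim=300, hidden_dim=2048):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden_dim,
                            bidirectional=True, batch_first=True)

    def forward(self, word_embs):
        # word_embs: (batch, seq_len, emb_dim) pre-trained word vectors
        hidden, _ = self.lstm(word_embs)   # (batch, seq_len, 2*hidden_dim)
        sent_repr, _ = hidden.max(dim=1)   # max-pool over the time dimension
        return sent_repr                   # (batch, 2*hidden_dim)

# Usage: encode 4 sentences of length 10 with 300-d word vectors -> (4, 4096)
encoder = BiLSTMMaxEncoder()
vecs = encoder(torch.randn(4, 10, 300))
```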
Personalizing Dialogue Agents: I have a dog, do you have pets too?
TLDR: Chit-chat models are known to have several problems: they lack specificity, do not display a consistent personality, and are often not very captivating.
Poincaré Embeddings for Learning Hierarchical Representations
TLDR: We introduce a new approach for learning hierarchical representations of symbolic data by embedding them into hyperbolic space, or more precisely into an n-dimensional Poincaré ball.
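The key quantity is the Poincaré distance on the open unit ball, which grows rapidly near the boundary and so naturally encodes hierarchy (general concepts near the origin, specific ones near the boundary); for reference:

```latex
d(u, v) = \operatorname{arcosh}\!\left(1 + 2\,\frac{\lVert u - v \rVert^2}{(1 - \lVert u \rVert^2)(1 - \lVert v \rVert^2)}\right),
\qquad u, v \in \mathcal{B}^n = \{\, x \in \mathbb{R}^n : \lVert x \rVert < 1 \,\}
```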
SentEval: An Evaluation Toolkit for Universal Sentence Representations
TLDR: We introduce SentEval, a toolkit for evaluating the quality of universal sentence representations.
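SentEval's interface asks the user for two callbacks, prepare and batcher, and runs the transfer tasks. A minimal sketch, assuming the toolkit is installed and its data downloaded; the toy CRC-seeded random encoder below stands in for whatever sentence encoder is under evaluation:

```python
import zlib
import numpy as np
import senteval

def prepare(params, samples):
    # One-time, task-specific setup (e.g., building a vocabulary); unused here.
    return

def batcher(params, batch):
    # Toy encoder: average a fixed pseudo-random vector per token.
    def word_vec(w):
        rng = np.random.RandomState(zlib.crc32(w.encode("utf-8")))
        return rng.randn(300)
    return np.vstack([np.mean([word_vec(w) for w in sent] or [np.zeros(300)],
                              axis=0)
                      for sent in batch])

params = {"task_path": "data/senteval_data", "usepytorch": False, "kfold": 10}
results = senteval.engine.SE(params, batcher, prepare).eval(["MR", "CR", "SST2"])
```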
HyperLex: A Large-Scale Evaluation of Graded Lexical Entailment
TLDR: We introduce HyperLex, a dataset and evaluation resource that quantifies the extent of semantic category membership (the type-of relation, also known as hyponymy–hypernymy or lexical entailment, LE) between 2,616 concept pairs.
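Evaluation on HyperLex comes down to a Spearman correlation between a model's graded entailment scores and the human ratings. A hedged sketch; the triple format and the trivial stand-in scorer are illustrative, not the official scripts:

```python
from scipy.stats import spearmanr

def evaluate_hyperlex(pairs, score):
    """pairs: iterable of (word_x, word_y, human_rating) triples."""
    gold = [rating for _, _, rating in pairs]
    pred = [score(x, y) for x, y, _ in pairs]
    return spearmanr(gold, pred).correlation

# Demo with made-up ratings and a character-overlap scorer as a stand-in:
demo = [("dog", "animal", 9.8), ("animal", "dog", 1.2), ("car", "fruit", 0.3)]
rho = evaluate_hyperlex(demo, lambda x, y: len(set(x) & set(y)))
```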
Learning Image Embeddings using Convolutional Neural Networks for Improved Multi-Modal Semantics
TLDR: We construct multi-modal concept representations by concatenating a skip-gram linguistic representation vector with a visual concept representation vector computed using the feature extraction layers of a deep convolutional neural network trained on a large labeled object recognition dataset.
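The fusion step itself is vector concatenation. A minimal NumPy sketch; the per-modality L2 normalization and the 300/4096 dimensions are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def multimodal_vector(linguistic_vec, visual_vec):
    # Fuse a skip-gram word vector with CNN image features by concatenation,
    # normalizing each modality so neither dominates by scale (our choice).
    ling = linguistic_vec / np.linalg.norm(linguistic_vec)
    vis = visual_vec / np.linalg.norm(visual_vec)
    return np.concatenate([ling, vis])

# e.g., a 300-d skip-gram vector + 4096-d CNN features -> 4396-d concept vector
fused = multimodal_vector(np.random.randn(300), np.random.randn(4096))
```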
Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry
TLDR: We study different models of hyperbolic space and find that learning embeddings in the Lorentz model is substantially more efficient than in the Poincaré-ball model.
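In the Lorentz (hyperboloid) model, points live on the upper sheet of a hyperboloid and the distance has a closed form in terms of the Lorentzian inner product, which is what makes Riemannian optimization there efficient:

```latex
\langle x, y \rangle_{\mathcal{L}} = -x_0 y_0 + \sum_{i=1}^{n} x_i y_i,
\qquad
d_{\mathcal{L}}(x, y) = \operatorname{arcosh}\bigl(-\langle x, y \rangle_{\mathcal{L}}\bigr),
\qquad
x \in \mathcal{H}^n = \{\, x : \langle x, x \rangle_{\mathcal{L}} = -1,\ x_0 > 0 \,\}
```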
Adversarial NLI: A New Benchmark for Natural Language Understanding
TLDR: We introduce a new large-scale NLI benchmark dataset, collected via an iterative, adversarial human-and-model-in-the-loop procedure.
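The collection procedure is easiest to see as a loop. A hedged sketch of its shape only; all callables are injected and their names are ours, not the paper's pipeline:

```python
def collect_adversarial_nli(seed_data, train, write_example, verify,
                            num_rounds=3, per_round=100):
    # Iterative human-and-model-in-the-loop collection: retrain, let writers
    # hunt for model-fooling examples, keep the verified ones, repeat.
    data = list(seed_data)
    for _ in range(num_rounds):
        model = train(data)                  # retrain on all data so far
        new = []
        while len(new) < per_round:
            ex = write_example(model)        # writer tries to fool the model
            fooled = model(ex["premise"], ex["hypothesis"]) != ex["label"]
            if fooled and verify(ex):        # keep only verified fooling examples
                new.append(ex)
        data.extend(new)                     # harder data seeds the next round
    return data
```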
Specializing Word Embeddings for Similarity or Relatedness
TLDR: We demonstrate the advantage of specializing semantic word embeddings for either similarity or relatedness for downstream NLP tasks and applications.
Dynamic Meta-Embeddings for Improved Sentence Representations
TLDR: We introduce dynamic meta-embeddings, a simple yet effective method for the supervised learning of embedding ensembles, which leads to state-of-the-art performance within the same model class on a variety of tasks.
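The mechanism: project each embedding set into a shared space, score each projection with a learned gate, and take the softmax-weighted sum per token. A hedged PyTorch sketch with illustrative names and dimensions:

```python
import torch
import torch.nn as nn

class DynamicMetaEmbedding(nn.Module):
    def __init__(self, in_dims, out_dim=256):
        super().__init__()
        # One linear projection per embedding set, into a shared space.
        self.projs = nn.ModuleList([nn.Linear(d, out_dim) for d in in_dims])
        self.gate = nn.Linear(out_dim, 1)  # scalar attention score per set

    def forward(self, embeddings):
        # embeddings: list of (batch, seq_len, in_dims[i]) tensors, one per set
        proj = torch.stack([p(e) for p, e in zip(self.projs, embeddings)],
                           dim=2)                    # (B, T, n_sets, out_dim)
        w = torch.softmax(self.gate(proj), dim=2)    # weights over the sets
        return (w * proj).sum(dim=2)                 # (B, T, out_dim)

# e.g., combine 300-d and 200-d embedding lookups for a batch of 2 x 7 tokens:
dme = DynamicMetaEmbedding([300, 200])
out = dme([torch.randn(2, 7, 300), torch.randn(2, 7, 200)])  # -> (2, 7, 256)
```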