Universal Sentence Encoder
We present models for encoding sentences into embedding vectors that specifically target transfer learning to other NLP tasks. The models are efficient and result in accurate performance on diverse …
Universal Sentence Encoder for English
We present easy-to-use TensorFlow Hub sentence embedding models having good task transfer performance. Model variants allow for trade-offs between accuracy and compute resources. We report the …
Character-Level Language Modeling with Deeper Self-Attention
LSTMs and other RNN variants have shown strong performance on character-level language modeling. These models are typically trained using truncated backpropagation through time, and it is common to …
Learning Semantic Textual Similarity from Conversations
We present a novel approach to learn representations for sentence-level semantic similarity using conversational data. Our method trains an unsupervised model to predict conversational input-response …
Multilingual Universal Sentence Encoder for Semantic Retrieval
We introduce two pre-trained retrieval-focused multilingual sentence encoding models, respectively based on the Transformer and CNN model architectures. The models embed text from 16 languages into a …
ReQA: An Evaluation for End-to-End Answer Retrieval Models
Popular QA benchmarks like SQuAD have driven progress on the task of identifying answer spans within a specific passage, with models now surpassing human performance. However, retrieving relevant …
English rise-fall-rise: a study in the semantics and pragmatics of intonation
This paper provides a semantic analysis of English rise-fall-rise (RFR) intonation as a focus quantifier over assertable alternative propositions. I locate RFR meaning in the conventional implicature …
Effective Parallel Corpus Mining using Bilingual Sentence Embeddings
This paper presents an effective approach for parallel corpus mining using bilingual sentence embeddings. Our embedding models are trained to produce similar representations exclusively for bilingual …
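The mining approach described above pairs sentences across languages by nearest-neighbor search in a shared bilingual embedding space. A minimal sketch with toy vectors (the embeddings, the `mine_parallel_pairs` helper, and the similarity threshold here are illustrative stand-ins, not the paper's trained models):

```python
import numpy as np

def mine_parallel_pairs(src_emb, tgt_emb, threshold=0.8):
    """Pair each source sentence with its nearest target sentence by
    cosine similarity, keeping only pairs above a score threshold.
    Toy stand-in for mining with trained bilingual sentence embeddings."""
    # Normalize rows so dot products equal cosine similarities.
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sims = src @ tgt.T                  # (n_src, n_tgt) similarity matrix
    best = sims.argmax(axis=1)          # nearest target for each source
    return [(i, j, float(sims[i, j]))
            for i, j in enumerate(best) if sims[i, j] >= threshold]

# Toy embeddings: source rows 0 and 1 align with target rows 1 and 0.
src = np.array([[1.0, 0.1, 0.0],
                [0.0, 1.0, 0.2]])
tgt = np.array([[0.1, 1.0, 0.1],
                [1.0, 0.0, 0.1]])
pairs = mine_parallel_pairs(src, tgt)
print(pairs)  # source 0 pairs with target 1, source 1 with target 0
```

In practice the same idea scales to millions of sentences by replacing the dense similarity matrix with an approximate nearest-neighbor index.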
Contrastive Topic: Meanings and Realizations
The pragmatics of expressive content: Evidence from large corpora
We use large collections of online product reviews, in Chinese, English, German, and Japanese, to study the use conditions of expressives (swears, antihonorifics, intensives). The distributional …