Publications
RoBERTa: A Robustly Optimized BERT Pretraining Approach
TLDR: We present a replication study of BERT pretraining (Devlin et al., 2019) that carefully measures the impact of many key hyperparameters and training data size.
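As a rough illustration of the kind of "key hyperparameters" such a replication study varies, here is a hypothetical pre-training configuration; the names and values below are illustrative assumptions, not the paper's exact settings.

    # Illustrative pre-training knobs a replication study like this measures
    # the impact of (values are assumptions, not the paper's exact recipe).
    pretraining_config = {
        "batch_size": 8192,               # much larger batches than the original recipe
        "peak_learning_rate": 6e-4,
        "train_steps": 500_000,
        "sequence_length": 512,
        "masking": "dynamic",             # re-sample the masked positions each pass
        "next_sentence_prediction": False # drop the auxiliary NSP objective
    }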
TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension
TLDR: We present TriviaQA, a challenging reading comprehension dataset containing over 650K question-answer-evidence triples.
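To picture the data format, the sketch below shows one question-answer-evidence triple and the distant-supervision idea; the field names and matching heuristic are illustrative, not TriviaQA's exact schema.

    # One TriviaQA-style triple (field names are hypothetical, not the
    # dataset's exact schema): a question, its answer with accepted aliases,
    # and the evidence documents the answer was distantly matched against.
    triple = {
        "question": "Which country hosted the 1966 FIFA World Cup?",
        "answer": {"value": "England", "aliases": ["England"]},
        "evidence": [
            {
                "source": "wikipedia",
                "title": "1966 FIFA World Cup",
                "text": "The 1966 FIFA World Cup was held in England from 11 to 30 July 1966.",
            },
        ],
    }

    def is_distantly_supported(t):
        # Distant supervision: the answer (or an alias) merely has to appear
        # somewhere in the evidence text; no span-level annotation is assumed.
        answers = [t["answer"]["value"], *t["answer"]["aliases"]]
        return any(a.lower() in doc["text"].lower()
                   for a in answers for doc in t["evidence"])

    print(is_distantly_supported(triple))  # True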
SpanBERT: Improving Pre-training by Representing and Predicting Spans
TLDR: We present SpanBERT, a pre-training method that is designed to better represent and predict spans of text.
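To make "predicting spans" concrete, here is a minimal sketch of span-level masking, assuming a clipped geometric-style distribution over span lengths; the masking budget, distribution parameter, and the paper's span-boundary objective are simplified or omitted.

    import random

    def mask_contiguous_spans(tokens, mask_token="[MASK]",
                              mask_budget=0.15, geo_p=0.2, max_span_len=10):
        # Mask whole contiguous spans rather than isolated tokens until
        # roughly `mask_budget` of the sequence is masked; the budget,
        # geometric parameter, and max span length are illustrative.
        tokens = list(tokens)
        n = len(tokens)
        target = max(1, int(mask_budget * n))
        to_mask = set()
        while len(to_mask) < target:
            # Span length drawn from a clipped geometric-style distribution.
            length = 1
            while length < max_span_len and random.random() > geo_p:
                length += 1
            length = min(length, n)
            start = random.randrange(n - length + 1)
            to_mask.update(range(start, start + length))
        masked = [mask_token if i in to_mask else tok for i, tok in enumerate(tokens)]
        return masked, sorted(to_mask)

    # The model is then trained to recover the original tokens at the masked
    # positions (SpanBERT additionally uses a span-boundary objective, not shown).
    example = "an American football game to determine the champion of the league".split()
    print(mask_contiguous_spans(example))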
BERT for Coreference Resolution: Baselines and Analysis
TLDR: We apply BERT to coreference resolution, achieving strong improvements on the OntoNotes (+3.9 F1) and GAP (+11.5 F1) benchmarks.
pair2vec: Compositional Word-Pair Embeddings for Cross-Sentence Inference
TLDR: This paper proposes new methods for learning and using embeddings of word pairs that implicitly represent background knowledge about the relationship between the two words.
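As a generic sketch of what a compositional word-pair embedding can look like (the composition, dimensions, and training signal below are assumptions for illustration, not necessarily the paper's exact architecture or objective), one can combine the two word vectors and their element-wise product with a small MLP.

    import torch
    import torch.nn as nn

    class PairEncoder(nn.Module):
        # Compose a word pair (x, y) into one vector: concatenate the two
        # embeddings and their element-wise product, then apply an MLP.
        # Dimensions and depth here are illustrative.
        def __init__(self, vocab_size, dim=300, hidden=600):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, dim)
            self.mlp = nn.Sequential(
                nn.Linear(3 * dim, hidden),
                nn.ReLU(),
                nn.Linear(hidden, dim),
            )

        def forward(self, x_ids, y_ids):
            x, y = self.emb(x_ids), self.emb(y_ids)
            return self.mlp(torch.cat([x, y, x * y], dim=-1))

    # A pair representation like this would then be trained (e.g. with negative
    # sampling) to be predictive of the contexts in which the two words co-occur,
    # so that it captures their typical relationship.
    encoder = PairEncoder(vocab_size=10_000)
    pair_vec = encoder(torch.tensor([3]), torch.tensor([17]))  # shape (1, 300)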
Knowledge Graph and Corpus Driven Segmentation and Answer Inference for Telegraphic Entity-seeking Queries
TLDR: We propose a technique to segment a telegraphic query and assign a coarse-grained purpose to each segment: a base entity e1, a relation type r, a target entity type t2, and contextual words s.
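To make the four segment types concrete, here is a hypothetical segmentation of a telegraphic query; the query and its labels are invented for illustration and are not taken from the paper.

    # Hypothetical segmentation of a telegraphic query into the four
    # purposes named above (invented example, not from the paper).
    query = "einstein birth city"
    segmentation = {
        "e1": "einstein",  # base entity
        "r": "birth",      # relation type
        "t2": "city",      # target entity type
        "s": "",           # contextual words (none in this query)
    }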
An Information Bottleneck Approach for Controlling Conciseness in Rationale Extraction
TLDR: We show that the trade-off between rationale conciseness and end-task performance can be better managed by optimizing a bound on the Information Bottleneck (IB) objective.
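For reference, the generic Information Bottleneck objective that this line of work bounds, reading X as the input text, Z as the extracted rationale, and Y as the end-task label (the paper optimizes a variational bound rather than this quantity directly):

    \min_{p(z \mid x)} \; I(X; Z) - \beta \, I(Z; Y)

The I(X; Z) term penalizes how much of the input the rationale retains (conciseness), the I(Z; Y) term rewards how predictive the rationale is of the label, and β sets the trade-off between the two.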
Object-Oriented Representation and Hierarchical Reinforcement Learning in Infinite Mario
TLDR: In this work, we analyze and improve upon reinforcement learning techniques used to build agents that can learn to play Infinite Mario, an action game.
Contextualized Representations Using Textual Encyclopedic Knowledge
TLDR: We present a method to represent input texts by contextualizing them jointly with dynamically retrieved textual encyclopedic background knowledge from multiple documents.
Streamlining Cross-Document Coreference Resolution: Evaluation and Modeling
TLDR: We build the first end-to-end model for cross-document coreference resolution, which outperforms state-of-the-art results by a significant margin.