Publications
Learning deep representations by mutual information estimation and maximization
TLDR: This work investigates unsupervised learning of representations by maximizing mutual information between an input and the output of a deep neural network encoder.
  • Citations: 676 · Influence: 131
NewsQA: A Machine Comprehension Dataset
TLDR: We present NewsQA, a challenging machine comprehension dataset of over 100,000 human-generated question-answer pairs, and compare it to several strong neural models.
  • Citations: 387 · Influence: 92
Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning
TLDR: A lot of the recent success in natural language processing (NLP) has been driven by distributed vector representations of words trained on large amounts of text in an unsupervised manner.
  • Citations: 225 · Influence: 31
FigureQA: An Annotated Figure Dataset for Visual Reasoning
TLDR: We introduce FigureQA, a visual reasoning corpus of over one million question-answer pairs grounded in over 100,000 images.
  • Citations: 69 · Influence: 25
TextWorld: A Learning Environment for Text-based Games
TLDR: We introduce TextWorld, a sandbox learning environment for the training and evaluation of RL agents on text-based games.
  • Citations: 97 · Influence: 23
An Empirical Study of Example Forgetting during Deep Neural Network Learning
TLDR: Inspired by the phenomenon of catastrophic forgetting, we investigate the learning dynamics of neural networks as they train on single classification tasks by analyzing (catastrophic) example forgetting events.
  • Citations: 77 · Influence: 13
Learning Algorithms for Active Learning
TLDR: We introduce a model that learns active learning algorithms end-to-end via meta-learning.
  • Citations: 98 · Influence: 12
Machine Comprehension by Text-to-Text Neural Question Generation
TLDR: We propose a recurrent neural model that generates natural-language questions from documents, conditioned on answers.
  • Citations: 110 · Influence: 11
A Joint Model for Question Answering and Question Generation
TLDR: We propose a generative machine comprehension model that learns jointly to ask and answer questions based on documents.
  • Citations: 62 · Influence: 10
Building Dynamic Knowledge Graphs from Text using Machine Reading Comprehension
TLDR: We propose a neural machine-reading model that constructs dynamic knowledge graphs from procedural text, and use them to track the evolving states of participant entities.
  • Citations: 50 · Influence: 9