Publications
Teaching Machines to Read and Comprehend
TLDR
We develop a class of attention-based deep neural networks that learn to read real documents and answer complex questions with minimal prior knowledge of language structure.
  • 1,821 citations (307 highly influential)
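A rough sketch of this kind of attentive reader, in PyTorch, is given below. It is illustrative only; the layer types, sizes, and names are assumptions rather than the paper's exact configuration.

```python
import torch
import torch.nn as nn


class AttentiveReader(nn.Module):
    """Encode the document and query, attend over document tokens, predict an answer word."""

    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.doc_rnn = nn.GRU(emb_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.qry_rnn = nn.GRU(emb_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.attn = nn.Linear(4 * hidden_dim, 1)
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, doc, qry):
        d, _ = self.doc_rnn(self.embed(doc))                 # (B, Td, 2H) token states
        _, q = self.qry_rnn(self.embed(qry))                 # final states, (2, B, H)
        q = torch.cat([q[0], q[1]], dim=-1)                  # (B, 2H) query summary
        # Score every document token against the query and form a weighted context.
        scores = self.attn(torch.cat([d, q.unsqueeze(1).expand_as(d)], dim=-1))
        weights = torch.softmax(scores.squeeze(-1), dim=1)            # (B, Td)
        context = torch.bmm(weights.unsqueeze(1), d).squeeze(1)       # (B, 2H)
        return self.out(context)                             # logits over answer words
```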
A Convolutional Neural Network for Modelling Sentences
TLDR
We describe a convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) that we adopt for the semantic modelling of sentences.
  • 2,551 citations (216 highly influential)
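The DCNN's distinctive operations are wide convolutions and (dynamic) k-max pooling. The sketch below shows the pooling step and one convolution block; all sizes are assumed, and k is held fixed here, whereas the paper computes it from sentence length and network depth.

```python
import torch
import torch.nn as nn


def k_max_pool(x, k):
    """Keep the k largest activations along the time axis, preserving their order.
    x: (batch, channels, time)."""
    _, idx = x.topk(k, dim=2)        # positions of the k largest values
    idx, _ = idx.sort(dim=2)         # restore original temporal order
    return x.gather(2, idx)


# One wide-convolution + k-max-pooling block on a batch of embedded sentences.
conv = nn.Conv1d(in_channels=48, out_channels=48, kernel_size=5, padding=4)   # "wide" convolution
sentences = torch.randn(2, 48, 20)                         # (batch, embedding dim, length)
features = k_max_pool(torch.tanh(conv(sentences)), k=5)    # -> (2, 48, 5)
```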
Hybrid computing using a neural network with dynamic external memory
TLDR
We introduce a differentiable neural computer (DNC), which consists of a neural network that can read from and write to an external memory matrix, analogous to the random-access memory in a conventional computer.
  • 941 citations (115 highly influential)
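The content-based addressing at the core of the DNC's external memory can be sketched as below. This is a simplified illustration (the full model also maintains temporal links and dynamic allocation), and the function names are ours.

```python
import torch
import torch.nn.functional as F


def content_read(memory, key, strength):
    """Read from memory by cosine similarity to a key.
    memory: (N, W) slots, key: (W,), strength: positive scalar sharpening the weighting."""
    sim = F.cosine_similarity(memory, key.unsqueeze(0), dim=1)   # (N,)
    weights = torch.softmax(strength * sim, dim=0)               # soft read addresses
    return weights @ memory                                      # (W,) read vector


def erase_and_add_write(memory, weights, erase, add):
    """Erase-then-add write with a (N,) write weighting and (W,) erase/add vectors."""
    memory = memory * (1 - weights.unsqueeze(1) * erase.unsqueeze(0))
    return memory + weights.unsqueeze(1) * add.unsqueeze(0)
```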
Reasoning about Entailment with Neural Attention
TLDR
In this paper, we propose a neural model that reads two sentences to determine entailment using long short-term memory units.
  • 595 citations (82 highly influential)
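A minimal version of this conditional encoding, without the paper's word-by-word attention, might look like the following; dimensions and names are assumptions.

```python
import torch
import torch.nn as nn


class ConditionalEntailment(nn.Module):
    """Read the premise, then read the hypothesis starting from the premise's final state."""

    def __init__(self, vocab_size, emb_dim=100, hidden=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.premise_lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.hypothesis_lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.classify = nn.Linear(hidden, 3)   # entailment / neutral / contradiction

    def forward(self, premise, hypothesis):
        _, state = self.premise_lstm(self.embed(premise))              # encode the premise
        out, _ = self.hypothesis_lstm(self.embed(hypothesis), state)   # condition on it
        return self.classify(out[:, -1])                               # 3-way logits
```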
The NarrativeQA Reading Comprehension Challenge
TLDR
We present a new dataset and set of tasks in which the reader must answer questions about stories by reading entire books or movie scripts.
  • 231 citations (55 highly influential)
Discovering Discrete Latent Topics with Neural Variational Inference
TLDR
This paper presents alternative neural approaches to topic modelling by providing parameterisable distributions over topics, which permit training by backpropagation in the framework of neural variational inference.
  • 111 citations (33 highly influential)
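One concrete instance of the idea is a small VAE-style topic model in which a Gaussian latent is pushed through a softmax to give topic proportions. The sketch below assumes one particular parameterisation and is not the paper's exact model.

```python
import torch
import torch.nn as nn


class NeuralTopicModel(nn.Module):
    """Bag-of-words in, ELBO-style loss out; topics live in the decoder's weight matrix."""

    def __init__(self, vocab_size, n_topics=50, hidden=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, n_topics)
        self.logvar = nn.Linear(hidden, n_topics)
        self.topic_word = nn.Linear(n_topics, vocab_size, bias=False)   # topic-word weights

    def forward(self, bow):                                   # bow: (B, vocab) word counts
        h = self.encoder(bow)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterised sample
        theta = torch.softmax(z, dim=-1)                      # topic proportions
        log_probs = torch.log_softmax(self.topic_word(theta), dim=-1)
        recon = -(bow * log_probs).sum(-1)                    # reconstruction term
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
        return (recon + kl).mean()                            # loss to minimise
```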
Experimental Support for a Categorical Compositional Distributional Model of Meaning
TLDR
Modelling compositional meaning for sentences using empirical distributional methods has been a challenge for computational linguists.
  • 312 citations (28 highly influential)
Learning Explanatory Rules from Noisy Data
TLDR
We propose a Differentiable Inductive Logic framework which not only solves the tasks that traditional ILP systems are suited for, but is also robust to the noise and error in training data that ILP cannot cope with.
  • 169 citations (27 highly influential)
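As a loose illustration of the underlying idea (making clause selection differentiable so it can be fitted to noisy labels by gradient descent), the toy below assigns learnable weights to a few hand-written candidate clauses; the clauses, data, and noise level are all hypothetical and far simpler than the framework in the paper.

```python
import torch
import torch.nn.functional as F
from torch.optim import Adam

torch.manual_seed(0)

# Noisy training data for a target relation over pairs of digits:
# "first argument is smaller than the second", with 10% of labels flipped.
x = torch.randint(0, 10, (200, 2)).float()
labels = (x[:, 0] < x[:, 1]).float()
flip = torch.rand(200) < 0.1
labels[flip] = 1 - labels[flip]

# Soft truth values produced by three candidate clauses.
clauses = torch.stack([
    torch.sigmoid(x[:, 1] - x[:, 0]),             # lt(A, B)
    torch.sigmoid(x[:, 0] - x[:, 1]),             # gt(A, B)
    torch.sigmoid(-(x[:, 0] - x[:, 1]).abs()),    # roughly eq(A, B)
], dim=1)

weights = torch.zeros(3, requires_grad=True)      # learnable clause weights
opt = Adam([weights], lr=0.1)
for _ in range(300):
    pred = clauses @ torch.softmax(weights, dim=0)    # mixture of clause valuations
    loss = F.binary_cross_entropy(pred, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()

print(torch.softmax(weights, dim=0))   # mass should concentrate on lt(A, B)
```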
Latent Predictor Networks for Code Generation
TLDR
We present a novel neural network architecture that generates an output sequence conditioned on an arbitrary number of input functions.
  • 219 citations (26 highly influential)
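The core mechanism is a learned mixture over the next-token distributions proposed by several predictors (for example a word generator and a copy mechanism). The fragment below sketches just that mixing step, with shapes and names assumed.

```python
import torch
import torch.nn as nn


class PredictorMixture(nn.Module):
    """Gate over n predictors, each proposing a distribution over the next token."""

    def __init__(self, hidden, n_predictors=2):
        super().__init__()
        self.gate = nn.Linear(hidden, n_predictors)

    def forward(self, state, predictor_probs):
        # state: (B, hidden) decoder state
        # predictor_probs: (B, n_predictors, vocab) distributions from each predictor
        mix = torch.softmax(self.gate(state), dim=-1)         # (B, n_predictors)
        return (mix.unsqueeze(-1) * predictor_probs).sum(1)   # marginalised next-token probs
```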
Learning to Compose Words into Sentences with Reinforcement Learning
TLDR
We use reinforcement learning to learn tree-structured neural networks for computing representations of natural language sentences.
  • 110 citations (20 highly influential)
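A bare-bones version of the idea: a policy samples which adjacent pair of phrase vectors to merge next, composition is a small feed-forward network, and the discrete tree-building choices are trained with REINFORCE against downstream reward. Sizes, reward, and names here are assumptions for illustration.

```python
import torch
import torch.nn as nn
from torch.distributions import Categorical


class TreeComposer(nn.Module):
    def __init__(self, dim=64, n_classes=2):
        super().__init__()
        self.compose = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh())
        self.score = nn.Linear(2 * dim, 1)        # policy score for merging a pair
        self.classify = nn.Linear(dim, n_classes)

    def forward(self, words):                     # words: list of (dim,) phrase vectors
        log_probs = []
        while len(words) > 1:
            pairs = torch.stack([torch.cat([a, b]) for a, b in zip(words, words[1:])])
            dist = Categorical(logits=self.score(pairs).squeeze(-1))
            idx = dist.sample()                   # sampled, non-differentiable merge choice
            log_probs.append(dist.log_prob(idx))
            i = idx.item()
            words = words[:i] + [self.compose(pairs[i])] + words[i + 2:]
        return self.classify(words[0]), torch.stack(log_probs).sum()

# REINFORCE-style update (illustrative): with reward 1.0 for a correct prediction
# and 0.0 otherwise, add -(reward * summed_log_prob) to the usual classification loss.
```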