Publications
Teaching Machines to Read and Comprehend
TLDR
A new methodology is defined that resolves the data bottleneck in reading comprehension, providing large-scale supervised data and enabling the development of a class of attention-based deep neural networks that learn to read real documents and answer complex questions with minimal prior knowledge of language structure.
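At the core of these attentive readers is a soft attention over document tokens conditioned on the query. A minimal numpy sketch of that step (dimensions and parameter names are illustrative, not the paper's code):

```python
import numpy as np

def attentive_read(doc_states, query, W_d, W_q, w_s):
    """Attend over document token states given a query encoding.

    doc_states: (T, H) encoder outputs for T document tokens
    query:      (H,)   query encoding
    Returns the attention-weighted document representation.
    """
    scores = np.tanh(doc_states @ W_d + query @ W_q) @ w_s  # additive scoring, (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                # softmax over tokens
    return weights @ doc_states                             # (H,) weighted sum

rng = np.random.default_rng(0)
T, H = 6, 8
out = attentive_read(rng.normal(size=(T, H)), rng.normal(size=H),
                     rng.normal(size=(H, H)), rng.normal(size=(H, H)),
                     rng.normal(size=H))
print(out.shape)  # (8,)
```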
A Convolutional Neural Network for Modelling Sentences
TLDR
A convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) is described for the semantic modelling of sentences; it induces a feature graph over the sentence that explicitly captures short- and long-range relations.
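The distinctive operation in the DCNN is (dynamic) k-max pooling, which keeps the k strongest activations per feature while preserving word order. A small numpy sketch of the static variant (the dynamic version makes k a function of sentence length and layer depth):

```python
import numpy as np

def k_max_pooling(features, k):
    """Keep the k largest activations per feature row,
    preserving their original left-to-right order.

    features: (d, T) feature map over a sentence of length T
    returns:  (d, k)
    """
    top_idx = np.argsort(features, axis=1)[:, -k:]  # k largest per row
    top_idx = np.sort(top_idx, axis=1)              # restore word order
    return np.take_along_axis(features, top_idx, axis=1)

x = np.array([[1., 5., 2., 4., 3.],
              [9., 0., 8., 1., 7.]])
print(k_max_pooling(x, 3))
# [[5. 4. 3.]
#  [9. 8. 7.]]
```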
Hybrid computing using a neural network with dynamic external memory
TLDR
A machine learning model called a differentiable neural computer (DNC) is introduced, consisting of a neural network that can read from and write to an external memory matrix, analogous to the random-access memory in a conventional computer.
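The differentiable read operation that makes this possible can be sketched with content-based addressing: the controller emits a key, and memory slots are weighted by their similarity to it (a toy sketch, not DeepMind's implementation):

```python
import numpy as np

def cosine_read(memory, key, beta=1.0):
    """Content-based read from an external memory matrix.

    memory: (N, W) N slots of width W
    key:    (W,)   read key emitted by the controller network
    beta:   key strength, sharpening the address distribution
    """
    sims = memory @ key / (np.linalg.norm(memory, axis=1)
                           * np.linalg.norm(key) + 1e-8)  # cosine similarity
    w = np.exp(beta * sims)
    w /= w.sum()                                          # soft address over slots
    return w @ memory                                     # differentiable read vector

M = np.eye(4, 5)  # toy memory: 4 slots of width 5
r = cosine_read(M, np.array([1., 0., 0., 0., 0.]), beta=10.0)
print(np.round(r, 2))  # ~ the first memory row
```

Because every step is a smooth function, the read path can be trained end-to-end by gradient descent.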
Reasoning about Entailment with Neural Attention
TLDR
This paper proposes a neural model that reads two sentences to determine entailment using long short-term memory units, extends the model with a word-by-word neural attention mechanism that encourages reasoning over entailments of pairs of words and phrases, and presents a qualitative analysis of the attention weights the model produces.
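The word-by-word mechanism lets each hypothesis state attend over all premise states. A sketch of the idea with dot-product scoring (the paper itself uses an additive parameterisation):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def word_by_word_attention(premise, hypothesis):
    """Each hypothesis token state attends over all premise states.

    premise:    (Tp, H) LSTM states over the premise
    hypothesis: (Th, H) LSTM states over the hypothesis
    returns:    (Th, Tp) attention weights, (Th, H) context vectors
    """
    scores = hypothesis @ premise.T                    # dot-product scoring
    weights = np.vstack([softmax(s) for s in scores])  # one distribution per word
    return weights, weights @ premise

rng = np.random.default_rng(1)
w, ctx = word_by_word_attention(rng.normal(size=(5, 8)), rng.normal(size=(3, 8)))
print(w.shape, ctx.shape)  # (3, 5) (3, 8)
```

The weight matrix is what the paper's qualitative analysis inspects: high weights mark the premise words the model aligns with each hypothesis word.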
The NarrativeQA Reading Comprehension Challenge
TLDR
A new dataset and set of tasks are presented in which the reader must answer questions about stories by reading entire books or movie scripts, designed so that successfully answering the questions requires understanding the underlying narrative rather than relying on shallow pattern matching or salience.
Discovering Discrete Latent Topics with Neural Variational Inference
TLDR
This paper presents alternative neural approaches to topic modelling that provide parameterisable distributions over topics, permitting training by backpropagation in the framework of neural variational inference, and proposes a recurrent network able to discover a notionally unbounded number of topics, analogous to Bayesian non-parametric topic models.
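The key construction is a reparameterised Gaussian draw pushed through a softmax, which yields a differentiable sample of a document's topic distribution. An illustrative sketch:

```python
import numpy as np

def sample_topic_distribution(mu, log_sigma, rng):
    """One reparameterised draw of a document's topic proportions
    (Gaussian-softmax construction; an illustrative sketch).

    mu, log_sigma: (K,) variational parameters produced by an
                   inference network from the document's bag of words
    """
    eps = rng.normal(size=mu.shape)
    h = mu + np.exp(log_sigma) * eps  # differentiable Gaussian sample
    e = np.exp(h - h.max())
    return e / e.sum()                # softmax -> point on the topic simplex

rng = np.random.default_rng(2)
theta = sample_topic_distribution(np.zeros(4), np.full(4, -1.0), rng)
print(theta, theta.sum())             # a valid distribution over 4 topics
```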
Learning Explanatory Rules from Noisy Data
TLDR
This paper proposes a Differentiable Inductive Logic framework that can not only solve the tasks traditional ILP systems are suited to, but also shows a robustness to noise and error in the training data with which ILP cannot cope.
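The word "differentiable" does the work here: logical conjunction is replaced by a smooth operation on truth values in [0, 1], so rule weights can be learned by gradient descent. A tiny fragment of the idea (product as fuzzy AND; not the paper's full system):

```python
import numpy as np

def apply_rule(q_vals, r_vals):
    """One soft forward-chaining step for a rule p(X) :- q(X), r(X),
    using product as a differentiable conjunction.

    q_vals, r_vals: (N,) truth values in [0, 1] for N constants
    """
    return q_vals * r_vals

q = np.array([1.0, 0.9, 0.0])
r = np.array([1.0, 0.5, 1.0])
print(apply_rule(q, r))  # [1.   0.45 0.  ]
```

This softness is what lets mislabelled training atoms shift gradients rather than make the search fail outright, as it would in a classical ILP system.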
Latent Predictor Networks for Code Generation
TLDR
A novel neural network architecture is presented that generates an output sequence conditioned on an arbitrary number of input functions, allowing both the choice of conditioning context and the granularity of generation (for example, characters or tokens) to be marginalised, thus permitting scalable and effective training.
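Marginalising over predictors means the model never has to commit to which predictor produced a token at training time; the per-token likelihood sums over all of them. A stable sketch of that sum in log space (names are illustrative):

```python
import numpy as np

def marginal_token_logprob(predictor_logits, token_logprobs):
    """Log-probability of one output token, marginalised over which
    latent predictor generated it (a sketch of the training objective).

    predictor_logits: (P,) unnormalised scores for P predictors
                      (e.g. character-level generation vs. copying)
    token_logprobs:   (P,) log p(token | predictor i)
    """
    log_prior = predictor_logits - np.log(np.exp(predictor_logits).sum())
    joint = log_prior + token_logprobs  # log p(predictor i) p(token | i)
    m = joint.max()
    return m + np.log(np.exp(joint - m).sum())  # stable log-sum-exp

print(marginal_token_logprob(np.array([0.0, 1.0]),
                             np.array([np.log(0.2), np.log(0.9)])))
```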
Experimental Support for a Categorical Compositional Distributional Model of Meaning
TLDR
The abstract categorical model of Coecke et al. (2010) is implemented using data from the BNC and evaluated; results generally improve as syntactic complexity increases, showcasing the compositional power of the model.
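In this model a transitive verb is represented as a matrix (a relation) combined with its noun arguments; one of the instantiations evaluated takes the pointwise product of the verb matrix with the outer product of the subject and object vectors. A toy numpy sketch under that assumption:

```python
import numpy as np

def compose_svo(subject, verb_matrix, obj):
    """Compose a subject-verb-object triple: verb as a matrix combined
    pointwise with the outer product of its arguments (one instantiation
    of the categorical model; a sketch, not the paper's code).

    subject, obj: (d,)   distributional noun vectors
    verb_matrix:  (d, d) verb representation
    """
    return verb_matrix * np.outer(subject, obj)  # (d, d) sentence representation

d = 4
rng = np.random.default_rng(3)
s = compose_svo(rng.random(d), rng.random((d, d)), rng.random(d))
print(s.shape)  # (4, 4)
```

Two sentences can then be compared by, for example, cosine similarity between their flattened representations.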
Learning to Compose Words into Sentences with Reinforcement Learning
TLDR
Reinforcement learning is used to learn tree-structured neural networks for computing representations of natural language sentences; while the induced trees capture some linguistically intuitive structures, they differ from conventional English syntactic structures.
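The learned policy emits shift-reduce actions that determine the tree; once an action sequence is fixed, composition is deterministic. A sketch of that composition step (the paper composes with a TreeLSTM; a simple mean stands in here):

```python
import numpy as np

def compose_with_actions(word_vecs, actions):
    """Compose word vectors into a sentence vector by following a
    shift-reduce action sequence of the kind the learned policy emits.

    actions: list of 'SHIFT' / 'REDUCE'
    """
    buffer = list(word_vecs)
    stack = []
    for a in actions:
        if a == 'SHIFT':
            stack.append(buffer.pop(0))
        else:  # REDUCE: combine the top two stack elements
            right, left = stack.pop(), stack.pop()
            stack.append((left + right) / 2.0)  # stand-in composition fn
    return stack[-1]

vs = [np.ones(3) * x for x in (1.0, 2.0, 3.0)]
# the tree ((w1 w2) w3)
print(compose_with_actions(vs, ['SHIFT', 'SHIFT', 'REDUCE', 'SHIFT', 'REDUCE']))
```

REINFORCE then scores whole action sequences by downstream task reward, which is how the tree structures themselves get learned.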