Publications
Neural Summarization by Extracting Sentences and Words
TLDR
This work develops a general framework for single-document summarization composed of a hierarchical document encoder and an attention-based extractor that allows for different classes of summarization models which can extract sentences or words.
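The extractive side of such a framework can be illustrated with a toy sketch (this is not the paper's trained model; the vectors and the mean-based document encoding below are invented for illustration): each sentence is embedded, scored against a document representation in an attention-like fashion, and the top-scoring sentences are kept in document order.

```python
import numpy as np

def extract_summary(sent_vecs, k=2):
    """Score each sentence against a crude document representation
    (the mean sentence vector) and keep the top k, preserving
    their original order in the document."""
    doc = sent_vecs.mean(axis=0)            # crude document encoding
    scores = sent_vecs @ doc                # relevance of each sentence
    top = np.sort(np.argsort(scores)[-k:])  # top-k indices, in document order
    return top.tolist()

# Four toy "sentence embeddings" in 3 dimensions.
vecs = np.array([[1.0, 0.0, 0.0],
                 [0.9, 0.1, 0.0],
                 [0.0, 0.0, 1.0],
                 [1.0, 0.1, 0.1]])
print(extract_summary(vecs, k=2))  # → [0, 3]
```

A learned extractor would replace the mean-and-dot-product scorer with a trained hierarchical encoder and attention module, but the selection step has the same shape.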
Long Short-Term Memory-Networks for Machine Reading
TLDR
A machine reading simulator that processes text incrementally from left to right and performs shallow reasoning with memory and attention; it extends the Long Short-Term Memory architecture with a memory network in place of a single memory cell, offering a way to weakly induce relations among tokens.
Dependency Parsing as Head Selection
TLDR
This work formalizes dependency parsing as the problem of independently selecting the head of each word in a sentence by producing a distribution over possible heads for each word using features obtained from a bidirectional recurrent neural network.
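The head-selection formulation lends itself to a minimal sketch (the 2-D feature vectors and the bilinear scorer below are invented stand-ins for the paper's bidirectional-RNN features): each word independently receives a softmax distribution over all candidate heads, including an artificial ROOT token, and the most probable head is chosen.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def select_heads(feats, W):
    """Independently pick the most probable head for each word.
    feats[0] is an artificial ROOT token; W is a bilinear scorer."""
    n = len(feats)
    heads = []
    for dep in range(1, n):                        # every real word needs a head
        cands = [h for h in range(n) if h != dep]  # any other token, incl. ROOT
        scores = np.array([feats[h] @ W @ feats[dep] for h in cands])
        probs = softmax(scores)                    # distribution over heads
        heads.append(cands[int(probs.argmax())])
    return heads

# Toy 3-word sentence (index 0 = ROOT) with hand-picked 2-D "features".
feats = np.array([[1.0, 0.0],    # ROOT
                  [0.9, 0.1],    # word 1
                  [0.1, 1.0],    # word 2
                  [0.2, 0.9]])   # word 3
print(select_heads(feats, np.eye(2)))  # → [0, 3, 2]
```

Note that words 2 and 3 pick each other here: because heads are chosen independently, the output is not guaranteed to be a tree, which is why models of this kind typically fall back on a maximum-spanning-tree decode when the greedy output is not well-formed.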
Contextual Text Understanding in Distributional Semantic Space
TLDR
This work proposes a new framework for generating context-aware text representations without diving into the sense space, instead modeling the concept space shared among senses, resulting in a framework that is efficient in both computation and storage.
Learning Structured Natural Language Representations for Semantic Parsing
We introduce a neural semantic parser that converts natural language utterances to intermediate representations in the form of predicate-argument structures, which are induced with a transition…
Edina: Building an Open Domain Socialbot with Self-dialogues
TLDR
Edina, the University of Edinburgh's social bot, is a conversational agent whose responses utilize data harvested from Amazon Mechanical Turk through a technique the authors call self-dialogues; this addresses both the coverage limitations of a strictly rule-based approach and the lack of guarantees in a strictly machine-learning approach.
Learning an Executable Neural Semantic Parser
TLDR
A neural semantic parser that maps natural language utterances onto logical forms that can be executed against a task-specific environment, such as a knowledge base or a database, to produce a response.
Dependency Grammar Induction with a Neural Variational Transition-based Parser
TLDR
A neural transition-based parser for dependency grammar induction, whose inference procedure utilizes rich neural features with linear time complexity, and which achieves performance comparable to graph-based models on both the English Penn Treebank and the Universal Dependency Treebank.
Syntax-Aware Multi-Sense Word Embeddings for Deep Compositional Models of Meaning
TLDR
This work details a compositional distributional framework based on a rich form of word embeddings that aims at facilitating the interactions between words in the context of a sentence; the framework is demonstrated on the MSRPar task.
Weakly-supervised Neural Semantic Parsing with a Generative Ranker
TLDR
A neural parser-ranker system for weakly-supervised semantic parsing that generates candidate tree-structured logical forms from utterances using clues from denotations, and uses a neurally encoded lexicon to inject prior domain knowledge into the model.