Most tasks in natural language processing can be cast into question answering (QA) problems over language input. We introduce the dynamic memory network (DMN), a unified neural network framework which processes input sequences and questions, forms semantic and episodic memories, and generates relevant answers. Questions trigger an iterative attention process which allows the model to condition its attention on the inputs and the result of previous iterations.
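Below is a minimal PyTorch sketch of what such an iterative attention loop can look like: encoded "fact" vectors are repeatedly scored against the question and the current memory, and an attention-weighted episode updates the memory before the next pass. The dimensions, the gating features, and the fixed number of passes are illustrative assumptions, not the paper's exact parameterization.

# Minimal sketch of an episodic, iterative attention module; feature set,
# sizes, and number of passes are illustrative, not the DMN's exact design.
import torch
import torch.nn as nn

class EpisodicMemory(nn.Module):
    def __init__(self, dim: int, num_passes: int = 3):
        super().__init__()
        self.num_passes = num_passes
        # Gate network scores each fact against the question and current memory.
        self.gate = nn.Sequential(nn.Linear(4 * dim, dim), nn.Tanh(), nn.Linear(dim, 1))
        # Memory update conditions the next attention pass on the last episode.
        self.update = nn.GRUCell(dim, dim)

    def forward(self, facts: torch.Tensor, question: torch.Tensor) -> torch.Tensor:
        # facts: (num_facts, dim), question: (dim,)
        memory = question.clone()
        for _ in range(self.num_passes):
            q = question.expand_as(facts)
            m = memory.expand_as(facts)
            # Interaction features between each fact, the question, and memory.
            feats = torch.cat([facts * q, facts * m,
                               (facts - q).abs(), (facts - m).abs()], dim=-1)
            weights = torch.softmax(self.gate(feats).squeeze(-1), dim=0)
            episode = (weights.unsqueeze(-1) * facts).sum(dim=0)  # weighted summary
            memory = self.update(episode.unsqueeze(0), memory.unsqueeze(0)).squeeze(0)
        return memory  # the final memory state would feed an answer module

facts = torch.randn(8, 64)    # eight encoded input sentences (toy data)
question = torch.randn(64)    # encoded question (toy data)
print(EpisodicMemory(64)(facts, question).shape)  # torch.Size([64])

Because each pass re-scores the facts against the updated memory, later passes can attend to facts that only become relevant given what the earlier passes retrieved, which is the point of making the attention iterative.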
Attentional, RNN-based encoder-decoder models for abstractive summarization have achieved good performance on short input and output sequences. For longer documents and summaries, however, these models often produce repetitive and incoherent phrases. We introduce a neural network model with intra-attention and a new training method that combines standard supervised word prediction and reinforcement learning (RL).
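The abstract does not spell out the objective, but a common way to combine supervised word prediction with RL is a weighted sum of a teacher-forced maximum-likelihood loss and a self-critical policy-gradient loss, where a greedy decode serves as the reward baseline. The sketch below assumes that formulation; mixed_loss, gamma, and the toy tensors are illustrative placeholders, not the paper's implementation.

# Sketch of a mixed ML + self-critical RL objective, under the assumptions above.
import torch

def mixed_loss(sample_logprobs: torch.Tensor,  # log p of each sampled token
               gold_logprobs: torch.Tensor,    # log p of each reference token
               sample_reward: float,           # e.g. ROUGE of the sampled summary
               baseline_reward: float,         # e.g. ROUGE of the greedy summary
               gamma: float = 0.98) -> torch.Tensor:
    # Self-critical policy gradient: the greedy decode is the baseline, so only
    # samples that beat it receive a positive learning signal.
    rl_loss = -(sample_reward - baseline_reward) * sample_logprobs.sum()
    # Standard teacher-forced maximum-likelihood term.
    ml_loss = -gold_logprobs.sum()
    return gamma * rl_loss + (1.0 - gamma) * ml_loss

logits = torch.randn(5, 100, requires_grad=True)  # (steps, vocab) toy decoder scores
logp = logits.log_softmax(-1)
sample_lp = logp[torch.arange(5), torch.randint(100, (5,))]  # sampled-token log probs
gold_lp = logp[torch.arange(5), torch.randint(100, (5,))]    # reference-token log probs
loss = mixed_loss(sample_lp, gold_lp, sample_reward=0.41, baseline_reward=0.38)
loss.backward()  # gradients flow through the sampled-token log-probabilities

The RL term lets the model optimize a sequence-level, non-differentiable metric directly, while the ML term keeps the generated text fluent; gamma trades the two off.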
Coreference resolution is a complex structured prediction task which is usually solved in two steps: first detecting all mentions, then determining which of them corefer. Similarly to POS tagging, we model mention detection as a sequence tagging problem, but one where the task is to predict a pair of symbols at every position, corresponding to the number of mentions that begin and end there.
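To make the tagging scheme concrete, here is a small, self-contained sketch of how per-token (opens, closes) count pairs can be decoded back into possibly nested mention spans with a stack. The matching rule and the decode_mentions helper are illustrative assumptions, and the tagger that would produce the counts is omitted.

# Decode per-token (opens, closes) pairs into nested mention spans.
# Assumes well-formed tags (every close has a matching earlier open).
from typing import List, Tuple

def decode_mentions(tags: List[Tuple[int, int]]) -> List[Tuple[int, int]]:
    """tags[i] = (opens, closes): mentions starting / ending at token i."""
    stack: List[int] = []                  # start positions of open mentions
    mentions: List[Tuple[int, int]] = []
    for i, (opens, closes) in enumerate(tags):
        stack.extend([i] * opens)          # the innermost mention opened last
        for _ in range(closes):
            mentions.append((stack.pop(), i))  # close the most recent open span
    return mentions

# "the CEO of Acme said": "Acme" is nested inside "the CEO of Acme".
tags = [(1, 0), (0, 0), (0, 0), (1, 2), (0, 0)]
print(decode_mentions(tags))  # [(3, 3), (0, 3)]

The stack makes the scheme naturally handle nested mentions, which flat BIO-style tagging cannot represent with a single label per token.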