Answer Generation through Unified Memories over Multiple Passages

@inproceedings{Nakatsuji2020AnswerGT,
  title={Answer Generation through Unified Memories over Multiple Passages},
  author={Makoto Nakatsuji and Sohei Okui},
  booktitle={International Joint Conference on Artificial Intelligence},
  year={2020}
}
Machine reading comprehension methods that generate answers by referring to multiple passages for a question have gained much attention in the AI and NLP communities. The current methods, however, do not investigate the relationships among multiple passages in the answer generation process, even though topics correlated among the passages may be answer candidates. Our method, called neural answer Generation through Unified Memories over Multiple Passages (GUM-MP), solves this problem as follows…
2 Citations

Conclusion-Supplement Answer Generation for Non-Factoid Questions

An ensemble network that extracts the context from the conclusion decoder's output sequence, uses it to create supplementary decoder states via an attention mechanism, and generates answers that match the questions and have natural-sounding supplementary sequences in line with the context expressed by the conclusion sequence.
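As a rough illustration of that mechanism, the sketch below (PyTorch; the names, dimensions, and fusion step are assumptions, not the paper's architecture) shows a supplement-decoder step attending over the conclusion decoder's outputs:

```python
# Minimal sketch: the supplement decoder attends over the conclusion
# decoder's output states to build a context-aware state (assumed fusion).
import torch
import torch.nn.functional as F

def supplement_step(supp_state: torch.Tensor, conclusion_outputs: torch.Tensor):
    """supp_state: (dim,) current supplement-decoder state
    conclusion_outputs: (conc_len, dim) conclusion decoder's output sequence"""
    scores = conclusion_outputs @ supp_state        # (conc_len,) attention logits
    attn = F.softmax(scores, dim=-1)
    context = attn @ conclusion_outputs             # (dim,) conclusion context
    # Fuse the conclusion context into the supplement decoder state.
    return torch.tanh(supp_state + context)

state = torch.randn(64)
conc = torch.randn(10, 64)
print(supplement_step(state, conc).shape)           # torch.Size([64])
```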

A Survey on Machine Reading Comprehension Systems

It is demonstrated that the focus of research has changed in recent years from answer extraction to answer generation, from single- to multi-document reading comprehension, and from learning from scratch to using pre-trained word vectors.

References

Showing 1–10 of 39 references

Multi-Passage Machine Reading Comprehension with Cross-Passage Answer Verification

An end-to-end neural model is proposed that enables answer candidates from different passages to verify each other based on their content representations, achieving state-of-the-art performance on the English MS-MARCO dataset and the Chinese DuReader dataset.
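A minimal sketch of the cross-verification idea, assuming one content representation per candidate; the attention and cosine scoring here are simplified stand-ins for the paper's verification module:

```python
# Each candidate attends to the other candidates; agreement with the attended
# "consensus" representation yields a verification score (simplified sketch).
import torch
import torch.nn.functional as F

def verify_candidates(cand_reprs: torch.Tensor) -> torch.Tensor:
    """cand_reprs: (num_candidates, dim) content representations, one per passage."""
    # Pairwise dot-product attention, masking out self-attention on the diagonal.
    scores = cand_reprs @ cand_reprs.t()              # (num_cands, num_cands)
    scores.fill_diagonal_(float("-inf"))
    attn = F.softmax(scores, dim=-1)
    consensus = attn @ cand_reprs                     # (num_cands, dim)
    # Verification score: similarity between a candidate and the consensus
    # formed by the other candidates.
    return F.cosine_similarity(cand_reprs, consensus, dim=-1)

cands = torch.randn(4, 128)        # e.g. 4 answer candidates from 4 passages
print(verify_candidates(cands))    # higher score = more cross-passage support
```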

S-Net: From Answer Extraction to Answer Generation for Machine Reading Comprehension

The answer extraction model is first employed to predict the most important sub-spans from the passage as evidence, and the answer synthesis model takes the evidence as additional features along with the question and passage to further elaborate the final answers.
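The extract-then-synthesize hand-off can be pictured as passing the predicted evidence span to the synthesis model as extra per-token features; the feature layout below is an assumption, not the paper's exact encoding:

```python
# Sketch: evidence spans predicted by the extraction model become indicator
# features appended to the passage embeddings fed to the synthesis seq2seq.
import torch

def add_evidence_features(passage_emb, span):
    """passage_emb: (seq_len, dim); span: (start, end) of the extracted evidence."""
    seq_len, _ = passage_emb.shape
    start_flag = torch.zeros(seq_len, 1)
    end_flag = torch.zeros(seq_len, 1)
    start_flag[span[0]] = 1.0   # mark the evidence start position
    end_flag[span[1]] = 1.0     # mark the evidence end position
    return torch.cat([passage_emb, start_flag, end_flag], dim=-1)

emb = torch.randn(12, 100)
print(add_evidence_features(emb, (3, 7)).shape)   # torch.Size([12, 102])
```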

S-Net: From Answer Extraction to Answer Synthesis for Machine Reading Comprehension

This paper builds the answer extraction model with state-of-the-art neural networks for single-passage reading comprehension, and proposes an additional passage-ranking task to help answer extraction across multiple passages.

Improved Representation Learning for Question Answer Matching

This work develops hybrid models that process text using both convolutional and recurrent neural networks, combining the merits of both structures in extracting linguistic information, to address passage answer selection.
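A hedged sketch of such a hybrid encoder, with a BiLSTM feeding a convolutional layer and max-pooling over time; all layer sizes are illustrative, not the paper's settings:

```python
# Hybrid recurrent+convolutional matcher over pre-embedded inputs: the BiLSTM
# captures sequence order, the convolution extracts local n-gram features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HybridEncoder(nn.Module):
    def __init__(self, emb_dim=100, hidden=128, filters=200, kernel=3):
        super().__init__()
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.conv = nn.Conv1d(2 * hidden, filters, kernel, padding=kernel // 2)

    def forward(self, x):                             # x: (batch, seq_len, emb_dim)
        h, _ = self.bilstm(x)                         # (batch, seq_len, 2*hidden)
        c = torch.relu(self.conv(h.transpose(1, 2)))  # (batch, filters, seq_len)
        return c.max(dim=2).values                    # max-pool over time

enc = HybridEncoder()
q = torch.randn(2, 15, 100)                 # a batch of embedded questions
a = torch.randn(2, 40, 100)                 # a batch of candidate answers
print(F.cosine_similarity(enc(q), enc(a)))  # matching score per pair
```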

Conclusion-Supplement Answer Generation for Non-Factoid Questions

An ensemble network that extracts the context from the conclusion decoder's output sequence, uses it to create supplementary decoder states via an attention mechanism, and generates answers that match the questions and have natural-sounding supplementary sequences in line with the context expressed by the conclusion sequence.

An End-to-End Model for Question Answering over Knowledge Base with Cross-Attention Combining Global Knowledge

This work presents an end-to-end neural network model that represents the questions and their corresponding scores dynamically according to the various candidate answer aspects via a cross-attention mechanism, and leverages global knowledge inside the underlying KB to integrate rich KB information into the answer representations.
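One way to read that description is the toy cross-attention scorer below, where a candidate's aspects attend to the question and are then re-weighted; the aspect names and scoring are assumptions, not the paper's exact formulation:

```python
# Toy cross-attention between a question and candidate-answer aspects
# (e.g. entity, relation, type, context).
import torch
import torch.nn.functional as F

def cross_attention_score(q_words: torch.Tensor, aspects: torch.Tensor):
    """q_words: (q_len, dim) question token representations
    aspects: (n_aspects, dim) representations of the candidate's aspects"""
    # Answer-towards-question attention: each aspect builds its own view of
    # the question, then is scored against that view.
    attn = F.softmax(aspects @ q_words.t(), dim=-1)    # (n_aspects, q_len)
    q_views = attn @ q_words                           # (n_aspects, dim)
    aspect_scores = (q_views * aspects).sum(dim=-1)    # (n_aspects,)
    # Question-towards-answer attention re-weights the per-aspect scores.
    weights = F.softmax(aspect_scores, dim=-1)
    return (weights * aspect_scores).sum()

q = torch.randn(8, 64)       # an embedded 8-token question
cand = torch.randn(4, 64)    # entity / relation / type / context aspects
print(cross_attention_score(q, cand))
```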

Coarse-grain Fine-grain Coattention Network for Multi-evidence Question Answering

The Coarse-grain Fine-grain Coattention Network (CFC), a new question answering model that combines information from evidence across multiple documents and obtains a new state-of-the-art result on the Qangaroo WikiHop multi-evidence question answering task.
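The generic coattention building block underlying such models can be sketched as follows; this is the standard affinity-matrix form, not CFC's full coarse/fine hierarchy:

```python
# Basic coattention step: an affinity matrix between document and question
# tokens is normalized in each direction to build mutual summaries.
import torch
import torch.nn.functional as F

def coattend(doc: torch.Tensor, qry: torch.Tensor):
    """doc: (d_len, dim), qry: (q_len, dim)"""
    affinity = doc @ qry.t()                       # (d_len, q_len)
    d2q = F.softmax(affinity, dim=1) @ qry         # question summary per doc token
    q2d = F.softmax(affinity, dim=0).t() @ doc     # document summary per qry token
    # Second level: propagate the query-side summaries back to the document.
    coattn = F.softmax(affinity, dim=1) @ q2d      # (d_len, dim)
    return torch.cat([d2q, coattn], dim=-1)        # (d_len, 2*dim)

doc = torch.randn(30, 64)
qry = torch.randn(6, 64)
print(coattend(doc, qry).shape)                    # torch.Size([30, 128])
```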

Transformer-Based Neural Network for Answer Selection in Question Answering

A Transformer-based neural network for answer selection, where a bidirectional long short-term memory (BiLSTM) is deployed behind the Transformer to acquire both global information and sequential features in the question or answer sentence.
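A minimal PyTorch sketch of this layout, with a Transformer encoder feeding a BiLSTM; the layer sizes and counts are assumptions, not the paper's configuration:

```python
# Transformer encoder for global, position-wise interactions, followed by a
# BiLSTM for sequential features, as described above.
import torch
import torch.nn as nn

class TransformerBiLSTM(nn.Module):
    def __init__(self, d_model=256, nhead=4, num_layers=2, lstm_hidden=128):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers)
        self.bilstm = nn.LSTM(d_model, lstm_hidden, batch_first=True,
                              bidirectional=True)

    def forward(self, x):              # x: (batch, seq_len, d_model) embeddings
        g = self.transformer(x)        # global information
        s, _ = self.bilstm(g)          # sequential features
        return s                       # (batch, seq_len, 2*lstm_hidden)

enc = TransformerBiLSTM()
sent = torch.randn(2, 20, 256)         # e.g. embedded question/answer sentences
print(enc(sent).shape)                 # torch.Size([2, 20, 256])
```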

Get To The Point: Summarization with Pointer-Generator Networks

A novel architecture that augments the standard sequence-to-sequence attentional model in two orthogonal ways: a hybrid pointer-generator network that can copy words from the source text via pointing, which aids accurate reproduction of information while retaining the ability to produce novel words through the generator, and a coverage mechanism that discourages repetition.
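The copy mechanism's final word distribution has a well-known form: a soft switch p_gen mixes the decoder's vocabulary distribution with the attention distribution scattered onto the source-token ids. A minimal sketch (tensor names are illustrative):

```python
# Pointer-generator final distribution:
#   P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of attention on source w.
import torch

def final_distribution(p_vocab, attn, src_ids, p_gen):
    """p_vocab: (batch, vocab)   decoder's generation distribution
    attn:    (batch, src_len) attention over source tokens
    src_ids: (batch, src_len) vocabulary ids of source tokens
    p_gen:   (batch, 1)       probability of generating vs. copying"""
    copy_dist = torch.zeros_like(p_vocab)
    # Scatter attention mass onto the vocabulary ids of the source words,
    # summing when a word occurs more than once in the source.
    copy_dist.scatter_add_(1, src_ids, attn)
    return p_gen * p_vocab + (1.0 - p_gen) * copy_dist

p_vocab = torch.softmax(torch.randn(2, 50), dim=-1)
attn = torch.softmax(torch.randn(2, 7), dim=-1)
src_ids = torch.randint(0, 50, (2, 7))
p_gen = torch.sigmoid(torch.randn(2, 1))
print(final_distribution(p_vocab, attn, src_ids, p_gen).sum(dim=-1))  # rows sum to 1
```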

Read + Verify: Machine Reading Comprehension with Unanswerable Questions

This work proposes a novel read-then-verify system, which not only utilizes a neural reader to extract candidate answers and produce no-answer probabilities, but also leverages an answer verifier to decide whether the predicted answer is entailed by the input snippets.
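The read-then-verify decision can be caricatured as combining the reader's no-answer probability with the verifier's entailment score; the combination rule and threshold below are assumptions, not the paper's exact formulation:

```python
# Toy decision rule: the reader proposes a span plus a no-answer probability,
# the verifier estimates how strongly the snippets entail that span.
def decide(answer_span: str, no_answer_prob: float,
           verifier_entailment_prob: float, threshold: float = 0.5) -> str:
    answerable_score = (1.0 - no_answer_prob) * verifier_entailment_prob
    return answer_span if answerable_score >= threshold else "<no answer>"

print(decide("in 1953", no_answer_prob=0.2, verifier_entailment_prob=0.9))
print(decide("in 1953", no_answer_prob=0.7, verifier_entailment_prob=0.3))
```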