Repurposing Entailment for Multi-Hop Question Answering Tasks

@inproceedings{Trivedi2019RepurposingEF,
  title={Repurposing Entailment for Multi-Hop Question Answering Tasks},
  author={Harsh Trivedi and Heeyoung Kwon and Tushar Khot and Ashish Sabharwal and Niranjan Balasubramanian},
  booktitle={NAACL-HLT},
  year={2019}
}
Question Answering (QA) naturally reduces to an entailment problem, namely, verifying whether some text entails the answer to a question. However, for multi-hop QA tasks, which require reasoning with multiple sentences, it remains unclear how best to utilize entailment models pre-trained on large-scale datasets such as SNLI, which are based on sentence pairs. We introduce Multee, a general architecture that can effectively use entailment models for multi-hop QA tasks. Multee uses (i) a local…
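
The reduction described in the abstract can be illustrated with a minimal sketch (this is not the authors' Multee implementation): each context sentence is treated as a premise and scored against a question-plus-answer hypothesis using an off-the-shelf sentence-pair NLI model, and the per-sentence entailment scores are then aggregated. The model name (roberta-large-mnli), the hypothesis construction, the helper names entailment_prob and answer_score, and the max aggregation are all illustrative assumptions; Multee itself learns importance-weighted aggregation across sentences.

# Minimal sketch, assuming an off-the-shelf pre-trained NLI model; not the
# authors' Multee architecture. Each context sentence is a premise, the
# question+answer string is the hypothesis, and per-sentence entailment
# probabilities are aggregated with a simple max.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "roberta-large-mnli"  # assumed sentence-pair NLI model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)
model.eval()

def entailment_prob(premise: str, hypothesis: str) -> float:
    """Probability that `premise` entails `hypothesis` under the NLI model."""
    inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        probs = torch.softmax(model(**inputs).logits, dim=-1)[0]
    label2id = {k.lower(): v for k, v in model.config.label2id.items()}
    return probs[label2id["entailment"]].item()

def answer_score(context_sentences, question, answer):
    """Aggregate sentence-level entailment scores. Multee learns importance
    weights over sentences; this sketch just takes the max as a stand-in."""
    hypothesis = f"{question} {answer}"
    return max(entailment_prob(s, hypothesis) for s in context_sentences)

if __name__ == "__main__":
    context = [
        "Alice moved to Paris in 2010.",
        "Two years later she began working at a museum.",
    ]
    question = "When did Alice start working at the museum?"
    print(answer_score(context, question, "In 2012."))  # expected to score higher
    print(answer_score(context, question, "In 2009."))  # expected to score lower

Note that answering the example question correctly requires combining both context sentences, which is exactly the multi-hop setting where a single sentence-pair entailment call falls short and a learned aggregation such as Multee's is needed.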