Answering questions by learning to rank - Learning to rank by answering questions

@article{pirtoaca2019answering,
  title={Answering questions by learning to rank - Learning to rank by answering questions},
  author={George-Sebastian Pirtoaca and Traian Rebedea and Stefan Ruseti},
  journal={ArXiv},
  year={2019}
}
  • George-Sebastian Pirtoaca, Traian Rebedea, Stefan Ruseti
  • Published 2019
  • Computer Science
  • ArXiv
  • Answering multiple-choice questions in a setting in which no supporting documents are explicitly provided remains a core problem in natural language processing. The contribution of this article is twofold. First, it describes a method that can be used to semantically rank documents extracted from Wikipedia or similar natural language corpora. Second, we propose a model employing the semantic ranking that holds first place in two of the most popular leaderboards for answering…
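The abstract is truncated, but the core task it describes — scoring retrieved documents by how well they support a question together with a candidate answer — can be illustrated with a minimal lexical sketch. The toy TF-IDF ranker below is an illustrative assumption only (the function names and the similarity measure are not from the paper, whose ranking is learned rather than hand-crafted):

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors for a list of tokenized documents."""
    n = len(docs)
    # Document frequency: in how many documents each term appears.
    df = Counter(t for doc in docs for t in set(doc))
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: (c / len(doc)) * math.log(n / df[t])
                     for t, c in tf.items()})
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse vectors (dicts)."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_documents(question, choice, documents):
    """Rank documents by similarity to the question + answer choice.

    Returns (score, document) pairs, best match first.
    """
    query = (question + " " + choice).lower().split()
    corpus = [query] + [d.lower().split() for d in documents]
    vecs = tfidf_vectors(corpus)
    scored = [(cosine(vecs[0], v), doc)
              for v, doc in zip(vecs[1:], documents)]
    return sorted(scored, reverse=True)
```

For example, ranking three short passages against the question "What process do plants use to make food?" with the candidate answer "photosynthesis" places the passage mentioning photosynthesis first. A learned ranker, as the abstract suggests, would replace this fixed similarity with a trained scoring model.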