Corpus ID: 219179633

Context-based Transformer Models for Answer Sentence Selection

@article{Lauriola2020ContextbasedTM,
  title={Context-based Transformer Models for Answer Sentence Selection},
  author={Ivano Lauriola and Alessandro Moschitti},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.01285}
}
An important task for the design of Question Answering systems is the selection of the sentence containing (or constituting) the answer from documents relevant to the asked question. Most previous work has only used the target sentence to compute its score with the question, as the models were not powerful enough to also effectively encode additional contextual information. In this paper, we analyze the role of the contextual information in the sentence selection task, proposing a Transformer…
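The answer sentence selection setup described in the abstract amounts to scoring each candidate sentence against the question, optionally together with surrounding context, and returning the top-scoring one. A minimal illustrative sketch of that setup follows; the toy lexical-overlap scorer is a stand-in assumption, where the paper's actual model would encode the (question, sentence, context) triple with a Transformer:

```python
import re

def tokens(text: str) -> set:
    """Lowercased word tokens of a string."""
    return set(re.findall(r"\w+", text.lower()))

def score(question: str, sentence: str, context: str = "") -> float:
    """Toy relevance score: fraction of question tokens found in the
    candidate sentence plus its context. A real system would instead
    feed the concatenated triple through a Transformer cross-encoder."""
    q = tokens(question)
    s = tokens(sentence + " " + context)
    return len(q & s) / max(len(q), 1)

def select_answer(question, candidates):
    """Rank (sentence, context) pairs by score; return the best sentence."""
    return max(candidates, key=lambda c: score(question, c[0], c[1]))[0]

question = "Who wrote Hamlet?"
candidates = [
    ("Hamlet is a tragedy.", "It is often performed."),
    ("William Shakespeare wrote Hamlet.", "He was an English playwright."),
]
print(select_answer(question, candidates))
# → William Shakespeare wrote Hamlet.
```

Note how the context string participates in scoring on equal footing with the candidate sentence; the paper's contribution is precisely analyzing how much such context helps when the encoder is powerful enough to use it.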
