Unifying Question Answering and Text Classification via Span Extraction

@article{Keskar2019UnifyingQA,
  title={Unifying Question Answering and Text Classification via Span Extraction},
  author={Nitish Shirish Keskar and Bryan McCann and Caiming Xiong and Richard Socher},
  journal={ArXiv},
  year={2019},
  volume={abs/1904.09286}
}
Even as pre-trained language encoders such as BERT are shared across many tasks, the output layers of question answering and text classification models are significantly different. Span decoders are frequently used for question answering and fixed-class classification layers for text classification. We show that this distinction is not necessary, and that both can be unified as span extraction. A unified, span-extraction approach leads to superior or comparable performance in multi-task…
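The reformulation the abstract describes can be illustrated with a minimal sketch: instead of predicting a class index, the candidate labels are placed in the model's input text, and classification reduces to extracting the span covering the correct label. The function names below are illustrative, not from the paper, and the span selection here is over characters for simplicity (a real model would score token spans with a span decoder).

```python
def build_span_extraction_input(sentence, labels):
    """Concatenate the candidate class labels with the sentence.

    A span decoder can then treat classification as extracting one
    label as a sub-span of this combined input.
    """
    options = " ".join(labels)
    return f"{options} {sentence}"


def gold_span(input_text, gold_label):
    """Character-level (start, end) span of the gold label in the input."""
    start = input_text.index(gold_label)
    return start, start + len(gold_label)


# Example: a sentiment classification instance recast as span extraction.
text = build_span_extraction_input("the movie was wonderful", ["positive", "negative"])
start, end = gold_span(text, "positive")
# The supervision target is now a span over the input, exactly as in
# question answering, so the same output layer serves both tasks.
```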

Citations

Publications citing this paper.

BAM! Born-Again Multi-Task Networks for Natural Language Understanding

  • ACL
  • 2019

References

Publications referenced by this paper.
Showing 1-10 of 33 references.

The Fifth PASCAL Recognizing Textual Entailment Challenge

Luisa Bentivogli, Peter Clark, Ido Dagan, Danilo Giampiccolo.
  • TAC.
  • 2009

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin.
  • NIPS.
  • 2017

