Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering

@inproceedings{Wang2018MultiGranularityHA,
  title={Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering},
  author={Wei Wang and Ming Yan and Chen Wu},
  booktitle={ACL},
  year={2018}
}
This paper describes a novel hierarchical attention network for reading-comprehension-style question answering, which aims to answer questions about a given narrative paragraph. In the proposed method, attention and fusion are conducted horizontally and vertically across layers, at different levels of granularity, between the question and the paragraph. Specifically, it first encodes the question and paragraph with fine-grained language embeddings, to better capture the respective representations at…
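As a rough illustration of the question–paragraph attention and fusion the abstract describes, the sketch below implements a single co-attention layer with a common concatenation-style fusion. This is a minimal NumPy sketch of the general technique, not the paper's exact multi-granularity architecture; the function name `coattention_fuse` and the specific fusion features are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def coattention_fuse(P, Q):
    """Attend paragraph P (m x d) to question Q (n x d), then fuse.

    Illustrative sketch only: one attention direction and a simple
    concatenation fusion, not the paper's full hierarchical design.
    """
    S = P @ Q.T                    # similarity matrix (m x n)
    A = softmax(S, axis=1)         # paragraph-to-question attention weights
    Q_ctx = A @ Q                  # question summary per paragraph token (m x d)
    # fusion: concatenate raw, attended, and interaction features
    return np.concatenate([P, Q_ctx, P * Q_ctx, P - Q_ctx], axis=1)  # (m x 4d)

rng = np.random.default_rng(0)
P = rng.standard_normal((5, 8))    # 5 paragraph tokens, dim 8
Q = rng.standard_normal((3, 8))    # 3 question tokens, dim 8
fused = coattention_fuse(P, Q)
print(fused.shape)                 # (5, 32)
```

In the paper's framing, layers like this are stacked and applied at several granularities (word, phrase, sentence), with fusion both within and across layers.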


Citations

Publications citing this paper (showing 1-10 of 44 citations):

Machine Reading Comprehension on SQuAD 2.0

Yancheng Li, Shichang Zhang
  • 2019

ERNIE 2.0: A Continual Pre-training Framework for Language Understanding

  • arXiv
  • 2019

Integrated Triaging for Fast Reading Comprehension


A Multiple Granularity Co-Reasoning Model for Multi-choice Reading Comprehension

  • 2019 International Joint Conference on Neural Networks (IJCNN)
  • 2019

References

Publications referenced by this paper (showing 1-10 of 31 references):

Deep Contextualized Word Representations
