Discourse Parsing with Attention-based Hierarchical Neural Networks

@inproceedings{Li2016DiscoursePW,
  title={Discourse Parsing with Attention-based Hierarchical Neural Networks},
  author={Qi Li and Tianshi Li and Baobao Chang},
  booktitle={EMNLP},
  year={2016}
}
RST-style document-level discourse parsing remains a difficult task, and efficient deep learning models for this task have rarely been presented. In this paper, we propose an attention-based hierarchical neural network model for discourse parsing. We also incorporate a tensor-based transformation function to model complicated feature interactions. Experimental results show that our approach obtains performance comparable to contemporary state-of-the-art systems with little manual feature…
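The abstract does not spell out the form of the tensor-based transformation, so the sketch below is only a rough illustration: a bilinear tensor layer of the common neural-tensor-network form, written in plain NumPy. The function name, shapes, and exact parameterization are assumptions, not the paper's definition; the point is simply how such a layer can combine two span representations while capturing multiplicative feature interactions.

import numpy as np

def tensor_transform(e1, e2, W_tensor, V, b):
    # Illustrative bilinear tensor transformation (assumed form, not taken
    # from the paper):  f(e1, e2) = tanh(e1^T W^[1:k] e2 + V [e1; e2] + b)
    # W_tensor: (k, d, d) slices of bilinear interaction weights
    # V:        (k, 2d)  linear weights over the concatenated inputs
    # b:        (k,)     bias
    bilinear = np.einsum('i,kij,j->k', e1, W_tensor, e2)  # k bilinear interaction scores
    linear = V @ np.concatenate([e1, e2])                 # standard feed-forward term
    return np.tanh(bilinear + linear + b)

# Illustrative usage with random parameters (d = 4 input dims, k = 3 output dims)
rng = np.random.default_rng(0)
d, k = 4, 3
e1, e2 = rng.standard_normal(d), rng.standard_normal(d)
out = tensor_transform(e1, e2,
                       rng.standard_normal((k, d, d)),
                       rng.standard_normal((k, 2 * d)),
                       rng.standard_normal(k))
print(out.shape)  # (3,)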
This paper has 17 citations.



Statistics

[Chart: Citations per Year, 2017–2018]

Citation Velocity: 10, averaging 10 citations per year over the last 2 years.
