When Are Tree Structures Necessary for Deep Learning of Representations?

@inproceedings{Li2015WhenAT,
  title={When Are Tree Structures Necessary for Deep Learning of Representations?},
  author={Jiwei Li and Thang Luong and Daniel Jurafsky and Eduard H. Hovy},
  booktitle={EMNLP},
  year={2015}
}
Recursive neural models, which use syntactic parse trees to recursively generate representations bottom-up, are a popular architecture. But there have not been rigorous evaluations showing for exactly which tasks this syntax-based method is appropriate. In this paper we benchmark recursive neural models against sequential recurrent neural models (simple recurrent and LSTM models). We investigate 4 tasks: (1) sentiment classification at the sentence level and phrase level; (2) matching questions…
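
To make the comparison concrete: the recursive models in question compose a sentence representation bottom-up along a (typically binarized) parse tree, whereas the sequential baselines consume the same words left to right. The sketch below, with toy dimensions, random weights, and an invented example sentence (none of it from the paper's code), illustrates the structural difference; the paper's LSTM variant would replace the simple tanh transition with LSTM gating.

```python
import numpy as np

# Minimal sketch of the two model families benchmarked in the paper.
# All names, dimensions, and weights here are illustrative assumptions.
rng = np.random.default_rng(0)
D = 8                                            # hidden/embedding size (assumed)
W_rec = rng.standard_normal((D, 2 * D)) * 0.1    # recursive composition weights
W_seq = rng.standard_normal((D, 2 * D)) * 0.1    # recurrent transition weights
embed = {w: rng.standard_normal(D) * 0.1         # toy word embeddings
         for w in ["the", "movie", "was", "great"]}

def recursive_rep(node):
    """Recursive (tree) model: leaves map to embeddings; each internal
    node is tanh(W_rec [left; right]) over its children's vectors."""
    if isinstance(node, str):
        return embed[node]
    left, right = node
    children = np.concatenate([recursive_rep(left), recursive_rep(right)])
    return np.tanh(W_rec @ children)

def sequential_rep(words):
    """Sequential (simple recurrent) model: h_t = tanh(W_seq [h_{t-1}; x_t])."""
    h = np.zeros(D)
    for w in words:
        h = np.tanh(W_seq @ np.concatenate([h, embed[w]]))
    return h

# Toy binarized parse of "the movie was great": ((the movie) (was great))
tree = (("the", "movie"), ("was", "great"))
print(recursive_rep(tree))                       # tree-structured sentence vector
print(sequential_rep(["the", "movie", "was", "great"]))  # sequential sentence vector
```

Both functions return a D-dimensional sentence vector; the question the paper asks is for which tasks the tree-structured composition actually buys anything over the sequential one.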

Statistics

[Chart: Citations per Year, 2015–2018]

Semantic Scholar estimates that this publication has 159 citations based on the available data.
