Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank

Abstract

Semantic word spaces have been very useful but cannot express the meaning of longer phrases in a principled way. Further progress towards understanding compositionality in tasks such as sentiment detection requires richer supervised training and evaluation resources and more powerful models of composition. To remedy this, we introduce a Sentiment Treebank. It includes fine-grained sentiment labels for 215,154 phrases in the parse trees of 11,855 sentences and presents new challenges for sentiment compositionality. To address them, we introduce the Recursive Neural Tensor Network. When trained on the new treebank, this model outperforms all previous methods on several metrics. It pushes the state of the art in single-sentence positive/negative classification from 80% up to 85.4%. The accuracy of predicting fine-grained sentiment labels for all phrases reaches 80.7%, an improvement of 9.7% over bag-of-features baselines. Lastly, it is the only model that can accurately capture the effects of negation and its scope at various tree levels for both positive and negative phrases.
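
To make the composition step concrete: the Recursive Neural Tensor Network combines the vectors a and b of a node's two children as p = tanh([a;b]^T V [a;b] + W[a;b]), where V is a third-order tensor and W is the standard recursive-network matrix, and applies this rule bottom-up over the parse tree. Below is a minimal NumPy sketch of a single such step; the names rntn_compose and bias are illustrative assumptions (the paper omits the bias term), not from a released implementation.

```python
import numpy as np

def rntn_compose(a, b, V, W, bias):
    """One RNTN composition step: p = tanh([a;b]^T V [a;b] + W[a;b]).

    a, b : (d,) vectors of the left and right children in the parse tree
    V    : (2d, 2d, d) tensor capturing multiplicative child interactions
    W    : (d, 2d) standard recursive neural network weight matrix
    bias : (d,) bias vector (an assumption; not in the paper's equation)
    """
    c = np.concatenate([a, b])                      # stack children -> (2d,)
    # Each slice V[:, :, k] contributes one bilinear score c^T V_k c.
    tensor_term = np.einsum("i,ijk,j->k", c, V, c)  # -> (d,)
    return np.tanh(tensor_term + W @ c + bias)      # parent vector for the next level
```

The parent vector p is scored by a softmax classifier for its sentiment label and is also passed upward as a child in the next composition. The bilinear tensor term lets child vectors interact multiplicatively, which is what allows the model to capture effects such as negation flipping the sentiment of the phrase in its scope.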
