Corpus ID: 990233

Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank

@inproceedings{Socher2013RecursiveDM,
  title={Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank},
  author={Richard Socher and Alex Perelygin and Jean Wu and Jason Chuang and Christopher D. Manning and Andrew Y. Ng and Christopher Potts},
  booktitle={EMNLP},
  year={2013}
}
Semantic word spaces have been very useful but cannot express the meaning of longer phrases in a principled way. [...] To address them, we introduce the Recursive Neural Tensor Network. When trained on the new treebank, this model outperforms all previous methods on several metrics. It pushes the state of the art in single-sentence positive/negative classification from 80% up to 85.4%. The accuracy of predicting fine-grained sentiment labels for all phrases reaches 80.7%, an improvement of 9.7% over…
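The abstract names the Recursive Neural Tensor Network (RNTN), whose core operation composes two child vectors a and b into a parent vector via p = tanh([a;b]ᵀ V [a;b] + W [a;b]), where V is a third-order tensor and W the standard recursive-net matrix. A minimal NumPy sketch of that composition step, with illustrative dimensions and randomly initialized parameters (the variable names below are ours, not from the authors' released code):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # word-vector dimensionality (illustrative; the paper uses larger d)

# Parameters, randomly initialized for the sketch:
W = rng.standard_normal((d, 2 * d)) * 0.1          # standard recursive-net weight
V = rng.standard_normal((d, 2 * d, 2 * d)) * 0.1   # composition tensor, one 2d x 2d slice per output unit

def rntn_compose(a, b):
    """Compose two child vectors into a parent vector, RNTN-style:
    p = tanh(c^T V^[1:d] c + W c), with c = [a; b]."""
    c = np.concatenate([a, b])                      # (2d,)
    tensor_term = np.einsum("i,kij,j->k", c, V, c)  # c^T V^[k] c for each slice k
    return np.tanh(tensor_term + W @ c)             # (d,)

a = rng.standard_normal(d)  # left child (e.g. a word vector)
b = rng.standard_normal(d)  # right child
p = rntn_compose(a, b)      # parent vector, same dimensionality as the children
```

In the full model this step is applied bottom-up over the parse tree, and a softmax classifier on each node's vector predicts that node's sentiment label.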

Citations

Recursive Autoencoder with HowNet Lexicon for Sentence-Level Sentiment Analysis
A new method is proposed that trains the model on fully labeled parse trees using supervised learning without manual annotation, which not only significantly reduces the burden of manual labeling but also allows the compositionality to capture syntactic and semantic information jointly.
Recurrent versus Recursive Approaches Towards Compositionality in Semantic Vector Spaces
Semantic vector spaces have long been useful for representing word tokens; however, they cannot express the meaning of longer phrases without some notion of compositionality. Recursive neural models…
Learning vector representations for sentences: The recursive deep learning approach
This dissertation aims at extending the RNN model by allowing information to flow in a parse tree not only bottom-up but also top-down, such that both the content and context of a constituent can be recursively encoded in vectors.
A Sentiment Treebank and Morphologically Enriched Recursive Deep Models for Effective Sentiment Analysis in Arabic
This article evaluates the use of deep learning advances, namely Recursive Neural Tensor Networks (RNTN), for sentiment analysis in Arabic as a case study of morphologically rich languages (MRLs), and proposes the creation of the first Arabic Sentiment Treebank (ArSenTB), which is morphologically and orthographically enriched.
Adaptive Multi-Compositionality for Recursive Neural Models with Applications to Sentiment Analysis
The results illustrate that AdaMC significantly outperforms state-of-the-art sentiment classification methods and helps push the best accuracy of sentence-level negative/positive classification from 85.4% up to 88.5%.
Combine HowNet lexicon to train phrase recursive autoencoder for sentence-level sentiment analysis
Compared with RAE and supervised methods such as support vector machines (SVM) and naive Bayes on English and Chinese datasets, the experimental results show that CHL-PRAE provides the best performance for sentence-level sentiment analysis.
Fine-Grained Sentiment Rating of Online Reviews with Deep-RNN
A fine-grained sentiment rating of online reviews based on Deep-RNN is proposed, and the effect of tuning hyper-parameters on the performance of the network is investigated.
Leveraging Multi-grained Sentiment Lexicon Information for Neural Sequence Models
A novel and general method to incorporate lexicon information, including sentiment lexicons (+/-), negation words, and intensifiers, which can increase classification accuracy for neural sequence models on both the SST-5 and MR datasets.
Deep Convolutional Neural Networks for Sentiment Analysis of Short Texts
A new deep convolutional neural network is proposed that exploits character- to sentence-level information to perform sentiment analysis of short texts, achieving state-of-the-art results for single-sentence sentiment prediction.
Encoding Syntactic Knowledge in Neural Networks for Sentiment Classification
This article proposes to learn tag-specific composition functions and tag embeddings in recursive neural networks, and to utilize POS tags to control the gates of tree-structured LSTM networks.

References

Showing 1–10 of 55 references
Semantic Compositionality through Recursive Matrix-Vector Spaces
A recursive neural network model is introduced that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length, and can learn the meaning of operators in propositional logic and natural language.
Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks
Natural language parsing has typically been done with small sets of discrete categories such as NP and VP, but this representation does not capture the full syntactic or semantic richness of…
Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions
A novel machine learning framework based on recursive autoencoders for sentence-level prediction of sentiment label distributions that outperforms other state-of-the-art approaches on commonly used datasets, without using any pre-defined sentiment lexica or polarity shifting rules.
Compositional Matrix-Space Models for Sentiment Analysis
This paper presents the first such algorithm for learning a matrix-space model for semantic composition; its experimental results show statistically significant improvements in performance over a bag-of-words model.
Distributional Memory: A General Framework for Corpus-Based Semantics
The Distributional Memory approach is shown to be tenable despite the constraints imposed by its multi-purpose nature, and it performs competitively against task-specific algorithms recently reported in the literature for the same tasks, as well as against several state-of-the-art methods.
Multi-entity Sentiment Scoring
We present a compositional framework for modelling entity-level sentiment (sub)contexts, and demonstrate how holistic multi-entity polarity scoring emerges as a by-product of compositional sentiment…
Dependency Tree-based Sentiment Classification using CRFs with Hidden Variables
Experimental results on sentiment classification of Japanese and English subjective sentences using conditional random fields with hidden variables showed that the method performs better than other methods based on bag-of-features.
Sentiment Composition
Sentiment classification of grammatical constituents can be explained in a quasi-compositional way. The classification of a complex constituent is derived via the classification of its component…
A unified architecture for natural language processing: deep neural networks with multitask learning
We describe a single convolutional neural network architecture that, given a sentence, outputs a host of language processing predictions: part-of-speech tags, chunks, named entity tags, semantic…
Improving Word Representations via Global Context and Multiple Word Prototypes
A new neural network architecture is presented which learns word embeddings that better capture the semantics of words by incorporating both local and global document context, and accounts for homonymy and polysemy by learning multiple embeddings per word.