Corpus ID: 990233

Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank

@inproceedings{Socher2013RecursiveDM,
  title={Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank},
  author={Richard Socher and Alex Perelygin and Jean Wu and Jason Chuang and Christopher D. Manning and Andrew Y. Ng and Christopher Potts},
  booktitle={EMNLP},
  year={2013}
}
Semantic word spaces have been very useful but cannot express the meaning of longer phrases in a principled way. […] To address them, we introduce the Recursive Neural Tensor Network. When trained on the new treebank, this model outperforms all previous methods on several metrics. It pushes the state of the art in single sentence positive/negative classification from 80% up to 85.4%. The accuracy of predicting fine-grained sentiment labels for all phrases reaches 80.7%, an improvement of 9.7% over…
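The Recursive Neural Tensor Network named in the abstract composes two child phrase vectors into a parent vector with a tensor-based bilinear term on top of the usual affine layer. Below is a minimal pure-Python sketch of that composition, p = tanh([a;b]ᵀV[a;b] + W[a;b] + bias); the function name and the toy values in the usage example are illustrative, not from the paper.

```python
import math

def rntn_compose(a, b, V, W, bias):
    """Compose two child phrase vectors a, b (each length d) into a parent
    vector of length d, RNTN-style. V is a d x 2d x 2d tensor (one 2d x 2d
    slice per output component), W is a d x 2d matrix, bias has length d."""
    c = a + b  # list concatenation: the stacked children [a; b], length 2d
    n = len(c)
    p = []
    for k in range(len(bias)):
        # bilinear term: c^T V[k] c, using the k-th tensor slice
        bilinear = sum(c[i] * V[k][i][j] * c[j]
                       for i in range(n) for j in range(n))
        # standard affine term: (W c)_k + bias_k
        affine = sum(W[k][j] * c[j] for j in range(n)) + bias[k]
        p.append(math.tanh(bilinear + affine))
    return p

# Toy usage with d = 2: a zero tensor reduces the model to a plain
# recursive neural network layer.
d = 2
a, b = [1.0, 0.0], [0.0, 1.0]
V = [[[0.0] * (2 * d) for _ in range(2 * d)] for _ in range(d)]
W = [[1.0, 0.0, 0.0, 0.0],
     [0.0, 0.0, 0.0, 1.0]]
p = rntn_compose(a, b, V, W, [0.0, 0.0])
print(p)  # [tanh(1.0), tanh(1.0)]
```

Applied bottom-up over a binarized parse tree, this rule yields a vector for every phrase node, which a softmax classifier can then label with a sentiment class.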

Citations

Recursive Autoencoder with HowNet Lexicon for Sentence-Level Sentiment Analysis
A new method is proposed that trains the model on fully labeled parse trees using supervised learning without manual annotation, which not only significantly reduces the burden of manual labeling but also allows the compositionality to capture syntactic and semantic information jointly.
Recurrent versus Recursive Approaches Towards Compositionality in Semantic Vector
This paper introduces a new type of recurrent attention mechanism that achieves 47.4% accuracy on the root-level sentiment analysis task on the Stanford Sentiment Treebank, outperforming the Recursive Neural Tensor Network's previous 45.7% accuracy on the same dataset.
Learning vector representations for sentences: The recursive deep learning approach
This dissertation aims at extending the RNN model by allowing information to flow in a parse tree not only bottom-up but also top-down such that both the content and context of a constituent can be recursively encoded in vectors.
Adaptive Multi-Compositionality for Recursive Neural Models with Applications to Sentiment Analysis
The results illustrate that AdaMC significantly outperforms state-of-the-art sentiment classification methods and helps push the best accuracy of sentence-level negative/positive classification from 85.4% up to 88.5%.
Fine-Grained Sentiment Rating of Online Reviews with Deep-RNN
A fine-grained sentiment rating method for online reviews based on Deep-RNN is proposed, and the effect of tuning hyper-parameters on the performance of the network is investigated.
Deep Convolutional Neural Networks for Sentiment Analysis of Short Texts
A new deep convolutional neural network is proposed that exploits character- to sentence-level information to perform sentiment analysis of short texts and achieves state-of-the-art results for single-sentence sentiment prediction.
Encoding Syntactic Knowledge in Neural Networks for Sentiment Classification
This article proposes to learn tag-specific composition functions and tag embeddings in recursive neural networks, and proposes to utilize POS tags to control the gates of tree-structured LSTM networks.
Adaptive Semantic Compositionality for Sentence Modelling
A parameterized compositional switch is introduced, which outputs a scalar to adaptively determine whether the meaning of a phrase should be composed of its two constituents, and is evaluated on five datasets of sentiment classification.
Semantic Compositionality through Recursive Matrix-Vector Spaces
A recursive neural network model that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length and can learn the meaning of operators in propositional logic and natural language is introduced.
Embedding Projection for Targeted Cross-Lingual Sentiment: Model Comparisons and a Real-World Study
A cross-lingual approach to sentiment analysis is proposed that is applicable to under-resourced languages and takes target-level information into account, showing state-of-the-art performance at the sentence level when combined with machine translation.

References

Showing 1–10 of 55 references
Semantic Compositionality through Recursive Matrix-Vector Spaces
A recursive neural network model that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length and can learn the meaning of operators in propositional logic and natural language is introduced.
Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks
A recursive neural network architecture for jointly parsing natural language and learning vector space representations for variable-sized inputs and captures semantic information: For instance, the phrases “decline to comment” and “would not disclose the terms” are close by in the induced embedding space.
Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions
A novel machine learning framework based on recursive autoencoders for sentence-level prediction of sentiment label distributions that outperforms other state-of-the-art approaches on commonly used datasets, without using any pre-defined sentiment lexica or polarity-shifting rules.
Compositional Matrix-Space Models for Sentiment Analysis
This paper presents the first such algorithm for learning a matrix-space model for semantic composition, and its experimental results show statistically significant improvements in performance over a bag-of-words model.
Distributional Memory: A General Framework for Corpus-Based Semantics
The Distributional Memory approach is shown to be tenable despite the constraints imposed by its multi-purpose nature, and performs competitively against task-specific algorithms recently reported in the literature for the same tasks, and against several state-of-the-art methods.
Multi-entity Sentiment Scoring
We present a compositional framework for modelling entity-level sentiment (sub)contexts, and demonstrate how holistic multi-entity polarity scoring emerges as a by-product of compositional sentiment…
Dependency Tree-based Sentiment Classification using CRFs with Hidden Variables
Experimental results of sentiment classification of Japanese and English subjective sentences using conditional random fields with hidden variables showed that the method performs better than other methods based on bag-of-features.
A unified architecture for natural language processing: deep neural networks with multitask learning
We describe a single convolutional neural network architecture that, given a sentence, outputs a host of language processing predictions: part-of-speech tags, chunks, named entity tags, semantic…
Improving Word Representations via Global Context and Multiple Word Prototypes
A new neural network architecture is presented which learns word embeddings that better capture the semantics of words by incorporating both local and global document context, and accounts for homonymy and polysemy by learning multiple embeddings per word.
Seeing Stars: Exploiting Class Relationships for Sentiment Categorization with Respect to Rating Scales
A meta-algorithm is applied, based on a metric labeling formulation of the rating-inference problem, that alters a given n-ary classifier's output in an explicit attempt to ensure that similar items receive similar labels.