Attention-based LSTM Network for Cross-Lingual Sentiment Classification

@inproceedings{Zhou2016AttentionbasedLN,
  title={Attention-based LSTM Network for Cross-Lingual Sentiment Classification},
  author={Xinjie Zhou and Xiaojun Wan and J. Xiao},
  booktitle={EMNLP},
  year={2016}
}
Most state-of-the-art sentiment classification methods are based on supervised learning algorithms, which require large amounts of manually labeled data. [...] Key Method In each language, we use a Long Short-Term Memory (LSTM) network to model the documents, which has proven very effective for word sequences. Meanwhile, we propose a hierarchical attention mechanism for the bilingual LSTM network.
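The paper's hierarchical attention mechanism is not reproduced here; purely as an illustration of the general idea it builds on, below is a minimal sketch of attention-weighted pooling over LSTM hidden states. The function name `attention_pool` and the random inputs are hypothetical, not the authors' code.

```python
import numpy as np

def attention_pool(hidden_states, query):
    """Attention-weighted pooling over per-token hidden states.

    hidden_states: (seq_len, dim) array of LSTM outputs.
    query: (dim,) learned attention vector.
    Returns a (dim,) document representation.
    """
    scores = hidden_states @ query            # (seq_len,) relevance scores
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ hidden_states            # convex combination of states

rng = np.random.default_rng(0)
h = rng.normal(size=(5, 8))   # 5 tokens, 8-dim hidden states
q = rng.normal(size=8)
doc_vec = attention_pool(h, q)
```

With a zero query, the softmax weights are uniform, so the pooled vector is the plain mean of the hidden states; a trained query shifts weight toward sentiment-bearing tokens.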
Jointly Learning Bilingual Sentiment and Semantic Representations for Cross-Language Sentiment Classification
TLDR
This paper proposes an approach to jointly learning bilingual semantic and sentiment representations (BSSR) for English-Chinese CLSC, which could capture rich sentiment and semantic information in the BSSR learning process.
Chinese-Vietnamese bilingual news unsupervised sentiment classification based on word-weighted JST algorithm
TLDR
This paper proposes an unsupervised sentiment classification method based on a word-weighted Joint Sentiment/Topic (JST) model algorithm, which increases the impact of emotional words in the Gibbs sampling process and ultimately improves the accuracy of sentiment classification.
Bidirectional LSTM with self-attention mechanism and multi-channel features for sentiment classification
TLDR
A bidirectional LSTM model with a self-attention mechanism and multi-channel features (SAMF-BiLSTM) is proposed that can fully exploit the relationship between target words and sentiment-polarity words in a sentence, and does not rely on a manually organized sentiment lexicon.
Emoji-Powered Representation Learning for Cross-Lingual Sentiment Classification
TLDR
A novel representation learning method is proposed that uses emoji prediction as an instrument to learn sentiment-aware representations for each language, which are then integrated to facilitate cross-lingual sentiment classification.
Ensemble deep learning for aspect-based sentiment analysis
TLDR
A novel approach for aspect-based sentiment analysis that uses deep ensemble learning: it first builds four deep learning models (CNN, LSTM, BiLSTM, and GRU), then combines their outputs with a stacking ensemble approach.
Domain-Specific Versus General-Purpose Word Representations in Sentiment Analysis for Deep Learning Models
Sentiment analysis, also known as opinion mining or emotion artificial intelligence, has gradually become a hot topic in recent years due to its wide applications and also the…
Zero-Shot Learning for Cross-Lingual News Sentiment Classification
In this paper, we address the task of zero-shot cross-lingual news sentiment classification. Given the annotated dataset of positive, neutral, and negative news in Slovene, the aim is to develop a…
Cross Lingual Sentiment Analysis: A Clustering-Based Bee Colony Instance Selection and Target-Based Feature Weighting Approach
TLDR
An integrated learning model is proposed that combines semi-supervised and ensemble learning while exploiting available sentiment resources to tackle language-divergence issues, together with a clustering-based bee-colony sample-selection method for optimally choosing the most distinguishing features of the target data.
ρ-hot Lexicon Embedding-based Two-level LSTM for Sentiment Analysis
TLDR
A new encoding strategy, ρ-hot encoding, is proposed to alleviate the drawbacks of one-hot encoding and thus effectively incorporate useful lexical cues; the resulting model outperforms state-of-the-art algorithms.
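The paper's exact ρ-hot scheme is not specified in this summary; as a toy illustration only of the general idea of softening a one-hot vector with a weight ρ, consider the sketch below. The function name and the uniform-spreading rule are assumptions, not the authors' method.

```python
import numpy as np

def rho_hot(index, vocab_size, rho=0.8):
    """Toy soft one-hot: put weight rho on the word's own dimension
    and spread the remainder uniformly over the others.
    Illustration only; not the paper's exact rho-hot scheme."""
    v = np.full(vocab_size, (1.0 - rho) / (vocab_size - 1))
    v[index] = rho
    return v

vec = rho_hot(2, 5)  # word index 2 in a 5-word vocabulary
```

Unlike a strict one-hot vector, every dimension stays nonzero, which is one way such encodings can admit graded lexical cues.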
Sentiment Lexicon Enhanced Neural Sentiment Classification
TLDR
This paper proposes two approaches that exploit sentiment lexicons to enhance neural sentiment classification: one uses the lexicons to learn sentiment-aware attentions, and the other adds an auxiliary task that classifies the sentiments of lexicon words based on their word embeddings.

References

Showing 1-10 of 29 references
Learning Bilingual Sentiment Word Embeddings for Cross-language Sentiment Classification
TLDR
The proposed BSWE incorporates sentiment information from text into bilingual embeddings; high-quality BSWE can be learned simply from labeled corpora and their translations, without relying on large-scale parallel corpora.
Semi-Supervised Representation Learning for Cross-Lingual Text Classification
TLDR
This paper proposes a new cross-lingual adaptation approach for document classification based on learning cross-lingual discriminative distributed representations of words: it maximizes the log-likelihood of documents from both language domains under a cross-lingual log-bilinear document model, while minimizing the prediction log-losses of labeled documents.
Co-Training for Cross-Lingual Sentiment Classification
TLDR
A co-training approach is proposed to make use of unlabeled Chinese data for cross-lingual sentiment classification; it leverages an available English corpus as training data for Chinese sentiment classification.
A Mixed Model for Cross Lingual Opinion Analysis
TLDR
A mixed CLOA model is proposed that estimates the confidence of each monolingual opinion analysis system from its training errors through bilingual-transfer self-training and co-training, respectively, using the weighted average distance between samples and the classification hyperplane as the confidence.
A Convolutional Neural Network for Modelling Sentences
TLDR
A convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) is described that is adopted for the semantic modelling of sentences and induces a feature graph over the sentence that is capable of explicitly capturing short- and long-range relations.
An Autoencoder Approach to Learning Bilingual Word Representations
TLDR
This work explores the use of autoencoder-based methods for cross-language learning of vectorial word representations that are coherent between two languages, while not relying on word-level alignments, and achieves state-of-the-art performance.
Distributed Representations of Sentences and Documents
TLDR
Paragraph Vector is an unsupervised algorithm that learns fixed-length feature representations from variable-length pieces of text, such as sentences, paragraphs, and documents; its construction gives the algorithm the potential to overcome the weaknesses of bag-of-words models.
Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks
TLDR
The Tree-LSTM is introduced, a generalization of LSTMs to tree-structured network topologies that outperforms all existing systems and strong LSTM baselines on two tasks: predicting the semantic relatedness of two sentences and sentiment classification.
Convolutional Neural Networks for Sentence Classification
TLDR
The CNN models discussed herein improve upon the state of the art on 4 of 7 tasks, including sentiment analysis and question classification; a modification of the architecture is also proposed to allow the use of both task-specific and static vectors.
BilBOWA: Fast Bilingual Distributed Representations without Word Alignments
TLDR
It is shown that bilingual embeddings learned using the proposed BilBOWA model outperform state-of-the-art methods on a cross-lingual document classification task as well as a lexical translation task on WMT11 data.