• Publications
Aspect Level Sentiment Classification with Deep Memory Network
TLDR
A deep memory network for aspect level sentiment classification that explicitly captures the importance of each context word when inferring the sentiment polarity of an aspect, and is also fast.
Document Modeling with Gated Recurrent Neural Network for Sentiment Classification
TLDR
A neural network model is introduced to learn vector-based document representations in a unified, bottom-up fashion, and it dramatically outperforms standard recurrent neural networks in document modeling for sentiment classification.
Effective LSTMs for Target-Dependent Sentiment Classification
TLDR
Two target-dependent long short-term memory models, in which target information is automatically taken into account, are developed and achieve state-of-the-art performance without using a syntactic parser or external sentiment lexicons.
Learning Sentiment-Specific Word Embedding for Twitter Sentiment Classification
TLDR
Three neural networks are developed to effectively incorporate supervision from the sentiment polarity of text (e.g., sentences or tweets) into their loss functions, and the performance of SSWE is further improved by concatenating SSWE with an existing feature set.
Pre-Training with Whole Word Masking for Chinese BERT
TLDR
This technical report adapts whole word masking to Chinese text, masking entire words rather than individual Chinese characters, which poses a harder challenge for the Masked Language Model (MLM) pre-training task.
Learning Semantic Representations of Users and Products for Document Level Sentiment Classification
TLDR
By combining evidence at the user, product, and document level in a unified neural framework, the proposed model achieves state-of-the-art performance on the IMDB and Yelp datasets.
Attention-over-Attention Neural Networks for Reading Comprehension
TLDR
Experimental results show that the proposed attention-over-attention model significantly outperforms various state-of-the-art systems by a large margin on public datasets such as the CNN and Children's Book Test datasets.
Deep Learning for Event-Driven Stock Prediction
TLDR
This work proposes a deep learning method for event-driven stock market prediction that achieves nearly 6% improvement on both S&P 500 index prediction and individual stock prediction compared to state-of-the-art baseline methods.
CodeBERT: A Pre-Trained Model for Programming and Natural Languages
TLDR
This work develops CodeBERT with a Transformer-based neural architecture and trains it with a hybrid objective function that incorporates the pre-training task of replaced token detection, which is to detect plausible alternatives sampled from generators.
LTP: A Chinese Language Technology Platform
TLDR
LTP (Language Technology Platform) is an integrated Chinese processing platform that includes a suite of high-performance natural language processing modules and relevant corpora, which have achieved good results in relevant evaluations such as CoNLL and SemEval.