• Publications
Chinese NER Using Lattice LSTM
TLDR
A lattice-structured LSTM model for Chinese NER is investigated, which encodes a sequence of input characters as well as all potential words that match a lexicon, and outperforms both word-based and character-based LSTM baselines.
Deep Learning for Event-Driven Stock Prediction
TLDR
This work proposes a deep learning method for event-driven stock market prediction that achieves nearly 6% improvement on both S&P 500 index prediction and individual stock prediction compared to state-of-the-art baseline methods.
Using Structured Events to Predict Stock Price Movement: An Empirical Investigation
TLDR
This work proposes to adapt Open IE technology for event-based stock price movement prediction, extracting structured events from large-scale public news without manual effort, and outperforms bag-of-words baselines and previous systems trained on S&P 500 stock historical data.
Fast and Accurate Shift-Reduce Constituent Parsing
TLDR
This work proposes a simple yet effective extension to the shift-reduce process, which eliminates size differences between action sequences in beam-search and gives comparable accuracies to the state-of-the-art chart parsers.
A Graph-to-Sequence Model for AMR-to-Text Generation
TLDR
This work introduces a neural graph-to-sequence model, using a novel LSTM structure for directly encoding graph-level semantics, and shows superior results to existing methods in the literature.
Attention-based Recurrent Convolutional Neural Network for Automatic Essay Scoring
TLDR
This work builds a hierarchical sentence-document model to represent essays, using the attention mechanism to automatically decide the relative weights of words and sentences, and shows that the model outperforms the previous state-of-the-art methods.
Target-Dependent Twitter Sentiment Classification with Rich Automatic Features
TLDR
This paper shows that competitive results can be achieved without the use of syntax, by extracting a rich set of automatic features from a tweet using distributed word representations and neural pooling functions.
Neural Networks for Open Domain Targeted Sentiment
TLDR
This work empirically studies the effect of word embeddings and automatic feature combinations on the open domain targeted sentiment task by extending a CRF baseline using neural networks, and proposes a novel integration of neural and discrete features which combines their relative advantages, leading to significantly higher results compared to both baselines.
Gated Neural Networks for Targeted Sentiment Analysis
TLDR
A sentence-level neural model is proposed to address the limitation of pooling functions, which do not explicitly model tweet-level semantics; it gives significantly higher accuracies compared to the current best method for targeted sentiment analysis.
Sentence-State LSTM for Text Representation
TLDR
This work investigates an alternative LSTM structure for encoding text, which consists of a parallel state for each word, and shows that the proposed model has strong representation power, giving highly competitive performances compared to stacked BiLSTM models with similar parameter numbers.