• Publications
Learning Sentiment-Specific Word Embedding for Twitter Sentiment Classification
TLDR
We present a method that learns word embeddings for Twitter sentiment classification, encoding sentiment information in the continuous representation of words.
  • Citations: 967 (85 highly influential)
Gated Self-Matching Networks for Reading Comprehension and Question Answering
TLDR
In this paper, we present gated self-matching networks for reading-comprehension-style question answering, which aims to answer questions from a given passage.
  • Citations: 482 (83 highly influential)
Unified Language Model Pre-training for Natural Language Understanding and Generation
TLDR
This paper presents a new Unified pre-trained Language Model (UniLM) that can be fine-tuned for both natural language understanding and generation tasks.
  • Citations: 373 (61 highly influential)
Neural Question Generation from Text: A Preliminary Study
TLDR
We propose to apply a neural encoder-decoder model to generate answer-focused questions from natural language sentences.
  • Citations: 160 (46 highly influential)
Selective Encoding for Abstractive Sentence Summarization
TLDR
We propose a selective encoding model to extend the sequence-to-sequence framework for abstractive sentence summarization.
  • Citations: 184 (29 highly influential)
Neural Document Summarization by Jointly Learning to Score and Select Sentences
TLDR
We present a novel end-to-end neural network framework for extractive document summarization by jointly learning to score and select sentences.
  • Citations: 146 (24 highly influential)
Sentiment Embeddings with Applications to Sentiment Analysis
TLDR
We propose learning sentiment-specific word embeddings, dubbed sentiment embeddings.
  • Citations: 191 (18 highly influential)
Read + Verify: Machine Reading Comprehension with Unanswerable Questions
TLDR
We propose a novel read-then-verify system that not only utilizes a neural reader to extract candidate answers and produce no-answer probabilities, but also leverages an answer verifier to decide whether the predicted answer is entailed by the input snippets.
  • Citations: 87 (16 highly influential)
Modeling Mention, Context and Entity with Neural Networks for Entity Disambiguation
TLDR
We model variable-sized contexts with a convolutional neural network and embed the positions of context words to factor in the distance between each context word and the mention.
  • Citations: 146 (15 highly influential)
Sequence-to-Dependency Neural Machine Translation
TLDR
We propose a novel Sequence-to-Dependency Neural Machine Translation (SD-NMT) method, in which the target word sequence and its corresponding dependency structure are jointly constructed and modeled, and this structure is used as context to facilitate word generation.
  • Citations: 66 (10 highly influential)