• Publications
Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond
TLDR: This work proposes several novel models that address critical problems in summarization not adequately handled by the basic architecture, such as modeling keywords, capturing the sentence-to-word hierarchy, and emitting words that are rare or unseen at training time.
A Structured Self-attentive Sentence Embedding
TLDR: A new model for extracting an interpretable sentence embedding by introducing self-attention is proposed; it uses a 2-D matrix to represent the embedding, with each row of the matrix attending to a different part of the sentence.
Classifying Relations by Ranking with Convolutional Neural Networks
TLDR: This work proposes a new pairwise ranking loss function that makes it easy to reduce the impact of artificial classes, shows it to be more effective than a CNN followed by a softmax classifier, and finds that word embeddings alone are sufficient as input features to achieve state-of-the-art results.
Deep Convolutional Neural Networks for Sentiment Analysis of Short Texts
TLDR: A new deep convolutional neural network is proposed that exploits character- to sentence-level information to perform sentiment analysis of short texts, achieving state-of-the-art results for single-sentence sentiment prediction.
Attentive Pooling Networks
TLDR: Empirical results on three very different benchmark tasks of question answering/answer selection demonstrate that the proposed models outperform a variety of strong baselines and achieve state-of-the-art performance on all benchmarks.
Improved Representation Learning for Question Answer Matching
TLDR: This work develops hybrid models that process text with both convolutional and recurrent neural networks, combining their strengths at extracting linguistic information to address passage answer selection.
Latent Structure Perceptron with Feature Induction for Unrestricted Coreference Resolution
TLDR: A machine learning system based on the large-margin structure perceptron for unrestricted coreference resolution that introduces two key modeling techniques, latent coreference trees and entropy-guided feature induction, and achieves high performance with a linear learning algorithm.
Improved Neural Relation Detection for Knowledge Base Question Answering
TLDR: A hierarchical recurrent neural network enhanced by residual learning is proposed to detect KB relations given an input question; it helps the KBQA system achieve state-of-the-art accuracy on both single-relation and multi-relation QA benchmarks.
Learning Character-level Representations for Part-of-Speech Tagging
TLDR: A deep neural network is proposed that learns character-level representations of words and associates them with usual word representations to perform POS tagging, producing state-of-the-art POS taggers for two languages.
Detecting Semantically Equivalent Questions in Online User Forums
TLDR: The experimental results show that the convolutional neural network with in-domain word embeddings achieves high performance even with limited training data.