Publications
Interactive Attention Networks for Aspect-Level Sentiment Classification
TLDR: We propose the interactive attention networks (IAN) to interactively learn attentions in the contexts and targets, and generate the representations for targets and contexts separately.
  • Citations: 326 · Influence: 78
Word Similarity Computing Based on How-net (基於《知網》的辭彙語義相似度計算)
  • Qun Liu, Sujian Li · Int. J. Comput. Linguistics Chin. Lang. Process. · 2002
TLDR: Word similarity is broadly used in many applications, such as information retrieval, information extraction, text classification, word sense disambiguation, example-based machine translation, etc.
  • Citations: 247 · Influence: 28
Applying regression models to query-focused multi-document summarization
TLDR: We use Support Vector Regression (SVR) to estimate the importance of a sentence in a document set to be summarized through a set of pre-defined features.
  • Citations: 137 · Influence: 18
A Two-Stage Parsing Method for Text-Level Discourse Analysis
TLDR: We propose to use the transition-based model to parse the naked discourse tree (i.e., identifying span and nuclearity) due to data sparsity.
  • Citations: 43 · Influence: 18
A Dependency-Based Neural Network for Relation Classification
TLDR: We propose a new structure, termed augmented dependency path (ADP), which is composed of the shortest dependency path between two entities and the subtrees attached to the shortest path.
  • Citations: 176 · Influence: 16
Text-level Discourse Dependency Parsing
TLDR: In this paper, we present the limitations of constituency-based discourse parsing and first propose to use dependency structure to directly represent the relations between elementary discourse units (EDUs).
  • Citations: 62 · Influence: 16
Faithful to the Original: Fact Aware Neural Abstractive Summarization
TLDR: We focus on an increasingly intriguing task, i.e., abstractive sentence summarization (Rush, Chopra, and Weston 2015a), which generates a shorter version of a given sentence while attempting to preserve its original meaning.
  • Citations: 100 · Influence: 15
Ranking with Recursive Neural Networks and Its Application to Multi-Document Summarization
TLDR: We develop a Ranking framework upon Recursive Neural Networks (R2N2) to rank sentences for multi-document summarization.
  • Citations: 167 · Influence: 14
A Novel Neural Topic Model and Its Supervised Extension
TLDR: We propose a novel neural topic model (sNTM) where the representations of words and documents are efficiently and naturally combined into a uniform framework.
  • Citations: 98 · Influence: 13
Learning Summary Prior Representation for Extractive Summarization
TLDR: We propose the concept of summary prior to define how appropriate a sentence is to be selected into a summary, without considering its context.
  • Citations: 89 · Influence: 13