Attention-based LSTM for Aspect-level Sentiment Classification
This paper observes that the sentiment polarity of a sentence is determined not only by its content but also by the aspect in question, and proposes an attention-based Long Short-Term Memory network for aspect-level sentiment classification.
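The core mechanism — attending over LSTM hidden states conditioned on the aspect embedding — can be sketched as below. The function names, the simple concatenate-then-tanh scorer `w`, and the plain-list vectors are illustrative assumptions, not the paper's exact parameterization.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def aspect_attention(hidden_states, aspect_vec, w):
    """Score each LSTM hidden state against the aspect embedding and
    return the attention-weighted sentence representation.

    hidden_states: list of T hidden vectors, each of length d
    aspect_vec:    aspect embedding, length d_a
    w:             scoring weights over [h_t; aspect], length d + d_a
    """
    scores = []
    for h in hidden_states:
        concat = h + aspect_vec  # list concatenation plays the role of [h_t; aspect]
        scores.append(math.tanh(sum(wi * xi for wi, xi in zip(w, concat))))
    alphas = softmax(scores)
    d = len(hidden_states[0])
    # Weighted sum of hidden states gives the aspect-aware representation
    return [sum(a * h[i] for a, h in zip(alphas, hidden_states)) for i in range(d)]
```

With two hidden states that score equally against the aspect, the result is their plain average; changing the aspect vector shifts the weights, which is exactly how different aspects of the same sentence yield different representations.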
Emotional Chatting Machine: Emotional Conversation Generation with Internal and External Memory
This paper proposes the Emotional Chatting Machine (ECM), the first work to address the emotion factor in large-scale conversation generation, using three new mechanisms: emotion category embeddings that model the high-level abstraction of emotion expressions, an internal emotion memory, and an external emotion memory.
Commonsense Knowledge Aware Conversation Generation with Graph Attention
- Hao Zhou, Tom Young, Minlie Huang, Haizhou Zhao, Jingfang Xu, Xiaoyan Zhu
- Computer Science, IJCAI
- 1 July 2018
This is the first attempt that uses large-scale commonsense knowledge in conversation generation, and unlike existing models that use knowledge triples (entities) separately and independently, this model treats each knowledge graph as a whole, which encodes more structured, connected semantic information in the graphs.
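Treating a knowledge graph as a whole rather than as independent triples amounts to attending over all of the graph's triple vectors at once. The sketch below is a hedged illustration of that idea with a simple dot-product scorer; the names (`graph_vector`, `query`) and flat-list triple vectors are assumptions, not the paper's exact graph-attention formulation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def graph_vector(triple_vecs, query):
    """Encode one knowledge graph as a single vector by attending over
    all of its triple vectors jointly, instead of using each triple
    separately and independently.

    triple_vecs: list of triple embeddings (e.g., encodings of (h, r, t)), each length d
    query:       context vector from the conversation post, length d
    """
    scores = [sum(q * t for q, t in zip(query, tv)) for tv in triple_vecs]
    alphas = softmax(scores)
    d = len(triple_vecs[0])
    # The graph representation is the attention-weighted sum over its triples
    return [sum(a * tv[i] for a, tv in zip(alphas, triple_vecs)) for i in range(d)]
```

A neutral query averages the triples; a query aligned with one triple pulls the graph vector toward it, letting the post context decide which connected facts in the graph matter.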
Reinforcement Learning for Relation Classification From Noisy Data
Experimental results show that the proposed model handles noisy data effectively and achieves better performance for sentence-level relation classification.
Learning to Identify Review Spam
This paper exploits machine learning methods to identify review spam and presents a two-view semi-supervised method, co-training, to exploit the large amount of unlabeled data, showing that the proposed method is effective.
A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
A knowledge-enhanced pretraining model that utilizes commonsense knowledge from external knowledge bases to generate stories that are more reasonable than those of state-of-the-art baselines, particularly in terms of logic and global coherence.
TransG: A Generative Model for Knowledge Graph Embedding
This paper proposes a novel generative model (TransG) to address multiple relation semantics, i.e., the phenomenon that a relation may carry multiple meanings revealed by the entity pairs in its associated triples.
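The mixture view of a relation can be sketched as a score that combines several translation components. This is a hedged illustration only: TransG actually learns the number of components with a Bayesian nonparametric mixture, whereas here the components and mixing weights are fixed illustrative inputs.

```python
import math

def transg_score(head, tail, relation_components, weights):
    """Mixture-style triple score: the relation is a set of semantic
    components, each a translation vector r_m; each component scores the
    triple by exp(-||h + r_m - t||^2), and the components are combined
    with mixing weights.

    head, tail:          entity embeddings, length d
    relation_components: list of component translation vectors, each length d
    weights:             mixing weight per component (assumed to sum to 1)
    """
    score = 0.0
    for r, pi in zip(relation_components, weights):
        sq = sum((h + rm - t) ** 2 for h, rm, t in zip(head, r, tail))
        score += pi * math.exp(-sq)
    return score
```

With two components, the one whose translation actually maps head to tail dominates the score, which is how one relation can accommodate several distinct meanings.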
Structure-Aware Review Mining and Summarization
This paper proposes a new machine learning framework based on Conditional Random Fields that can employ rich features to jointly extract positive opinions, negative opinions and object features for review sentences and shows that structure-aware models outperform many state-of-the-art approaches to review mining.
SSP: Semantic Space Projection for Knowledge Graph Embedding with Text Descriptions
This paper proposes the semantic space projection (SSP) model, which jointly learns from symbolic triples and textual descriptions to discover semantic relevance and provide precise semantic embeddings.
A Hierarchical Framework for Relation Extraction with Reinforcement Learning
This paper presents a novel paradigm to deal with relation extraction by regarding the related entities as the arguments of a relation and applies a hierarchical reinforcement learning (HRL) framework in this paradigm to enhance the interaction between entity mentions and relation types.