• Publications
Relation Classification via Convolutional Deep Neural Network
TLDR
This paper exploits a convolutional deep neural network (DNN) to extract lexical and sentence-level features from the output of pre-existing natural language processing systems, and the approach significantly outperforms state-of-the-art methods.
Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks
TLDR
This paper proposes a novel model dubbed Piecewise Convolutional Neural Networks (PCNNs) with multi-instance learning to address the wrong label problem that arises when using distant supervision for relation extraction, and adopts a convolutional architecture with piecewise max pooling to automatically learn relevant features.
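The piecewise max pooling named in the TLDR can be illustrated with a minimal pure-Python sketch: instead of one max over the whole sentence, each filter's convolution output is split into three segments by the two entity positions and max-pooled per segment. The function name and the toy activations below are illustrative, not taken from the paper's code.

```python
def piecewise_max_pool(features, e1_pos, e2_pos):
    """Piecewise max pooling in the PCNN style.

    features: per-position activations of one convolution filter.
    e1_pos, e2_pos: token indices of the two entities (e1_pos < e2_pos).
    Returns a 3-element pooled vector: max over the segment up to and
    including entity 1, the segment between the entities, and the
    segment after entity 2.
    """
    segments = [
        features[: e1_pos + 1],             # up to and including entity 1
        features[e1_pos + 1 : e2_pos + 1],  # between the two entities
        features[e2_pos + 1 :],             # after entity 2
    ]
    # Empty segments (adjacent entities, entity at sentence end) pool to 0.
    return [max(seg) if seg else 0.0 for seg in segments]


# Toy activations for a 7-token sentence, entities at positions 1 and 4:
pooled = piecewise_max_pool([0.2, 0.9, 0.1, 0.5, 0.3, 0.8, 0.4], 1, 4)
# pooled == [0.9, 0.5, 0.8]
```

Concatenating the three maxima (rather than taking one global max) is what lets the model keep coarse positional information about the two entities.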
Recurrent Convolutional Neural Networks for Text Classification
TLDR
A recurrent convolutional neural network is introduced for text classification without human-designed features to capture contextual information as far as possible when learning word representations, which may introduce considerably less noise compared to traditional window-based neural networks.
Knowledge Graph Embedding via Dynamic Mapping Matrix
TLDR
A more fine-grained model named TransD is proposed as an improvement of TransR/CTransR; it considers the diversity of not only relations but also entities, which allows it to be applied to large-scale graphs.
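The dynamic mapping in TransD builds a projection matrix from two projection vectors, M = r_p h_p^T + I, so projecting an entity reduces to r_p (h_p · h) + h. A minimal pure-Python sketch for the case where entity and relation embeddings share the same dimension (the function name is illustrative):

```python
def transd_project(entity_vec, entity_proj, rel_proj):
    """Project an entity embedding into relation space, TransD style.

    M = rel_proj * entity_proj^T + I, so
    M @ entity_vec = rel_proj * (entity_proj . entity_vec) + entity_vec.
    Shown here for equal entity/relation dimensions; the matrix is never
    materialized, which is what keeps the model cheap on large graphs.
    """
    dot = sum(p * v for p, v in zip(entity_proj, entity_vec))
    return [rp * dot + v for rp, v in zip(rel_proj, entity_vec)]


projected = transd_project([1.0, 2.0], [0.5, 0.5], [1.0, 0.0])
# projected == [2.5, 2.0]
```

Because each entity carries its own projection vector, the mapping matrix differs per (entity, relation) pair, which is the "dynamic" and "fine-grained" aspect the TLDR refers to.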
Event Extraction via Dynamic Multi-Pooling Convolutional Neural Networks
TLDR
A word-representation model to capture meaningful semantic regularities for words and a framework based on a convolutional neural network to capture sentence-level clues are introduced.
Extracting Relational Facts by an End-to-End Neural Model with Copy Mechanism
TLDR
This paper proposes an end-to-end model based on sequence-to-sequence learning with a copy mechanism, which can jointly extract relational facts from sentences of any of these classes: Normal, EntityPairOverlap and SingleEntityOverlap.
Inner Attention based Recurrent Neural Networks for Answer Selection
TLDR
This work presents three new RNN models that add attention information before the RNN hidden representation, which show an advantage in sentence representation and achieve new state-of-the-art results on the answer selection task.
Knowledge Graph Completion with Adaptive Sparse Transfer Matrix
TLDR
Experimental results show that TranSparse outperforms Trans(E, H, R, and D) significantly, and achieves state-of-the-art performance on triplet classification and link prediction tasks.
Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions
TLDR
This paper proposes a sentence-level attention model to select valid instances, which makes full use of the supervision information from knowledge bases, and extracts entity descriptions from Freebase and Wikipedia pages to supplement background knowledge for the task.
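The sentence-level attention idea can be sketched generically: each sentence in a bag gets a relevance score against the target relation, the scores are softmax-normalized, and the bag representation is the weighted sum, so noisy sentences are down-weighted rather than discarded. A minimal pure-Python sketch under that generic reading (the scoring function itself is assumed given; it is not the exact formulation from this paper):

```python
import math


def sentence_attention(sentence_vecs, scores):
    """Attention-weighted bag representation over sentences.

    sentence_vecs: one encoded vector per sentence in the bag.
    scores: a relevance score per sentence w.r.t. the candidate relation
    (how these scores are computed is model-specific).
    Returns the softmax-weighted sum of the sentence vectors.
    """
    exps = [math.exp(s) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]          # softmax over the bag
    dim = len(sentence_vecs[0])
    return [
        sum(w * vec[d] for w, vec in zip(weights, sentence_vecs))
        for d in range(dim)
    ]


# Two sentences with equal scores contribute equally:
bag_vec = sentence_attention([[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0])
# bag_vec == [0.5, 0.5]
```

In contrast to at-least-one multi-instance learning, which keeps only the highest-scoring sentence, this soft selection lets every sentence contribute in proportion to its estimated validity.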
DCFEE: A Document-level Chinese Financial Event Extraction System based on Automatically Labeled Training Data
TLDR
A Document-level Chinese Financial Event Extraction (DCFEE) system is proposed which can automatically generate large-scale labeled data and extract events from the whole document.
...