Structure Regularized Neural Network for Entity Relation Classification for Chinese Literature Text

@inproceedings{Wen2018StructureRN,
  title={Structure Regularized Neural Network for Entity Relation Classification for Chinese Literature Text},
  author={Ji Wen and Xu Sun and Xuancheng Ren and Qi Su},
  booktitle={NAACL},
  year={2018}
}
Relation classification is an important semantic processing task in the field of natural language processing. In this paper, we propose the task of relation classification for Chinese literature text. A new dataset of Chinese literature text is constructed to facilitate the study of this task. We present a novel model, named Structure Regularized Bidirectional Recurrent Convolutional Neural Network (SR-BRCNN), to identify the relation between entities. The proposed model learns relation…
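Several of the models listed below, including SR-BRCNN and SDP-LSTM, operate on the shortest dependency path (SDP) between the two entities. As a minimal illustration of that preprocessing step, here is a stdlib-only sketch that finds the SDP in a toy head-index parse; the sentence, indices, and function name are hypothetical examples, not taken from the paper or its dataset.

```python
from collections import deque

# Toy dependency parse as a child -> head map (head 0 marks the root).
# Hypothetical sentence: "The author praised the hero of the novel"
#                          1    2      3       4   5    6  7   8
heads = {1: 2, 2: 3, 3: 0, 4: 5, 5: 3, 6: 5, 7: 8, 8: 6}

def shortest_dependency_path(heads, src, dst):
    """BFS over the undirected dependency tree from src to dst."""
    adj = {}
    for child, head in heads.items():
        if head == 0:
            continue  # skip the artificial root edge
        adj.setdefault(child, set()).add(head)
        adj.setdefault(head, set()).add(child)
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in adj.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # disconnected (should not happen in a well-formed tree)

# SDP between "author" (2) and "hero" (5) runs through the verb (3):
print(shortest_dependency_path(heads, 2, 5))  # [2, 3, 5]
```

In practice the head indices would come from a dependency parser rather than being written by hand; the BFS itself is parser-agnostic.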
Relation Classification via LSTMs Based on Sequence and Tree Structure
TLDR: Proposes a novel two-channel neural network architecture with an attention mechanism that makes better use of the information contained in sentences and achieves impressive improvements on relation classification compared with existing methods.
Entity Relation Extraction Based on Entity Indicators
TLDR: Designs task-related entity indicators that enable a deep neural network to concentrate on task-relevant information, achieving state-of-the-art performance on the ACE Chinese corpus, the ACE English corpus, and the Chinese literature text corpus.
Structurally Comparative Hinge Loss for Dependency-Based Neural Text Representation
TLDR: Shows that substituting SCH loss for CE loss on various tasks, for both induced structures and structures from an external parser, improves performance without additional learnable parameters; the extent to which certain types of examples rely on the dependency structure can be measured directly by the learned margin, which yields better interpretability.

References

Showing 1-10 of 23 references
Structure Regularized Bidirectional Recurrent Convolutional Neural Network for Relation Classification
  • Ji Wen
  • Computer Science
  • ArXiv
  • 2017
TLDR: Proposes a structure regularization model that learns relation representations along the SDP extracted from the forest formed by the structure-regularized dependency tree, which reduces the complexity of the whole model and improves the F1 score by 10.3.
Bidirectional Recurrent Convolutional Neural Network for Relation Classification
TLDR: Presents a novel model, BRCNN, that classifies the relation between two entities in a sentence by combining convolutional neural networks with two-channel recurrent neural networks with long short-term memory (LSTM) units in a bidirectional architecture.
A Dependency-Based Neural Network for Relation Classification
TLDR: Proposes a new structure, termed the augmented dependency path (ADP), composed of the shortest dependency path between two entities and the subtrees attached to it, and develops dependency-based neural networks (DepNN) to exploit the semantic representation behind the ADP.
Tag-Enhanced Tree-Structured Neural Networks for Implicit Discourse Relation Classification
TLDR: Employs the Tree-LSTM and Tree-GRU models, which are based on the tree structure, to encode the arguments of a relation, and further leverages constituent tags to control the semantic composition process in these tree-structured neural networks.
A Discourse-Level Named Entity Recognition and Relation Extraction Dataset for Chinese Literature Text
TLDR: Proposes two tagging methods to solve the problem of data inconsistency, a heuristic tagging method and a machine-auxiliary tagging method, and builds a discourse-level dataset from hundreds of Chinese literature articles to advance this task.
Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths
TLDR: Presents SDP-LSTM, a novel neural network that classifies the relation between two entities in a sentence by leveraging the shortest dependency path (SDP) between them; multichannel recurrent neural networks with long short-term memory (LSTM) units pick up heterogeneous information along the SDP.
Bidirectional Long Short-Term Memory Networks for Relation Classification
TLDR: Proposes bidirectional long short-term memory networks (BLSTM) to model the sentence with complete, sequential information about all words, using features derived from lexical resources such as WordNet and from NLP systems such as dependency parsers and named entity recognizers.
Classifying Relations by Ranking with Convolutional Neural Networks
TLDR: Proposes a new pairwise ranking loss function that makes it easy to reduce the impact of artificial classes; the approach is more effective than a CNN followed by a softmax classifier, and using only word embeddings as input features is enough to achieve state-of-the-art results.
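The pairwise ranking loss referred to in this entry can be sketched as below. It pushes the score of the gold class above a positive margin and the score of the highest-scoring competing class below a negative margin; the default values m_pos = 2.5, m_neg = 0.5, gamma = 2.0 are the commonly cited settings from that line of work, and the function name is my own, not from the source.

```python
import math

def pairwise_ranking_loss(score_gold, score_neg, m_pos=2.5, m_neg=0.5, gamma=2.0):
    """Pairwise ranking loss: penalizes a gold-class score below m_pos
    and a competing-class score above -m_neg, with sharpness gamma.
    Because an artificial "Other" class can be given no gold term, its
    impact on training is easy to reduce, unlike with softmax + CE."""
    gold_term = math.log1p(math.exp(gamma * (m_pos - score_gold)))
    neg_term = math.log1p(math.exp(gamma * (m_neg + score_neg)))
    return gold_term + neg_term

# A well-separated prediction incurs far less loss than an ambiguous one:
assert pairwise_ranking_loss(5.0, -3.0) < pairwise_ranking_loss(0.0, 0.0)
```

Note the loss never reaches exactly zero: both log1p terms stay positive, shrinking only asymptotically as the score gap widens.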
Relation Classification via Multi-Level Attention CNNs
TLDR: Proposes a novel convolutional neural network architecture that enables end-to-end learning from task-specific labeled data, forgoing the need for external knowledge such as explicit dependency structures, and outperforms previous state-of-the-art methods.
Relation Classification via Convolutional Deep Neural Network
TLDR: Exploits a convolutional deep neural network to extract lexical and sentence-level features from the output of pre-existing natural language processing systems, significantly outperforming the state-of-the-art methods.