Corpus ID: 12873739

Relation Classification via Convolutional Deep Neural Network

@inproceedings{Zeng2014RelationCV,
  title={Relation Classification via Convolutional Deep Neural Network},
  author={Daojian Zeng and Kang Liu and Siwei Lai and Guangyou Zhou and Jun Zhao},
  booktitle={COLING},
  year={2014}
}
The state-of-the-art methods used for relation classification are primarily based on statistical machine learning, and their performance strongly depends on the quality of the extracted features. […] Our method takes all of the word tokens as input without complicated pre-processing. First, the word tokens are transformed to vectors by looking up word embeddings. Then, lexical level features are extracted according to the given nouns.
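The pipeline described in the abstract (embedding lookup, lexical-level features from the marked nouns, convolution with max-pooling for sentence-level features) can be sketched roughly as follows. This is a minimal NumPy sketch: the vocabulary, dimensions, random filter weights, and window size are all illustrative stand-ins for the paper's learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and randomly initialized word embeddings
# (the paper instead looks these up from a pre-trained embedding table).
vocab = {"people": 0, "have": 1, "been": 2, "moving": 3, "back": 4,
         "into": 5, "downtown": 6}
emb_dim = 4
embeddings = rng.standard_normal((len(vocab), emb_dim))

tokens = ["people", "have", "been", "moving", "back", "into", "downtown"]
e1, e2 = 0, 6  # positions of the two marked nouns ("people", "downtown")

# 1) Transform word tokens to vectors by embedding lookup.
X = embeddings[[vocab[t] for t in tokens]]        # (seq_len, emb_dim)

# 2) Lexical-level features: the embeddings of the two marked nouns.
lexical = np.concatenate([X[e1], X[e2]])          # (2 * emb_dim,)

# 3) Sentence-level features: a 1-D convolution over 3-token windows
#    followed by max-pooling over time (filter count is hypothetical).
n_filters, window = 8, 3
W = rng.standard_normal((n_filters, window * emb_dim))
windows = np.stack([X[i:i + window].ravel()
                    for i in range(len(tokens) - window + 1)])
conv = np.tanh(windows @ W.T)                     # (n_windows, n_filters)
sentence = conv.max(axis=0)                       # (n_filters,)

# The classifier sees the concatenated lexical and sentence features.
features = np.concatenate([lexical, sentence])
print(features.shape)                             # (16,)
```

The final vector would feed a softmax layer over the relation classes; only the feature-construction half is sketched here.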

Citations

A Convolutional Deep Neural Network for Coreference Resolution via Modeling Hierarchical Features
This work explores a convolutional deep neural network (CDNN) to extract discourse-level features automatically and obtains a competitive average F1 score over MUC, B3, and CEAF, placing it above the mean score of other systems on the CoNLL-2012 Shared Task dataset.
Chinese Relation Classification via Convolutional Neural Networks
A novel convolutional neural network approach along shortest dependency paths (SDP) for Chinese relation classification is presented, first proposing a baseline end-to-end model that only takes sentence-level features, and then improving its performance by joint use of pre-extracted linguistic features.
Embedding Syntactic Tree Structures into CNN Architecture for Relation Classification
This paper proposes a new convolutional neural network (CNN) architecture that combines the syntactic tree structure with other lexical level features for relation classification, and demonstrates that the method outperforms previous state-of-the-art methods without using external linguistic resources such as WordNet.
A convolutional neural network method for relation classification
Experimental results show that random initialization of the position vector is unreasonable, and that the method using both the position vector and the original position information of words performs better.
Relation classification using revised convolutional neural networks
This work introduces hierarchical convolutional layers and dependency embeddings to CNN-based relation classification methods and shows that the revised method provides state-of-the-art performance, even without additional artificial features.
Improving the Relation Classification Using Convolutional Neural Network
This work designs a framework to automatically extract relations between entities using deep learning techniques, and shows improved classification accuracy in comparison with state-of-the-art methodologies while also covering additional relations.
Relation Extraction via Position-Enhanced Convolutional Neural Network
  • Weiwei Shi, Sheng Gao
  • 2017 International Conference on Intelligent Environments (IE), 2017
Experimental results on widely used datasets achieve considerable improvements on relation extraction compared with baselines, which shows that the proposed position-enhanced embedding model can make full use of position information.
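Position embeddings of the kind this line of work builds on can be sketched as follows: besides its word vector, each token gets an embedding of its clipped relative distance to each marked entity. The table size, clipping distance, and dimensions below are illustrative assumptions, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(1)

seq_len, emb_dim, pos_dim, max_dist = 7, 4, 2, 10
word_emb = rng.standard_normal((seq_len, emb_dim))    # stand-in word vectors
# One learnable row per clipped relative distance in [-max_dist, max_dist].
pos_table = rng.standard_normal((2 * max_dist + 1, pos_dim))

e1, e2 = 0, 6  # positions of the two marked entities

def rel(i, e):
    # Clip the relative distance and shift it into table-index range.
    return int(np.clip(i - e, -max_dist, max_dist)) + max_dist

pos1 = pos_table[[rel(i, e1) for i in range(seq_len)]]
pos2 = pos_table[[rel(i, e2) for i in range(seq_len)]]

# Each token's input is [word | dist-to-e1 | dist-to-e2].
X = np.concatenate([word_emb, pos1, pos2], axis=1)
print(X.shape)   # (7, 8): emb_dim + 2 * pos_dim columns per token
```

The convolution layer then operates on this augmented matrix, so filters can condition on how far each token is from the entity pair.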
BiTCNN: A Bi-Channel Tree Convolution Based Neural Network Model for Relation Classification
A bi-channel tree-convolution-based neural network model, BiTCNN, combines syntactic tree features with other lexical level features in a deeper manner for relation classification and achieves better results than other state-of-the-art methods.
Classifying Relations by Ranking with Convolutional Neural Networks
This work proposes a new pairwise ranking loss function that makes it easy to reduce the impact of artificial classes, and shows that it is more effective than a CNN followed by a softmax classifier and that using only word embeddings as input features is enough to achieve state-of-the-art results.
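A loss of this pairwise-ranking shape can be sketched as follows: it pushes the gold class's score above a positive margin and the best competing class's score below a negative margin. The margins, the scaling factor, and the score values here are illustrative hyperparameters, not necessarily those used in the paper.

```python
import numpy as np

def ranking_loss(scores, gold, gamma=2.0, m_pos=2.5, m_neg=0.5):
    """Pairwise ranking loss over class scores for one example.

    Penalizes the gold-class score for falling below m_pos and the
    best competing class score for rising above -m_neg; gamma scales
    both penalties. Values are illustrative.
    """
    s_pos = scores[gold]
    # Best-scoring incorrect class.
    s_neg = max(s for c, s in enumerate(scores) if c != gold)
    return (np.log1p(np.exp(gamma * (m_pos - s_pos)))
            + np.log1p(np.exp(gamma * (m_neg + s_neg))))

# A well-separated prediction incurs a smaller loss than a confused one.
good = ranking_loss([3.0, -1.0, -2.0], gold=0)
bad = ranking_loss([0.1, 0.0, -2.0], gold=0)
print(good < bad)  # True
```

Because only the gold class and the strongest competitor enter the loss, an artificial "Other" class can simply be given no score of its own, which is how this family of losses reduces its impact.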

References

Showing 1-10 of 24 references
Modeling Relations and Their Mentions without Labeled Text
A novel approach to distant supervision that alleviates the problem of noisy patterns hurting precision, using a factor graph and constraint-driven semi-supervision to train the model without any knowledge of which sentences express the relations in the training KB.
Natural Language Processing (Almost) from Scratch
We propose a unified neural network architecture and learning algorithm that can be applied to various natural language processing tasks including part-of-speech tagging, chunking, named entity recognition, and semantic role labeling.
Distant supervision for relation extraction without labeled data
This work investigates an alternative paradigm that does not require labeled corpora, avoiding the domain dependence of ACE-style algorithms, and allowing the use of corpora of any size.
Semantic Compositionality through Recursive Matrix-Vector Spaces
A recursive neural network model that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length and can learn the meaning of operators in propositional logic and natural language is introduced.
Knowledge-Based Weak Supervision for Information Extraction of Overlapping Relations
A novel approach for multi-instance learning with overlapping relations that combines a sentence-level extraction model with a simple, corpus-level component for aggregating the individual facts is presented.
Simple Customization of Recursive Neural Networks for Semantic Relation Classification
The proposed RNN model allows for an explicit weighting of important phrases for the target task and is competitive with state-of-the-art RNN-based models.
Exploring Various Knowledge in Relation Extraction
This paper investigates the incorporation of diverse lexical, syntactic, and semantic knowledge into feature-based relation extraction using SVMs, and illustrates that base phrase chunking information is very effective for relation extraction, contributing most of the performance improvement on the syntactic side, while additional information from full parsing gives limited further enhancement.
Reducing Wrong Labels in Distant Supervision for Relation Extraction
A novel generative model is presented that directly models the heuristic labeling process of distant supervision and predicts whether assigned labels are correct or wrong via its hidden variables.
Word Representations: A Simple and General Method for Semi-Supervised Learning
This work evaluates Brown clusters, Collobert and Weston (2008) embeddings, and HLBL (Mnih & Hinton, 2009) word embeddings on both NER and chunking, and finds that each of the three word representations improves the accuracy of these baselines.
Combining linguistic and statistical analysis to extract relations from web documents
It is shown that this approach profits significantly when deep linguistic structures are used instead of surface text patterns; the benefits are demonstrated by extensive experiments with the prototype system LEILA.