Relation Extraction: Perspective from Convolutional Neural Networks

@inproceedings{Nguyen2015RelationEP,
  title={Relation Extraction: Perspective from Convolutional Neural Networks},
  author={Thien Huu Nguyen and Ralph Grishman},
  booktitle={VS@HLT-NAACL},
  year={2015}
}
Up to now, relation extraction systems have made extensive use of features generated by linguistic analysis modules. […] Our model takes advantage of multiple window sizes for filters and of pre-trained word embeddings as an initializer on a non-static architecture to improve performance. We emphasize the relation extraction problem with an unbalanced corpus. The experimental results show that our system significantly outperforms not only the best baseline systems for relation extraction but also…
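The abstract describes the core architecture: a convolutional sentence encoder with several filter window sizes, initialized from pre-trained word embeddings that remain trainable (a non-static setup). The following is only a minimal PyTorch sketch of that idea, not the authors' implementation; the window sizes, filter count, and number of relation classes are illustrative placeholders.

```python
# Minimal sketch of a multi-window-size CNN for relation classification.
# Hyperparameters are placeholders, not values taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiWindowCNN(nn.Module):
    def __init__(self, pretrained_embeddings, num_classes,
                 window_sizes=(2, 3, 4, 5), num_filters=150):
        super().__init__()
        # "Non-static": start from pre-trained word vectors (a (vocab, dim)
        # FloatTensor) and keep fine-tuning them during training.
        self.embedding = nn.Embedding.from_pretrained(
            pretrained_embeddings, freeze=False)
        emb_dim = pretrained_embeddings.size(1)
        # One 1-D convolution per window size, applied over the token axis.
        self.convs = nn.ModuleList(
            [nn.Conv1d(emb_dim, num_filters, kernel_size=w) for w in window_sizes])
        self.classifier = nn.Linear(num_filters * len(window_sizes), num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len), padded so seq_len >= max window size
        x = self.embedding(token_ids).transpose(1, 2)  # (batch, emb_dim, seq_len)
        # Convolve with each window size, then max-pool over time.
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = torch.cat(pooled, dim=1)            # (batch, filters * windows)
        return self.classifier(features)               # unnormalized relation scores
```

In training, the per-sentence scores would be fed to a cross-entropy loss over the relation labels; the "unbalanced corpus" mentioned in the abstract typically stems from the dominant no-relation class that an extraction (as opposed to pure classification) setting has to handle.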

Citations of this paper

Relation Extraction via Position-Enhanced Convolutional Neural Network
  • Weiwei Shi, Sheng Gao
  • Computer Science
    2017 International Conference on Intelligent Environments (IE)
  • 2017
TLDR
Experimental results on widely used datasets show considerable improvements on relation extraction over the baselines, indicating that the proposed position-enhanced embedding model can make full use of position information (a minimal sketch of the position-feature idea appears after this citation list).
Combining Neural Networks and Log-linear Models to Improve Relation Extraction
TLDR
This paper proposes to combine the traditional feature-based method with convolutional and recurrent neural networks so as to benefit from their respective advantages, resulting in state-of-the-art performance on the ACE 2005 and SemEval datasets.
Classifying Relation via Piecewise Convolutional Neural Networks with Transfer Learning
TLDR
A convolutional neural network is exploited to extract lexical and syntactic features, and transfer learning is applied to carry over the parameters of a convolutional layer pre-trained on a general-domain corpus, enabling the relation classification system to adapt to resource-poor domains with different relation types.
Distant Supervision for Relation Extraction via Retrieval-based Neural Networks
TLDR
A retrieval-based method built on piecewise convolutional neural networks (called RPCNN) for distantly supervised relation extraction, which calculates relation retrieval scores between input features and relation features before classification.
Relation Extraction Using Multi-Encoder LSTM Network on a Distant Supervised Dataset
TLDR
This work proposes a simple yet effective technique to automatically compute confidence levels for the alignments of automatically labeled content, using co-occurrence statistics of relations and dependency patterns of the aligned sentences, and proposes a novel multi-encoder bidirectional Long Short-Term Memory model to identify relations in a given sentence.
Chinese Relation Classification via Convolutional Neural Networks
TLDR
A novel convolutional neural network approach along shortest dependency paths (SDP) for Chinese relation classification is presented, first proposing a baseline end-to-end model that only takes sentence-level features, and then improving its performance by joint use of pre-extracted linguistic features.
Feature-Level Attention Based Sentence Encoding for Neural Relation Extraction
TLDR
This paper proposes a feature-level attention model to encode sentences, which tries to reveal the different effects of individual features on relation prediction, and demonstrates that scaled dot-product attention performs better than the other attention variants considered.
A convolutional neural network method for relation classification
TLDR
Experimental results show that randomly initializing the position vectors is unreasonable, and that the method combining the position vectors with the original position information of the words performs better.
Modeling Multi-Granularity Hierarchical Features for Relation Extraction
TLDR
Experimental results show that the proposed method, which extracts multi-granularity features based solely on the original input sentences, outperforms existing state-of-the-art models, even those that use external knowledge.
Effective Deep Memory Networks for Distant Supervised Relation Extraction
TLDR
A novel neural approach for distant supervised RE with special focus on attention mechanisms is introduced, which includes two major attention-based memory components, which are capable of explicitly capturing the importance of each context word for modeling the representation of the entity pair.
...
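Several of the citing papers above (the position-enhanced CNN and the study of position-vector initialization, among others) build on the position-feature idea that is standard in CNN-based relation extraction: each token is tagged with its relative distance to the two candidate entity mentions, and each distance indexes a small trainable embedding that is concatenated with the word embedding. The sketch below only illustrates that feature construction; the class name, dimensions, and clipping distance are assumptions, not values taken from any of the cited papers.

```python
# Illustrative sketch of relative-position features for relation extraction:
# each token gets its (clipped) distance to the head and tail entity mentions,
# and each distance indexes a small trainable embedding concatenated with the
# word embedding. Names and sizes here are assumptions for clarity.
import torch
import torch.nn as nn


def relative_positions(seq_len, entity_index, max_dist=50):
    """Clipped, shifted distances of every token to one entity mention."""
    dist = torch.arange(seq_len) - entity_index
    dist = dist.clamp(-max_dist, max_dist)
    return dist + max_dist   # shift into [0, 2 * max_dist] for embedding lookup


class PositionEnhancedEmbedding(nn.Module):
    def __init__(self, vocab_size, word_dim=300, pos_dim=50, max_dist=50):
        super().__init__()
        self.max_dist = max_dist
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # One position-embedding table per entity mention (head and tail).
        self.pos_emb_head = nn.Embedding(2 * max_dist + 1, pos_dim)
        self.pos_emb_tail = nn.Embedding(2 * max_dist + 1, pos_dim)

    def forward(self, token_ids, head_index, tail_index):
        # token_ids: (seq_len,); head_index / tail_index: mention positions
        seq_len = token_ids.size(0)
        p_head = relative_positions(seq_len, head_index, self.max_dist)
        p_tail = relative_positions(seq_len, tail_index, self.max_dist)
        # Concatenate word and position embeddings:
        # result has shape (seq_len, word_dim + 2 * pos_dim)
        return torch.cat([self.word_emb(token_ids),
                          self.pos_emb_head(p_head),
                          self.pos_emb_tail(p_tail)], dim=-1)
```

The resulting (seq_len, word_dim + 2·pos_dim) matrix is what a convolutional encoder such as the one sketched earlier would consume in place of plain word embeddings.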

References

SHOWING 1-10 OF 51 REFERENCES
Relation Classification via Convolutional Deep Neural Network
TLDR
This paper exploits a convolutional deep neural network (DNN) to extract lexical and sentence-level features from the output of pre-existing natural language processing systems and significantly outperforms the state-of-the-art methods.
Exploring Various Knowledge in Relation Extraction
TLDR
This paper investigates the incorporation of diverse lexical, syntactic and semantic knowledge in feature-based relation extraction using SVMs, and illustrates that base phrase chunking information is very effective for relation extraction, contributing most of the performance improvement on the syntactic side, while additional information from full parsing gives limited further enhancement.
Convolution Neural Network for Relation Extraction
TLDR
This paper proposes a novel convolutional network incorporating lexical features, applied to relation extraction, and compares the convolutional neural network (CNN) with state-of-the-art tree kernel approaches, including the Typed Dependency Path Kernel, the Shortest Dependency Path Kernel, and the Context-Sensitive Tree Kernel.
A Convolutional Neural Network for Modelling Sentences
TLDR
A convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) is described that is adopted for the semantic modelling of sentences and induces a feature graph over the sentence that is capable of explicitly capturing short and long-range relations.
Distant supervision for relation extraction without labeled data
TLDR
This work investigates an alternative paradigm that does not require labeled corpora, avoiding the domain dependence of ACE-style algorithms and allowing the use of corpora of any size (a toy sketch of the distant-supervision labeling step appears after this reference list).
Employing Word Representations and Regularization for Domain Adaptation of Relation Extraction
TLDR
This paper systematically explores various ways to apply word embeddings and word clusters to adapting feature-based relation extraction systems, and shows that the best adaptation improvement comes from combining word cluster and word embedding information.
A Systematic Exploration of the Feature Space for Relation Extraction
TLDR
This paper systematically explores a large space of features for relation extraction, evaluates the effectiveness of different feature subspaces, and presents a general definition of feature spaces based on a graphic representation of relation instances.
Modeling Relations and Their Mentions without Labeled Text
TLDR
A novel approach to distant supervision that uses a factor graph and constraint-driven semi-supervision to alleviate the precision loss caused by noisy patterns, training the model without any knowledge about which sentences express the relations in the authors' training KB.
Convolution Kernels on Constituent, Dependency and Sequential Structures for Relation Extraction
TLDR
This paper explores the use of innovative kernels based on syntactic and semantic structures for a target relation extraction task and illustrates that the combination of the above kernels achieves high effectiveness and significantly improves the current state-of-the-art.
Embedding Semantic Similarity in Tree Kernels for Domain Adaptation of Relation Extraction
TLDR
This paper proposes to combine term generalization approaches such as word clustering and latent semantic analysis (LSA) and structured kernels to improve the adaptability of relation extractors to new text genres/domains.
...
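Two of the references above (the distant-supervision paper and the factor-graph model of relations and their mentions) rely on the same data-construction step: any sentence containing an entity pair that the knowledge base lists for some relation is automatically labeled with that relation, which is also the source of the noisy labels those papers try to cope with. The toy sketch below illustrates only that labeling step; the miniature in-memory "KB", the sentences, and the relation names are invented for the example.

```python
# Toy illustration of distant-supervision labeling: any sentence mentioning an
# entity pair found in the knowledge base is labeled with that pair's relation.
# The KB, sentences, and relation names are invented; real systems use large
# KBs (e.g. Freebase) and imperfect entity matching, which is exactly where the
# noisy labels come from.
from typing import Dict, List, Tuple

# Hypothetical knowledge base: (head, tail) -> relation
KB: Dict[Tuple[str, str], str] = {
    ("Barack Obama", "Hawaii"): "born_in",
    ("Google", "Larry Page"): "founded_by",
}

sentences: List[str] = [
    "Barack Obama was born in Hawaii .",
    "Barack Obama visited Hawaii last year .",   # also labeled born_in -> noise
    "Larry Page spoke at a Google event .",
]


def distant_label(sentence: str) -> List[Tuple[str, str, str]]:
    """Return (head, tail, relation) labels whose entity pair appears in the sentence."""
    labels = []
    for (head, tail), relation in KB.items():
        if head in sentence and tail in sentence:
            labels.append((head, tail, relation))
    return labels


for s in sentences:
    print(s, "->", distant_label(s))
```

The second sentence shows the failure mode: it mentions the right entity pair but does not express the born_in relation, yet it still receives that label.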