Study on the Chinese Word Semantic Relation Classification with Word Embedding

@inproceedings{Shijia2017StudyOT,
  title={Study on the Chinese Word Semantic Relation Classification with Word Embedding},
  author={E. Shijia and Shengbin Jia and Yang Xiang},
  booktitle={NLPCC},
  year={2017}
}
This paper describes our solution to the NLPCC 2017 shared task on Chinese word semantic relation classification. Our proposed method won second place in this task. On the test set, our method achieves a macro F1 of 76.8% across the four types of semantic relations, i.e., synonym, antonym, hyponym, and meronym. In our experiments, we try basic word embeddings, linear regression, and convolutional neural networks (CNNs) with pre-trained word embeddings. The experimental…
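The approach the abstract outlines (representing a word pair with pre-trained embeddings and feeding it to a linear classifier) can be sketched roughly as follows. This is a toy illustration, not the authors' actual model: the embeddings, classifier weights, and vocabulary are placeholder values, and the weights would normally be learned from the shared-task training data.

```python
import numpy as np

RELATIONS = ["synonym", "antonym", "hyponym", "meronym"]

# Toy pre-trained embeddings (in the paper these come from a large corpus).
rng = np.random.default_rng(0)
EMB = {w: rng.standard_normal(8) for w in ["大", "小", "巨大", "轮子", "汽车"]}

def pair_features(w1, w2):
    """Represent a word pair as the concatenation of its two embeddings."""
    return np.concatenate([EMB[w1], EMB[w2]])

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# A linear (softmax) classifier over the pair features; random here,
# but trained on labeled word pairs in a real system.
W = rng.standard_normal((len(RELATIONS), 16))
b = np.zeros(len(RELATIONS))

def classify(w1, w2):
    probs = softmax(W @ pair_features(w1, w2) + b)
    return RELATIONS[int(np.argmax(probs))]

print(classify("大", "小"))  # with trained weights, ideally "antonym"
```

The paper's CNN variant would replace the single linear layer with convolution and pooling over the embedding matrix, but the input representation (pre-trained vectors for the word pair) stays the same.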
4 Citations
A Deep Learning Baseline for the Classification of Chinese Word Semantic Relations
This paper designs various combinations of deep learning models and features and proposes a joint model based on a convolutional neural network and a highway network that reaches an F1 value of 0.58 and outperforms all other deep learning models currently available.
Overview of the NLPCC 2017 Shared Task: Chinese Word Semantic Relation Classification
The data construction and experimental setting are described, an analysis of the evaluation results is given, and some of the participating systems are briefly introduced.
A Classification Method for Chinese Word Semantic Relations Based on TF-IDF and CNN
A classification method for Chinese word semantic relations based on TF-IDF and CNN is presented, and four new literal features are proposed, including whether one word is part of another and the ratio of their common substring.
Ontological Relation Classification Using WordNet, Word Embeddings and Deep Neural Networks
A novel way to exploit WordNet, combining pre-trained word embeddings and deep neural networks for the task of ontological relation classification, is introduced, which should help the ontology learning research community develop tools for ontological relation extraction.

References

Showing 1–10 of 21 references
Overview of the NLPCC 2017 Shared Task: Chinese Word Semantic Relation Classification
The data construction and experimental setting are described, an analysis of the evaluation results is given, and some of the participating systems are briefly introduced.
Classifying Relations by Ranking with Convolutional Neural Networks
This work proposes a new pairwise ranking loss function that makes it easy to reduce the impact of artificial classes, and shows that it is more effective than a CNN followed by a softmax classifier and that using only word embeddings as input features is enough to achieve state-of-the-art results.
Relation Classification via Convolutional Deep Neural Network
This paper exploits a convolutional deep neural network (DNN) to extract lexical and sentence-level features from the output of pre-existing natural language processing systems and significantly outperforms the state-of-the-art methods.
Convolutional Neural Networks for Sentence Classification
The CNN models discussed herein improve upon the state of the art on 4 out of 7 tasks, including sentiment analysis and question classification, and a modification to the architecture is proposed to allow for the use of both task-specific and static vectors.
Analogy-based detection of morphological and semantic relations with word embeddings: what works and what doesn’t.
This study applies the widely used vector offset method to four types of linguistic relations: inflectional and derivational morphology, and lexicographic and encyclopedic semantics, and systematically examines how accuracy for different categories is affected by window size and the dimensionality of SVD-based word embeddings.
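The vector offset method that study evaluates can be illustrated with a small sketch. The vectors below are hand-picked toy values, not real embeddings, chosen so the analogy resolves cleanly:

```python
import numpy as np

# Toy embeddings; real experiments use vectors trained on large corpora.
emb = {
    "king":  np.array([0.9, 0.1, 0.7]),
    "man":   np.array([0.8, 0.0, 0.1]),
    "woman": np.array([0.1, 0.9, 0.1]),
    "queen": np.array([0.2, 1.0, 0.7]),
}

def analogy(a, b, c, vocab=emb):
    """Solve a : b :: c : ? via the offset b - a + c, excluding the inputs."""
    target = vocab[b] - vocab[a] + vocab[c]
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}

    def cos(u, v):
        # Cosine similarity to the offset vector picks the answer.
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    return max(candidates, key=lambda w: cos(candidates[w], target))

print(analogy("man", "king", "woman"))  # → "queen" with these toy vectors
```

The cited study's point is precisely that this method's accuracy varies sharply across relation types and embedding hyperparameters, so the clean behavior above should not be expected in general.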
A Convolutional Neural Network for Modelling Sentences
A convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) is described and adopted for the semantic modelling of sentences; it induces a feature graph over the sentence that is capable of explicitly capturing short- and long-range relations.
GloVe: Global Vectors for Word Representation
A new global log-bilinear regression model is proposed that combines the advantages of the two major model families in the literature, global matrix factorization and local context window methods, and produces a vector space with meaningful substructure.
Take and Took, Gaggle and Goose, Book and Read: Evaluating the Utility of Vector Differences for Lexical Relation Learning
It is found that word embeddings capture a surprising amount of information, and that, under suitable supervised training, vector subtraction generalises well to a broad range of relations, including over unseen lexical items.
Efficient Estimation of Word Representations in Vector Space
Two novel model architectures for computing continuous vector representations of words from very large data sets are proposed, and it is shown that these vectors provide state-of-the-art performance on the authors' test set for measuring syntactic and semantic word similarities.
SemEval-2014 Task 3: Cross-Level Semantic Similarity
This paper introduces a new SemEval task on Cross-Level Semantic Similarity (CLSS), which measures the degree to which the meaning of a larger linguistic item, such as a paragraph, is captured by a…