Learning Entity and Relation Embeddings for Knowledge Graph Completion
TLDR
TransR is proposed to build entity and relation embeddings in separate entity and relation spaces, learning translations between projected entities; the models are evaluated on three tasks: link prediction, triple classification, and relational fact extraction.
Network Representation Learning with Rich Text Information
TLDR
By proving that DeepWalk, a state-of-the-art network representation method, is actually equivalent to matrix factorization (MF), this work proposes text-associated DeepWalk (TADW), which incorporates text features of vertices into network representation learning under the framework of matrix factorization.
Neural Relation Extraction with Selective Attention over Instances
TLDR
A sentence-level attention-based model for relation extraction is proposed that employs convolutional neural networks to embed the semantics of sentences and dynamically reduces the weights of noisy instances.
Modeling Relation Paths for Representation Learning of Knowledge Bases
TLDR
This model considers relation paths as translations between entities for representation learning and addresses two key challenges: (1) since not all relation paths are reliable, it designs a path-constraint resource allocation algorithm to measure the reliability of relation paths, and (2) it represents relation paths via semantic composition of relation embeddings.
Graph Neural Networks: A Review of Methods and Applications
TLDR
A detailed review of existing graph neural network models is provided, their applications are systematically categorized, and four open problems for future research are proposed.
ERNIE: Enhanced Language Representation with Informative Entities
TLDR
This paper utilizes both large-scale textual corpora and KGs to train an enhanced language representation model (ERNIE), which can take full advantage of lexical, syntactic, and knowledge information simultaneously, and is comparable with the state-of-the-art model BERT on other common NLP tasks.
Minimum Risk Training for Neural Machine Translation
TLDR
Experiments show that the proposed minimum risk training approach achieves significant improvements over maximum likelihood estimation on a state-of-the-art neural machine translation system across various language pairs.
Representation Learning of Knowledge Graphs with Entity Descriptions
TLDR
Experimental results on real-world datasets show that the proposed novel representation learning method for knowledge graphs outperforms other baselines on the two tasks, especially under the zero-shot setting, which indicates that the method is capable of building representations for novel entities according to their descriptions.
A Unified Model for Word Sense Representation and Disambiguation
TLDR
A unified model for joint word sense representation and disambiguation, which assigns distinct representations to each word sense, improves the performance of contextual word similarity compared to existing WSR methods, outperforms state-of-the-art supervised methods on domain-specific WSD, and achieves competitive performance on coarse-grained all-words WSD.
FewRel: A Large-Scale Supervised Few-Shot Relation Classification Dataset with State-of-the-Art Evaluation
TLDR
Empirical results show that even the most competitive few-shot learning models struggle on this task, especially as compared with humans, and indicate that few-shot relation classification remains an open problem and still requires further research.