Publications
Learning Entity and Relation Embeddings for Knowledge Graph Completion
TLDR: We propose TransR to build entity and relation embeddings in separate entity and relation spaces.
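As a rough illustration of the idea (not the paper's code), the sketch below scores a triple TransR-style: a relation-specific matrix projects entities from the entity space into the relation's own space, where the usual h + r ≈ t translation is checked. All names and dimensions are illustrative.

```python
import numpy as np

def transr_score(h, t, r, M_r):
    """TransR-style plausibility score: project entities h, t from
    entity space into relation r's space via M_r, then measure how
    well h + r ~ t holds there (lower is more plausible)."""
    h_r = M_r @ h          # project head entity into relation space
    t_r = M_r @ t          # project tail entity into relation space
    return np.linalg.norm(h_r + r - t_r)

# toy dimensions: 4-d entity space, 3-d relation space
rng = np.random.default_rng(0)
h, t = rng.normal(size=4), rng.normal(size=4)
r = rng.normal(size=3)
M_r = rng.normal(size=(3, 4))
print(transr_score(h, t, r, M_r))
```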
Network Representation Learning with Rich Text Information
TLDR: We prove that DeepWalk, a state-of-the-art network representation method, is in fact equivalent to matrix factorization (MF), and building on this result we propose text-associated DeepWalk (TADW).
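Under this matrix-factorization view, DeepWalk implicitly factorizes a node-context matrix M, and TADW additionally constrains one factor with a text-feature matrix T so that M ≈ WᵀHT. The following is a minimal gradient-descent sketch of that objective; the paper's actual optimization (and its regularization) differs, and all names here are illustrative.

```python
import numpy as np

def tadw_factorize(M, T, k=2, steps=500, lr=0.01):
    """Minimal gradient-descent sketch of the objective
    ||M - W^T H T||_F^2: M is the node-context matrix DeepWalk
    implicitly factorizes, T holds per-node text features, and
    stacking W with H @ T gives text-aware node representations."""
    n = M.shape[0]
    f = T.shape[0]
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(k, n))
    H = rng.normal(scale=0.1, size=(k, f))
    for _ in range(steps):
        E = W.T @ (H @ T) - M          # reconstruction error
        W -= lr * 2 * (H @ T) @ E.T    # gradient of loss w.r.t. W
        H -= lr * 2 * W @ E @ T.T      # gradient of loss w.r.t. H
    return np.vstack([W, H @ T])       # per-node embedding matrix

# toy node-context matrix and trivial one-hot "text" features
M = np.array([[0., 1, 1], [1, 0, 0], [1, 0, 0]])
T = np.eye(3)
print(tadw_factorize(M, T).shape)      # (2k, n) node embeddings
```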
Neural Relation Extraction with Selective Attention over Instances
Distant supervised relation extraction has been widely used to find novel relational facts from text. However, distant supervision inevitably suffers from the wrong-labelling problem, and the resulting noisy data hurt extraction performance.
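A minimal sketch of the selective-attention step, assuming plain dot-product scoring between sentence encodings and a relation query vector (the paper scores instances with a learned bilinear form): sentences likely to be wrongly labelled receive small weights in the bag representation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def bag_representation(sentence_reps, relation_query):
    """Selective attention over the instances (sentences) in a bag:
    score each sentence against the relation query, softmax the
    scores, and return the weighted sum, so noisy instances are
    de-emphasized rather than discarded or averaged uniformly."""
    scores = sentence_reps @ relation_query   # one score per sentence
    alpha = softmax(scores)                   # attention weights
    return alpha @ sentence_reps              # weighted bag vector

rng = np.random.default_rng(1)
S = rng.normal(size=(5, 8))   # 5 sentence encodings, 8-d each
q = rng.normal(size=8)        # relation query vector
print(bag_representation(S, q).shape)  # (8,)
```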
Modeling Relation Paths for Representation Learning of Knowledge Bases
TLDR: Representation learning of knowledge bases aims to embed both entities and relations into a low-dimensional space; we extend this line of work by modeling multi-step relation paths between entities.
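As a hedged illustration of path modeling in the translation-based setting, the sketch below composes a path's relation vectors by addition (one of several possible composition operations) and scores h + p ≈ t; path-reliability weighting is omitted.

```python
import numpy as np

def path_score(h, t, path_relations):
    """Sketch of path-based composition in the TransE style: a
    multi-step relation path is represented by composing (here:
    summing) its relation vectors, then scored by how well the
    translation h + p ~ t holds (lower is more plausible)."""
    p = np.sum(path_relations, axis=0)  # additive composition
    return np.linalg.norm(h + p - t)

rng = np.random.default_rng(2)
h, t = rng.normal(size=4), rng.normal(size=4)
r1, r2 = rng.normal(size=4), rng.normal(size=4)
print(path_score(h, t, [r1, r2]))     # score of the 2-step path
```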
A Unified Model for Word Sense Representation and Disambiguation
TLDR: In this paper, we present a unified model for joint word sense representation and disambiguation, which assigns a distinct representation to each word sense.
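The sketch below shows only the similarity-based disambiguation step one might use with per-sense vectors: average the context vectors and choose the closest sense. It is an illustrative reduction, not the paper's full joint model.

```python
import numpy as np

def disambiguate(context_vecs, sense_vecs):
    """Context-based sense selection: average the context word
    vectors and pick the sense embedding with the highest cosine
    similarity to that context representation."""
    c = context_vecs.mean(axis=0)
    sims = sense_vecs @ c / (
        np.linalg.norm(sense_vecs, axis=1) * np.linalg.norm(c) + 1e-9)
    return int(np.argmax(sims))   # index of the chosen sense

rng = np.random.default_rng(3)
context = rng.normal(size=(4, 6))   # 4 context words, 6-d vectors
senses = rng.normal(size=(3, 6))    # 3 candidate senses of the target
print(disambiguate(context, senses))
```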
Representation Learning of Knowledge Graphs with Entity Descriptions
TLDR: We propose Description-Embodied Knowledge Representation Learning (DKRL), a novel representation learning method for knowledge graphs that takes advantage of entity descriptions.
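A sketch of the description-embodied idea, assuming a CBOW-style description encoder (the paper also considers a convolutional encoder) and a TransE-style energy that mixes structure-based and description-based entity representations; the exact combination below is illustrative.

```python
import numpy as np

def cbow_encode(word_vecs):
    """Continuous bag-of-words description encoder: simply the mean
    of the description's word vectors."""
    return word_vecs.mean(axis=0)

def dkrl_energy(h_s, t_s, r, h_desc_words, t_desc_words):
    """Sketch of a DKRL-style energy that checks h + r ~ t over all
    pairings of structure-based (h_s, t_s) and description-based
    (h_d, t_d) entity representations."""
    h_d = cbow_encode(h_desc_words)
    t_d = cbow_encode(t_desc_words)
    parts = [(h_s, t_s), (h_s, t_d), (h_d, t_s), (h_d, t_d)]
    return sum(np.linalg.norm(h + r - t) for h, t in parts)

rng = np.random.default_rng(4)
h_s, t_s, r = (rng.normal(size=5) for _ in range(3))
h_words = rng.normal(size=(6, 5))   # head entity's description words
t_words = rng.normal(size=(4, 5))   # tail entity's description words
print(dkrl_energy(h_s, t_s, r, h_words, t_words))
```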
Graph Neural Networks: A Review of Methods and Applications
TLDR: Graph neural networks are connectionist models that capture the dependencies within graphs via message passing between nodes.
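A generic one-layer message-passing sketch, with mean aggregation and a tanh update chosen purely for illustration; the concrete GNN variants surveyed in the review differ in how messages are aggregated and how node states are updated.

```python
import numpy as np

def message_passing_layer(A, H, W):
    """One generic message-passing step: each node aggregates (here:
    mean-pools) its neighbours' features, then applies a learned
    linear map W and a nonlinearity to update its own state."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    msgs = (A @ H) / deg          # mean over neighbour features
    return np.tanh(msgs @ W)      # transformed, nonlinear update

A = np.array([[0., 1, 1], [1, 0, 0], [1, 0, 0]])  # adjacency matrix
H = np.eye(3)                                      # initial features
rng = np.random.default_rng(5)
W = rng.normal(size=(3, 4))                        # layer weights
print(message_passing_layer(A, H, W).shape)        # (3, 4)
```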
Minimum Risk Training for Neural Machine Translation
TLDR: We propose minimum risk training for end-to-end neural machine translation.
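A minimal sketch of a minimum-risk objective computed on a sampled candidate set: model probabilities are sharpened and renormalized over the samples, and the loss is the expected risk (e.g. a BLEU-based loss). The `alpha` smoothing value and the toy numbers are illustrative.

```python
import numpy as np

def mrt_loss(log_probs, risks, alpha=0.005):
    """Expected-risk loss over sampled translations: form a
    distribution proportional to P(y|x)^alpha over the samples,
    then take the expectation of each sample's risk (e.g. 1 - BLEU
    against the reference). Lower is better."""
    q = np.exp(alpha * np.asarray(log_probs))
    q /= q.sum()                     # distribution over the samples
    return float(q @ np.asarray(risks))

# toy example: 3 sampled translations with model log-probs and risks
print(mrt_loss(log_probs=[-2.0, -5.0, -9.0],
               risks=[0.2, 0.5, 0.9]))
```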
Joint Learning of Character and Word Embeddings
TLDR: Most word embedding methods learn from words' external contexts while ignoring the internal structures of words; we propose a character-enhanced word embedding model (CWE) that also exploits the characters a word contains.
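A sketch of the character-enhanced composition, assuming the simple additive variant in which a word's representation averages its word-level vector with the mean of its character vectors; names and dimensions are illustrative.

```python
import numpy as np

def cwe_vector(word_vec, char_vecs):
    """Character-enhanced word embedding: average the word-level
    vector with the mean of the word's character vectors, so the
    word's internal structure contributes to its representation."""
    return 0.5 * (word_vec + char_vecs.mean(axis=0))

rng = np.random.default_rng(6)
w = rng.normal(size=5)              # word-level embedding
chars = rng.normal(size=(2, 5))     # embeddings of the 2 characters
print(cwe_vector(w, chars))
```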
Topical Word Embeddings
TLDR: We employ latent topic models to assign a topic to each word in the text corpus and learn topical word embeddings (TWE) based on both words and their topics.
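A sketch of a TWE-style contextual representation, assuming each token has already been assigned a topic by a latent topic model such as LDA: the token's vector combines the word embedding and the topic embedding (concatenation, as in one published variant).

```python
import numpy as np

def topical_word_vector(word_vec, topic_vec):
    """Topical word representation: pair a word token with its
    assigned topic and concatenate the two embeddings, so the same
    word gets different vectors under different topics."""
    return np.concatenate([word_vec, topic_vec])

rng = np.random.default_rng(7)
w = rng.normal(size=5)      # embedding of a word, e.g. "bank"
z = rng.normal(size=5)      # embedding of its assigned topic
print(topical_word_vector(w, z).shape)  # (10,)
```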