Publications
Hierarchical Graph Representation Learning with Differentiable Pooling
TLDR: We propose a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion. (See the sketch below.)
  • Citations: 480 · Influence: 108 · PDF
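The sketch below is a minimal NumPy illustration of the pooling step this TLDR describes, under simplifying assumptions: `gnn_layer` stands in for the paper's GNN (a single mean-aggregation layer), and `W_embed`/`W_assign` are random placeholder weights rather than trained parameters. It only shows how soft cluster assignments coarsen a graph (pooled features SᵀZ, pooled adjacency SᵀAS), not the full architecture or training objective.

```python
import numpy as np

def gnn_layer(A, X, W):
    """One toy message-passing layer: mean-aggregate neighbours (plus self), project, ReLU."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)  # node degrees for mean aggregation
    return np.maximum((A_hat @ X @ W) / deg, 0.0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def diffpool_step(A, X, W_embed, W_assign):
    """Differentiable pooling: coarsen an n-node graph into k soft clusters."""
    Z = gnn_layer(A, X, W_embed)            # node embeddings, shape (n, d)
    S = softmax(gnn_layer(A, X, W_assign))  # soft cluster assignments, shape (n, k)
    X_pooled = S.T @ Z                      # cluster features, shape (k, d)
    A_pooled = S.T @ A @ S                  # coarsened adjacency, shape (k, k)
    return A_pooled, X_pooled

rng = np.random.default_rng(0)
n, d, k = 6, 4, 2
A = np.triu(rng.integers(0, 2, size=(n, n)), 1).astype(float)
A = A + A.T                                 # symmetric adjacency, no self-loops
X = rng.normal(size=(n, d))
A2, X2 = diffpool_step(A, X, rng.normal(size=(d, d)), rng.normal(size=(d, k)))
print(A2.shape, X2.shape)                   # (2, 2) (2, 4)
```

Because the assignment matrix comes from a softmax, every operation above is differentiable, which is what allows the pooling to be trained end-to-end together with the rest of the network.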
GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models
TLDR: GraphRNN learns to generate graphs by training on a representative set of graphs, decomposing the generation process into a sequence of node and edge formations conditioned on the graph structure generated so far. (See the sketch below.)
  • Citations: 224 · Influence: 42 · PDF
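As a rough illustration of that node-and-edge decomposition, here is a hedged Python sketch: nodes are added one at a time, and each new node's edges to earlier nodes are sampled from Bernoulli probabilities conditioned on the adjacency built so far. `toy_edge_model` is a hypothetical stand-in for the learned RNN edge model; the real method's node orderings and trained sequence models are omitted.

```python
import numpy as np

def generate_graph(n_nodes, edge_prob_fn, rng):
    """Autoregressive generation: add node i, then sample its edges to
    nodes 0..i-1 conditioned on the adjacency generated so far."""
    A = np.zeros((n_nodes, n_nodes), dtype=int)
    for i in range(1, n_nodes):
        probs = edge_prob_fn(A[:i, :i])     # one Bernoulli probability per earlier node
        new_edges = (rng.random(i) < probs).astype(int)
        A[i, :i] = new_edges                # undirected graph: mirror the sampled edges
        A[:i, i] = new_edges
    return A

def toy_edge_model(A_prev):
    """Hypothetical stand-in for the learned RNN: new nodes prefer
    attaching to earlier nodes that already have high degree."""
    deg = A_prev.sum(axis=1)
    return (1.0 + deg) / (2.0 + deg.max())

rng = np.random.default_rng(1)
print(generate_graph(6, toy_edge_model, rng))
```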
Dynamic Network Embedding by Modeling Triadic Closure Process
TLDR: We present a novel representation learning approach, DynamicTriad, to preserve both structural information and evolution patterns of a given network. (See the sketch below.)
  • Citations: 150 · Influence: 28 · PDF
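A schematic sketch of the triadic-closure idea behind DynamicTriad, not the paper's actual model or loss: it enumerates open triads (two nodes sharing a common neighbour but not linked to each other) in one network snapshot and scores how likely each one is to close. The embeddings `emb` and parameter vector `theta` are random placeholders.

```python
import numpy as np
from itertools import combinations

def open_triads(A):
    """Open triads (j, i, k): centre i links to both j and k, but edge j-k is absent."""
    triads = []
    for i in range(A.shape[0]):
        nbrs = np.flatnonzero(A[i])
        for j, k in combinations(nbrs, 2):
            if A[j, k] == 0:
                triads.append((j, i, k))
    return triads

def closure_prob(emb, j, i, k, theta):
    """Score the open triad from the centre node's embedding combined with
    its two neighbours' embeddings, squashed to a probability."""
    s = theta @ (emb[i] * (emb[j] + emb[k]))
    return 1.0 / (1.0 + np.exp(-s))

rng = np.random.default_rng(2)
A_t = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 0],
                [0, 1, 0, 0]])               # adjacency at snapshot t
emb = rng.normal(size=(4, 8))                # placeholder node embeddings at snapshot t
theta = rng.normal(size=8)                   # placeholder model parameters
for j, i, k in open_triads(A_t):
    print((j, i, k), round(closure_prob(emb, j, i, k, theta), 3))
```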
AFET: Automatic Fine-Grained Entity Typing by Hierarchical Partial-Label Embedding
TLDR: This paper proposes a novel embedding method that separately models “clean” and “noisy” mentions and incorporates the given type hierarchy to induce loss functions. (See the sketch below.)
  • Citations: 71 · Influence: 18 · PDF
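The following is a minimal sketch, under assumed notation, of the clean/noisy separation mentioned above: clean mentions get a standard margin loss over their true types, while noisy mentions get a partial-label loss in which only the best-scoring candidate type must beat the non-candidate types. The dot-product scoring, margin value, and function names are illustrative; the paper's hierarchy-aware loss is richer than this.

```python
import numpy as np

def score(m_vec, type_vecs):
    """Mention-type compatibility as dot products, one score per type."""
    return type_vecs @ m_vec

def clean_loss(m_vec, type_vecs, true_types, margin=1.0):
    """Clean mention: every true type should outscore every false type by a margin."""
    s = score(m_vec, type_vecs)
    pos, neg = s[list(true_types)], np.delete(s, list(true_types))
    return sum(max(0.0, margin - p + n) for p in pos for n in neg)

def noisy_loss(m_vec, type_vecs, candidate_types, margin=1.0):
    """Noisy mention (partial labels): only the *best* candidate type has to
    outscore the non-candidate types, so wrong candidates are not forced up."""
    s = score(m_vec, type_vecs)
    best = max(s[list(candidate_types)])
    neg = np.delete(s, list(candidate_types))
    return sum(max(0.0, margin - best + n) for n in neg)

rng = np.random.default_rng(3)
type_vecs = rng.normal(size=(5, 16))         # placeholder embeddings for 5 types
mention = rng.normal(size=16)                # placeholder mention embedding
print(clean_loss(mention, type_vecs, {2}))
print(noisy_loss(mention, type_vecs, {1, 3}))  # distant supervision gave two candidate types
```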
CoType: Joint Extraction of Typed Entities and Relations with Knowledge Bases
TLDR: We propose a novel domain-independent framework, called CoType, that runs a data-driven text segmentation algorithm to extract entity mentions and jointly embeds entity mentions, relation mentions, text features and type labels into two low-dimensional spaces, where, in each space, objects whose types are close also have similar representations. (See the sketch below.)
  • Citations: 127 · Influence: 17 · PDF
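To make the "two low-dimensional spaces" concrete, here is a small illustrative sketch of how typing could work after such a joint embedding: an entity mention is assigned the nearest entity type in one space, and a relation mention the nearest relation type in the other. The vectors below are random placeholders standing in for representations learned by the actual framework.

```python
import numpy as np

def nearest_type(mention_vec, type_embs, type_names):
    """Type inference after joint embedding: pick the type whose embedding
    is most similar (cosine) to the mention's representation."""
    t = type_embs / np.linalg.norm(type_embs, axis=1, keepdims=True)
    m = mention_vec / np.linalg.norm(mention_vec)
    return type_names[int(np.argmax(t @ m))]

rng = np.random.default_rng(4)
entity_types = ["person", "organization", "location"]
relation_types = ["works_for", "born_in", "located_in"]
ent_embs = rng.normal(size=(3, 32))          # placeholder entity-type embeddings (space 1)
rel_embs = rng.normal(size=(3, 32))          # placeholder relation-type embeddings (space 2)
ent_mention = rng.normal(size=32)            # embedding of an extracted entity mention
rel_mention = rng.normal(size=32)            # embedding of an extracted relation mention
print(nearest_type(ent_mention, ent_embs, entity_types))
print(nearest_type(rel_mention, rel_embs, relation_types))
```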
GraphRNN: A Deep Generative Model for Graphs
TLDR: We propose GraphRNN, a deep autoregressive model that can approximate any distribution of graphs with minimal assumptions about their structure.
  • Citations: 74 · Influence: 17
Label Noise Reduction in Entity Typing by Heterogeneous Partial-Label Embedding
TLDR: We propose a general framework, called PLE, to jointly embed entity mentions, text features and entity types into the same low-dimensional space, where objects whose types are semantically close have similar representations. (See the sketch below.)
  • Citations: 83 · Influence: 16 · PDF
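Below is a hypothetical sketch of how such learned embeddings could be used for label-noise reduction: among the candidate types attached to a mention by distant supervision, keep only those whose embeddings are closest to the mention. The `keep_ratio` heuristic and all names here are illustrative assumptions, not the paper's actual denoising criterion.

```python
import numpy as np

def denoise_types(mention_vec, type_embs, candidate_ids, keep_ratio=0.5):
    """Label-noise reduction after joint embedding: among the candidate types
    produced by distant supervision, keep only those closest to the mention."""
    sims = type_embs[candidate_ids] @ mention_vec        # similarity to each candidate type
    ranked = [candidate_ids[i] for i in np.argsort(-sims)]
    k = max(1, int(len(ranked) * keep_ratio))            # keep the top fraction of candidates
    return ranked[:k]

rng = np.random.default_rng(6)
type_embs = rng.normal(size=(6, 16))                     # placeholder jointly learned type embeddings
mention = rng.normal(size=16)                            # placeholder mention embedding
print(denoise_types(mention, type_embs, [0, 2, 5]))
```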
An Attention-based Collaboration Framework for Multi-View Network Representation Learning
TLDR: This paper studies node representation learning for networks with multiple views, aiming to infer robust node representations across the different views. (See the sketch below.)
  • Citations: 73 · Influence: 14 · PDF
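A minimal sketch of attention-based view collaboration, assuming one embedding per view for a single node: each view receives an attention weight and the final representation is the weighted sum. The per-view parameter vectors are placeholders; the paper's attention mechanism and training procedure are not reproduced here.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def combine_views(view_embs, attn_params):
    """Fuse one node's view-specific embeddings: score each view, normalise
    the scores to attention weights, and take the weighted sum."""
    scores = np.array([p @ e for p, e in zip(attn_params, view_embs)])
    weights = softmax(scores)                # one attention weight per view
    return weights, sum(w * e for w, e in zip(weights, view_embs))

rng = np.random.default_rng(5)
n_views, d = 3, 16
view_embs = [rng.normal(size=d) for _ in range(n_views)]    # e.g. follow / retweet / reply views
attn_params = [rng.normal(size=d) for _ in range(n_views)]  # placeholder attention parameters
weights, rep = combine_views(view_embs, attn_params)
print(weights, rep.shape)
```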
Mining Quality Phrases from Massive Text Corpora
TLDR: We propose a new framework that extracts quality phrases from text corpora, integrating phrase extraction with phrasal segmentation. (See the sketch below.)
  • Citations: 135 · Influence: 11 · PDF
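The sketch below illustrates the interplay between phrase quality and phrasal segmentation in miniature: candidate n-grams are counted in a toy corpus, given a crude quality score, and a dynamic-programming segmenter then splits a sentence to maximise total phrase quality. The frequency-times-length score is a stand-in assumption for the learned quality estimates in the actual framework.

```python
from collections import Counter

def candidate_phrases(tokens, max_len=3, min_count=2):
    """Count contiguous n-grams as phrase candidates."""
    counts = Counter()
    for n in range(1, max_len + 1):
        for i in range(len(tokens) - n + 1):
            counts[tuple(tokens[i:i + n])] += 1
    return {p: c for p, c in counts.items() if c >= min_count or len(p) == 1}

def segment(tokens, quality):
    """Phrasal segmentation by dynamic programming: choose the split of the
    sentence that maximises the total quality of the chosen phrases."""
    n = len(tokens)
    best, back = [0.0] * (n + 1), [0] * (n + 1)
    for j in range(1, n + 1):
        best[j] = float("-inf")
        # try spans of 1..3 tokens ending at j (shortest first, so ties keep single tokens)
        for i in range(j - 1, max(0, j - 3) - 1, -1):
            q = quality.get(tuple(tokens[i:j]), 0.0)
            if best[i] + q > best[j]:
                best[j], back[j] = best[i] + q, i
    phrases, j = [], n
    while j > 0:
        phrases.append(" ".join(tokens[back[j]:j]))
        j = back[j]
    return list(reversed(phrases))

corpus = ("support vector machine , support vector machine , "
          "support vector regression").split()
counts = candidate_phrases(corpus)
# toy quality score: reward longer, more frequent candidates (placeholder for learned quality)
quality = {p: c * len(p) ** 2 for p, c in counts.items()}
print(segment("we train a support vector machine".split(), quality))
# -> ['we', 'train', 'a', 'support vector machine']
```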
KagNet: Knowledge-Aware Graph Networks for Commonsense Reasoning
TLDR: We propose a knowledge-aware reasoning framework for answering commonsense questions, which effectively utilizes external, structured commonsense knowledge graphs to perform explainable inferences. (See the sketch below.)
  • Citations: 74 · Influence: 10 · PDF
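As a toy illustration of knowledge-aware reasoning over a concept graph, the sketch below grounds a question concept and candidate answers in a tiny hand-made graph (standing in for an external commonsense knowledge graph) and scores each answer by the connecting paths it can find. The actual path encoding and attention over paths are not modelled here; the graph, scoring rule, and names are assumptions for illustration.

```python
from collections import deque

def paths_between(graph, src, dst, max_len=3):
    """Enumerate simple paths of up to max_len edges between two concepts."""
    out, queue = [], deque([[src]])
    while queue:
        path = queue.popleft()
        if path[-1] == dst and len(path) > 1:
            out.append(path)
            continue
        if len(path) > max_len:
            continue
        for nxt in graph.get(path[-1], []):
            if nxt not in path:                  # keep paths simple (no revisits)
                queue.append(path + [nxt])
    return out

def score_answer(graph, question_concepts, answer_concept):
    """Knowledge-aware score: shorter connecting paths in the concept graph
    contribute more evidence that the answer fits the question."""
    score = 0.0
    for qc in question_concepts:
        for path in paths_between(graph, qc, answer_concept):
            score += 1.0 / (len(path) - 1)       # weight by inverse path length
    return score

# tiny hand-made concept graph standing in for an external commonsense KG
graph = {
    "bird": ["fly", "wings", "nest"],
    "wings": ["fly"],
    "nest": ["tree"],
    "fish": ["swim", "water"],
}
question = ["bird"]
for answer in ["fly", "swim"]:
    print(answer, score_answer(graph, question, answer))
```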