Inductive Representation Learning on Large Graphs
TLDR
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
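The core GraphSAGE step — aggregate the features of a node's sampled neighbors, combine with the node's own features, and normalize — can be sketched in NumPy as follows. This is a minimal illustration of the mean-aggregator variant, not the paper's implementation; the names `W_self` and `W_neigh` are hypothetical weight matrices assumed to be already trained.

```python
import numpy as np

def sage_mean_layer(features, neighbors, W_self, W_neigh):
    """One GraphSAGE-style layer with a mean aggregator (simplified sketch).

    features:  (N, d_in) node feature matrix
    neighbors: dict mapping node id -> list of neighbor ids
    W_self, W_neigh: (d_in, d_out) weight matrices (hypothetical, pre-trained)
    """
    out = np.zeros((features.shape[0], W_self.shape[1]))
    for v in range(features.shape[0]):
        nbrs = neighbors.get(v, [])
        # Aggregate neighbor features by averaging (the "mean" aggregator).
        agg = features[nbrs].mean(axis=0) if nbrs else np.zeros(features.shape[1])
        # Combine the node's own features with the neighborhood aggregate; ReLU.
        out[v] = np.maximum(features[v] @ W_self + agg @ W_neigh, 0.0)
    # L2-normalize each embedding so magnitudes are comparable across nodes.
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.maximum(norms, 1e-12)
```

Because the layer only reads a node's local neighborhood, it can embed nodes that were never seen during training — the inductive property the abstract highlights.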
Deep Graph Infomax
TLDR
Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.
Hierarchical Graph Representation Learning with Differentiable Pooling
TLDR
DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.
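The coarsening step at the heart of DiffPool is a soft assignment matrix S that maps N nodes to K clusters, giving pooled features S^T X and a pooled adjacency S^T A S. The sketch below illustrates one such step; in the paper S is produced by a GNN, whereas here a single linear layer (`W_assign`, a hypothetical weight matrix) stands in for brevity.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def diffpool(X, A, W_assign):
    """One DiffPool coarsening step (simplified sketch).

    X: (N, d) node embeddings, A: (N, N) adjacency matrix,
    W_assign: (d, K) weights producing a soft node-to-cluster assignment.
    """
    S = softmax(X @ W_assign, axis=1)   # (N, K): each row sums to 1
    X_pooled = S.T @ X                  # (K, d) coarsened node features
    A_pooled = S.T @ A @ S              # (K, K) coarsened adjacency
    return X_pooled, A_pooled, S
```

Because every operation is differentiable, the assignment can be learned end-to-end with whatever GNN sits above and below it — the property the abstract emphasizes.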
Diachronic Word Embeddings Reveal Statistical Laws of Semantic Change
TLDR
A robust methodology for quantifying semantic change is developed by evaluating word embeddings against known historical changes and it is revealed that words that are more polysemous have higher rates of semantic change.
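Comparing a word's vector across decades requires first rotating the two embedding spaces into alignment, since independently trained embeddings are only defined up to rotation. A common way to do this — orthogonal Procrustes alignment over the shared vocabulary — can be sketched as follows; function names here are illustrative, not taken from the paper's code.

```python
import numpy as np

def align_embeddings(W_old, W_new):
    """Rotate W_old into the space of W_new via orthogonal Procrustes.

    W_old, W_new: (V, d) embedding matrices; row i is the same word in
    both epochs. The optimal rotation is R = U V^T from the SVD of
    the cross-covariance W_old^T W_new.
    """
    U, _, Vt = np.linalg.svd(W_old.T @ W_new)
    return W_old @ (U @ Vt)

def semantic_change(W_old_aligned, W_new, i):
    """Cosine distance of word i between the two epochs after alignment."""
    a, b = W_old_aligned[i], W_new[i]
    return 1.0 - (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
```

After alignment, a large cosine distance for a word signals that its usage drifted between epochs, which is the per-word change score such an analysis rests on.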
Representation Learning on Graphs: Methods and Applications
TLDR
A conceptual review of key advancements in representation learning on graphs is provided, covering matrix factorization-based methods, random-walk based algorithms, and graph neural networks.
Graph Convolutional Neural Networks for Web-Scale Recommender Systems
TLDR
A novel method based on highly efficient random walks to structure the convolutions and a novel training strategy that relies on harder-and-harder training examples to improve robustness and convergence of the model are developed.
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks
TLDR
It is shown that GNNs have the same expressiveness as the Weisfeiler-Leman graph isomorphism heuristic in terms of distinguishing non-isomorphic (sub-)graphs, and a generalization of GNNs is proposed, so-called $k$-dimensional GNNs ($k$-GNNs), which can take higher-order graph structures at multiple scales into account.
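The Weisfeiler-Leman heuristic that bounds GNN expressiveness is simple to state: iteratively recolor each node by hashing its own color together with the multiset of its neighbors' colors. A minimal sketch of 1-WL color refinement:

```python
def wl_colors(adj, rounds=3):
    """1-dimensional Weisfeiler-Leman color refinement (sketch).

    adj: dict mapping node -> list of neighbors. Returns a color per node;
    if two graphs end with different color multisets, they are certainly
    non-isomorphic (the converse does not hold).
    """
    colors = {v: 0 for v in adj}  # start from a uniform coloring
    for _ in range(rounds):
        # New signature = own color + sorted multiset of neighbor colors.
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        # Compress signatures to compact integer color ids.
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: palette[sigs[v]] for v in adj}
    return colors
```

Message-passing GNNs can distinguish at most what this coloring distinguishes; the paper's $k$-GNNs lift the same idea to colorings of $k$-tuples of nodes to gain expressive power.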
GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models
TLDR
The experiments show that GraphRNN significantly outperforms all baselines, learning to generate diverse graphs that match the structural characteristics of a target set, while also scaling to graphs 50 times larger than previous deep models.
Cultural Shift or Linguistic Drift? Comparing Two Computational Measures of Semantic Change
TLDR
Two different distributional measures can be used to detect two different types of semantic change, allowing researchers to determine whether changes are more cultural or linguistic in nature, a distinction that is essential for work in the digital humanities and historical linguistics.
Inducing Domain-Specific Sentiment Lexicons from Unlabeled Corpora
TLDR
The approach achieves state-of-the-art performance on inducing sentiment lexicons from domain-specific corpora, and the purely corpus-based approach outperforms methods that rely on hand-curated resources (e.g., WordNet).
...