Inductive Representation Learning on Large Graphs
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
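The core of GraphSAGE is an aggregate-and-combine step: each node's embedding is computed from its own features plus an aggregation of its sampled neighbors' features. A minimal NumPy sketch of a single mean-aggregator layer is shown below; the function and weight names are illustrative, not the paper's implementation:

```python
import numpy as np

def sage_mean_layer(features, neighbors, W_self, W_neigh):
    """One GraphSAGE-style layer with a mean aggregator (illustrative sketch).

    features  : (n, d) array of node features
    neighbors : dict mapping node id -> list of neighbor ids
    W_self    : (d, d_out) weight applied to the node's own features
    W_neigh   : (d, d_out) weight applied to the aggregated neighborhood
    """
    n, d = features.shape
    out = np.zeros((n, W_self.shape[1]))
    for v in range(n):
        nbrs = neighbors.get(v, [])
        # Mean-aggregate neighbor features (zero vector for isolated nodes)
        agg = features[nbrs].mean(axis=0) if nbrs else np.zeros(d)
        # Combine self and neighborhood signals, then apply a ReLU nonlinearity
        out[v] = np.maximum(features[v] @ W_self + agg @ W_neigh, 0.0)
    # L2-normalize each embedding, as the paper does between layers
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.clip(norms, 1e-12, None)
```

Because the layer depends only on node features and local neighborhoods, it can embed nodes that were never seen during training, which is what makes the framework inductive.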
Deep Graph Infomax
- Petar Velickovic, W. Fedus, William L. Hamilton, P. Liò, Yoshua Bengio, R. Devon Hjelm
- Computer Science · ICLR
- 27 September 2018
Deep Graph Infomax (DGI) is presented, a general approach for learning node representations within graph-structured data in an unsupervised manner that is readily applicable to both transductive and inductive learning setups.
Hierarchical Graph Representation Learning with Differentiable Pooling
- Rex Ying, Jiaxuan You, Christopher Morris, Xiang Ren, William L. Hamilton, J. Leskovec
- Computer Science · NeurIPS
- 22 June 2018
DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.
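The pooling step at the heart of DiffPool is a soft cluster assignment: a learned matrix S maps n nodes to k clusters, and both the feature matrix and the adjacency matrix are coarsened through S. A minimal sketch of that single step, with the assignment logits passed in directly (in the paper they come from a GNN), might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def diffpool(A, X, S_logits):
    """One DiffPool coarsening step (illustrative sketch).

    A        : (n, n) adjacency matrix
    X        : (n, d) node embeddings
    S_logits : (n, k) raw cluster-assignment scores
    Returns the coarsened adjacency (k, k) and features (k, d).
    """
    S = softmax(S_logits, axis=1)   # soft assignment of each node to k clusters
    X_pooled = S.T @ X              # (k, d): cluster features
    A_pooled = S.T @ A @ S          # (k, k): cluster-level connectivity
    return A_pooled, X_pooled
```

Because every operation is a differentiable matrix product, gradients flow through the pooling step, so the hierarchy can be trained end-to-end with any GNN that produces S.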
Diachronic Word Embeddings Reveal Statistical Laws of Semantic Change
A robust methodology for quantifying semantic change is developed by evaluating word embeddings against known historical changes and it is revealed that words that are more polysemous have higher rates of semantic change.
Representation Learning on Graphs: Methods and Applications
A conceptual review of key advancements in representation learning on graphs is provided, covering matrix factorization-based methods, random-walk-based algorithms, and graph neural networks.
Graph Convolutional Neural Networks for Web-Scale Recommender Systems
- Rex Ying, Ruining He, Kaifeng Chen, Pong Eksombatchai, William L. Hamilton, J. Leskovec
- Computer Science · KDD
- 6 June 2018
A novel method based on highly efficient random walks to structure the convolutions is developed, along with a training strategy that relies on harder-and-harder training examples to improve the robustness and convergence of the model.
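The random-walk idea can be sketched simply: instead of convolving over all graph neighbors, simulate short random walks from a node and keep the most frequently visited nodes as its neighborhood. The following is an illustrative sketch of that sampling step only (function name and parameters are hypothetical, and this omits the actual convolution):

```python
import random
from collections import Counter

def rw_neighborhood(adj, node, num_walks=200, walk_len=3, top_t=5, seed=0):
    """Select an importance-based neighborhood via short random walks
    (illustrative sketch of random-walk-structured convolutions).

    adj  : dict mapping node id -> list of neighbor ids
    node : the node whose neighborhood is being sampled
    """
    rng = random.Random(seed)
    visits = Counter()
    for _ in range(num_walks):
        cur = node
        for _ in range(walk_len):
            nbrs = adj.get(cur)
            if not nbrs:
                break
            cur = rng.choice(nbrs)
            if cur != node:
                visits[cur] += 1
    # The top-T most-visited nodes form the convolution neighborhood
    return [v for v, _ in visits.most_common(top_t)]
```

Visit counts double as importance weights, and capping the neighborhood at T nodes keeps the memory footprint fixed regardless of a node's degree, which is what makes this practical at web scale.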
Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks
It is shown that GNNs have the same expressiveness as the Weisfeiler-Leman graph isomorphism heuristic in terms of distinguishing non-isomorphic (sub-)graphs, and a generalization of GNNs is proposed, so-called $k$-dimensional GNNs ($k$-GNNs), which can take higher-order graph structures at multiple scales into account.
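The 1-dimensional Weisfeiler-Leman heuristic that bounds standard GNN expressiveness is iterative color refinement: each node's color is repeatedly rehashed together with the multiset of its neighbors' colors, and two graphs with different final color histograms cannot be isomorphic. A small sketch (the compression scheme here is illustrative):

```python
from collections import Counter

def wl_refine(adj, labels, iters=3):
    """1-WL color refinement (illustrative sketch).

    adj    : dict mapping node -> list of neighbors
    labels : dict mapping node -> initial integer label
    Returns the histogram of final node colors.
    """
    colors = dict(labels)
    for _ in range(iters):
        # New signature: own color plus the sorted multiset of neighbor colors
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        # Compress signatures back to small integer colors, deterministically
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: palette[sigs[v]] for v in adj}
    return Counter(colors.values())
```

Non-isomorphic graphs that 1-WL can distinguish get different histograms; $k$-GNNs correspond to running this refinement over $k$-tuples of nodes instead of single nodes, which is strictly more powerful.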
GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models
- Jiaxuan You, Rex Ying, Xiang Ren, William L. Hamilton, J. Leskovec
- Computer Science · ICML
- 24 February 2018
The experiments show that GraphRNN significantly outperforms all baselines, learning to generate diverse graphs that match the structural characteristics of a target set, while also scaling to graphs 50 times larger than previous deep models.
Cultural Shift or Linguistic Drift? Comparing Two Computational Measures of Semantic Change
Two different distributional measures can be used to detect two different types of semantic change, allowing researchers to determine whether changes are more cultural or linguistic in nature, a distinction that is essential for work in the digital humanities and historical linguistics.
Inducing Domain-Specific Sentiment Lexicons from Unlabeled Corpora
It is shown that the approach achieves state-of-the-art performance on inducing sentiment lexicons from domain-specific corpora, and that the purely corpus-based approach outperforms methods that rely on hand-curated resources (e.g., WordNet).