Corpus ID: 4755450

Inductive Representation Learning on Large Graphs

@inproceedings{Hamilton2017InductiveRL,
  title={Inductive Representation Learning on Large Graphs},
  author={William L. Hamilton and Zhitao Ying and Jure Leskovec},
  booktitle={NIPS},
  year={2017}
}
Low-dimensional embeddings of nodes in large graphs have proved extremely useful in a variety of prediction tasks, from content recommendation to identifying protein functions. [...] Instead of training individual embeddings for each node, we learn a function that generates embeddings by sampling and aggregating features from a node's local neighborhood. Our algorithm outperforms strong baselines on three inductive node-classification benchmarks: we classify the category of unseen nodes in evolving…
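The key method described in the abstract — generating a node's embedding by sampling and aggregating its neighbors' features rather than storing a per-node vector — can be sketched roughly as follows. This is a minimal illustration of a mean-aggregator layer, not the paper's reference implementation; the function name, the dense numpy representation, and uniform sampling with replacement are assumptions made for brevity.

```python
import numpy as np

def sage_mean_layer(features, adj_lists, W_self, W_neigh, num_samples, rng):
    """One mean-aggregator layer: for each node, sample a fixed-size
    neighborhood, average the sampled neighbors' features, combine with
    the node's own features via learned weights, then L2-normalize."""
    out = np.empty((features.shape[0], W_self.shape[1]))
    for v, neighbors in enumerate(adj_lists):
        # Fixed-size uniform sampling (with replacement) keeps per-node cost bounded.
        sampled = rng.choice(neighbors, size=num_samples, replace=True)
        h_neigh = features[sampled].mean(axis=0)       # aggregate step
        h = features[v] @ W_self + h_neigh @ W_neigh   # combine step
        out[v] = np.maximum(h, 0.0)                    # ReLU nonlinearity
    # Normalize each embedding to unit length (guarding against zero rows).
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.maximum(norms, 1e-12)
```

Because the layer is a function of features and graph structure rather than a lookup table, it can produce embeddings for nodes never seen during training — the inductive property the title refers to.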
Deep Gaussian Embedding of Graphs: Unsupervised Inductive Learning via Ranking
Graph2Gauss is proposed, an approach that efficiently learns versatile node embeddings on large-scale (attributed) graphs; it shows strong performance on tasks such as link prediction and node classification, and demonstrates the benefits of modeling uncertainty.
Inductive Graph Embeddings through Locality Encodings
This work proposes a set of basic predefined local encodings as the basis of a learning algorithm that generalizes well across unseen or distant regions of the network, both in unsupervised settings (when combined with language-model learning) and in supervised tasks.
Midterm Report
One of the central problems in graph mining and learning is generating representative embeddings for the nodes in a graph. Such embeddings have proven extremely useful in various identification…
SURREAL: Subgraph Robust Representation Learning
SURREAL is a novel, stable graph-embedding framework that leverages "spatio-electric" (SE) subgraphs: it learns graph representations through an analogy between graphs and electrical circuits, preserves both local and global connectivity patterns, and addresses the issue of high-degree nodes that may incidentally connect a pair of nodes in a graph.
Bayes EMbedding (BEM): Refining Representation by Integrating Knowledge Graphs and Behavior-specific Networks
BEM is presented, a Bayesian framework that incorporates information from knowledge graphs and behavior graphs, integrating it with the pre-trained embeddings from the behavior graphs via a Bayesian generative model; it mutually refines the embeddings from both sides while preserving their own topological structures.
GVNP: Global Vectors for Node Representation
  • Dongao Zhang, Ziyang Chen, Peijia Zheng, Hongmei Liu
  • Computer Science
  • Advances in Artificial Intelligence and Security
  • 2021
Inspired by node2vec, GVNP is proposed: an unsupervised method that learns continuous feature representations for nodes and leverages node feature information to efficiently generate embeddings for previously unseen data in networks.
Meta-Inductive Node Classification across Graphs
This paper proposes a novel meta-inductive framework, MI-GNN, that customizes the inductive model to each graph under a meta-learning paradigm, employing a dual adaptation mechanism at both the graph and task levels.
Dual Graph Representation Learning
This paper presents CADE, a context-aware unsupervised dual-encoding framework that generates node representations by combining real-time neighborhoods with neighbor-attentive representations while preserving extra memory of known nodes.
Graph Auto-Encoders for Learning Edge Representations
This paper proposes a new auto-encoder model to learn edge embeddings in (un)directed graphs and empirically evaluates the approach on two tasks: edge classification and link prediction.
Survey on graph embeddings and their applications to machine learning problems on graphs
This survey describes the core concepts of graph embeddings, provides several taxonomies for their description, presents an in-depth analysis of models based on network types, and overviews a wide range of applications to machine learning problems on graphs.

References

Showing 1–10 of 44 references
node2vec: Scalable Feature Learning for Networks
node2vec is an algorithmic framework for learning continuous feature representations for nodes in networks: it defines a flexible notion of a node's network neighborhood and designs a biased random-walk procedure that efficiently explores diverse neighborhoods.
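The biased random-walk procedure that node2vec uses can be sketched as below. The step from the previous node t to a candidate x is reweighted by the return parameter p and the in-out parameter q: 1/p to go back to t, 1 if x is also a neighbor of t (BFS-like exploration), and 1/q otherwise (DFS-like exploration). This is a rough stdlib-only sketch; the function name and the adjacency-dict representation are illustrative assumptions.

```python
import random

def node2vec_walk(adj, start, walk_length, p, q, rng):
    """One biased random walk over adjacency lists `adj` (node -> neighbors)."""
    walk = [start]
    while len(walk) < walk_length:
        cur = walk[-1]
        nbrs = adj[cur]
        if not nbrs:
            break  # dead end: stop the walk early
        if len(walk) == 1:
            walk.append(rng.choice(nbrs))  # first step is uniform
            continue
        prev = walk[-2]
        # Second-order bias: weight depends on the previous node, too.
        weights = [
            1.0 / p if x == prev else (1.0 if x in adj[prev] else 1.0 / q)
            for x in nbrs
        ]
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk
```

Many such walks, started from every node, are then fed to a skip-gram-style objective to produce the embeddings.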
Revisiting Semi-Supervised Learning with Graph Embeddings
On a large and diverse set of benchmark tasks, including text classification, distantly supervised entity extraction, and entity classification, the proposed semi-supervised learning framework shows improved performance over many existing models.
LINE: Large-scale Information Network Embedding
A novel network embedding method, LINE, that is suitable for arbitrary types of information networks (undirected, directed, and/or weighted) and optimizes a carefully designed objective function that preserves both local and global network structure.
GraRep: Learning Graph Representations with Global Structural Information
A novel model for learning vertex representations of weighted graphs that integrates global structural information of the graph into the learning process and significantly outperforms other state-of-the-art methods on the evaluated tasks.
Semi-Supervised Classification with Graph Convolutional Networks
A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operates directly on graphs and outperforms related methods by a significant margin.
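The layer-wise propagation rule behind that approach can be written as ReLU(D̂^(-1/2) Â D̂^(-1/2) H W), where Â = A + I adds self-loops and D̂ is Â's degree matrix. A dense numpy sketch, assuming small graphs (the function name and dense representation are illustrative; practical implementations use sparse matrices):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution step: symmetric normalization of the
    self-loop-augmented adjacency, then linear transform and ReLU."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # D_hat^{-1/2} diagonal
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)         # transform + ReLU
```

Stacking two such layers and training the weights with a cross-entropy loss on the labeled nodes gives the semi-supervised classifier the summary describes.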
Discriminative Embeddings of Latent Variable Models for Structured Data
In applications involving millions of data points, structure2vec is shown to run 2 times faster and produce models 10,000 times smaller, while at the same time achieving state-of-the-art predictive performance.
Gated Graph Sequence Neural Networks
This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification in which subgraphs must be matched to abstract data structures.
DeepWalk: online learning of social representations
DeepWalk is an online learning algorithm that builds useful incremental results and is trivially parallelizable, making it suitable for a broad class of real-world applications such as network classification and anomaly detection.
A new model for learning in graph domains
A new neural model, the graph neural network (GNN), capable of directly processing graphs; it extends recursive neural networks and can be applied to most practically useful kinds of graphs, including directed, undirected, labelled, and cyclic graphs.
Variational Graph Auto-Encoders
The variational graph auto-encoder (VGAE) is introduced, a framework for unsupervised learning on graph-structured data based on the variational auto-encoder (VAE) that can naturally incorporate node features, significantly improving predictive performance on a number of benchmark datasets.