GraRep: Learning Graph Representations with Global Structural Information

@article{Cao2015GraRepLG,
  title={GraRep: Learning Graph Representations with Global Structural Information},
  author={Shaosheng Cao and Wei Lu and Qiongkai Xu},
  journal={Proceedings of the 24th ACM International Conference on Information and Knowledge Management},
  year={2015}
}
  • Published 17 October 2015
  • Computer Science
In this paper, we present GraRep, a novel model for learning vertex representations of weighted graphs. [...] Empirical results demonstrate that our representation significantly outperforms other state-of-the-art methods in such tasks.
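The abstract's core idea, factorizing log-transformed k-step transition matrices and concatenating the per-step representations, can be illustrated with a minimal NumPy sketch. This is a reading aid only, not the authors' code: `grarep_embed`, the toy graph, and the simplified log/shift step are all illustrative assumptions.

```python
import numpy as np

def grarep_embed(S, K=3, d=2, lam=1.0):
    """Sketch of a GraRep-style embedding: for each step k, build the
    k-step transition matrix, take a positive log-probability matrix,
    factorize it with SVD, and concatenate the per-step factors."""
    n = S.shape[0]
    A = S / S.sum(axis=1, keepdims=True)       # 1-step transition matrix
    Ak = np.eye(n)
    reps = []
    for k in range(1, K + 1):
        Ak = Ak @ A                             # k-step transition probabilities
        tau = Ak.sum(axis=0) / n                # column marginals
        X = np.log(np.maximum(Ak / tau, 1e-12)) - np.log(lam)
        X[X < 0] = 0                            # keep positive entries only
        U, s, _ = np.linalg.svd(X)
        reps.append(U[:, :d] * np.sqrt(s[:d]))  # d-dim representation for step k
    return np.hstack(reps)                      # concatenate across steps

# toy weighted graph: two tightly connected pairs, weakly linked
S = np.array([[0, 1, 0.1, 0],
              [1, 0, 0, 0.1],
              [0.1, 0, 0, 1],
              [0, 0.1, 1, 0]], dtype=float)
W = grarep_embed(S, K=2, d=2)
print(W.shape)  # (4, 4): 4 vertices, 2 dims per step, 2 steps
```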
Representation Learning of Reconstructed Graphs Using Random Walk Graph Convolutional Network
It is believed that combining high-order local structural information can more efficiently explore the potential of the network, which will greatly improve the learning efficiency of graph neural networks and promote the establishment of new learning models.
Representation Learning of Graphs Using Graph Convolutional Multilayer Networks Based on Motifs
This work proposes mGCMN, a novel framework which utilizes node feature information and the higher-order local structure of the graph to effectively generate node embeddings for previously unseen data.
Learning Structural Node Representations on Directed Graphs
Although struc2vec++ is in most cases outperformed by the competing algorithm, experiments in a variety of scenarios demonstrate that it is much more memory-efficient and better captures structural roles in the presence of noise.
SERL: Semantic-Path Biased Representation Learning of Heterogeneous Information Network
The SERL method formalizes how to fuse different semantic paths during the random-walk procedure when exploring the neighborhood of the corresponding node, and then leverages a heterogeneous skip-gram model to produce node embeddings.
Learning Graph Representation: A Comparative Study
  • W. Etaiwi, A. Awajan
  • Computer Science
  • 2018 International Arab Conference on Information Technology (ACIT)
  • 2018
This paper surveys recent techniques and methods for graph representation learning and compares them, highlighting the need to evaluate existing methods in terms of methodology and techniques.
GraphSAD: Learning Graph Representations
Graph Neural Networks (GNNs) learn effective node/graph representations by aggregating the attributes of neighboring nodes, which commonly derives a single representation mixing the information of [...]
A Time-Aware Inductive Representation Learning Strategy for Heterogeneous Graphs
Graphs are versatile data structures that have permeated a large number of application fields, such as biochemistry, knowledge graphs, and social networks. As a result, different graph representation [...]
A Structural Graph Representation Learning Framework
This work formulates higher-order network representation learning and describes a general framework called HONE for learning such structural node embeddings from networks via the subgraph patterns (network motifs, graphlet orbits/positions) in a node's neighborhood.
Walklets: Multiscale Graph Embeddings for Interpretable Network Classification
These representations clearly encode multiscale vertex relationships in a continuous vector space suitable for multi-label classification problems, outperform methods based on neural matrix factorization, and scale to graphs with millions of vertices and edges.
Deep Neural Networks for Learning Graph Representations
A novel model for learning graph representations, which generates a low-dimensional vector representation for each vertex by capturing the graph structural information directly, and which outperforms other state-of-the-art models in such tasks.

References

Showing 1-10 of 35 references
LINE: Large-scale Information Network Embedding
A novel network embedding method called "LINE", suitable for arbitrary types of information networks: undirected, directed, and/or weighted; it optimizes a carefully designed objective function that preserves both the local and global network structures.
DeepWalk: online learning of social representations
DeepWalk is an online learning algorithm which builds useful incremental results and is trivially parallelizable, which makes it suitable for a broad class of real-world applications such as network classification and anomaly detection.
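The walk-corpus half of the DeepWalk pipeline (the paper then feeds the walks to a skip-gram model) can be sketched as follows. The helper name, the adjacency format, and the toy graph are illustrative assumptions, not the authors' implementation:

```python
import random

def random_walks(adj, num_walks=10, walk_len=5, seed=0):
    """Generate truncated random walks over a graph given as a dict
    mapping each node to a list of its neighbors (DeepWalk-style corpus)."""
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in adj:
            walk = [start]
            while len(walk) < walk_len:
                nbrs = adj[walk[-1]]
                if not nbrs:
                    break                      # dead end: truncate the walk
                walk.append(rng.choice(nbrs))  # uniform next-step transition
            walks.append(walk)
    return walks

# toy undirected graph as an adjacency dict
adj = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
walks = random_walks(adj)
print(len(walks))  # 40: num_walks (10) starts from each of the 4 nodes
```

In the full pipeline each walk is treated as a "sentence" and passed to a skip-gram trainer (e.g. word2vec) to produce the vertex embeddings.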
Distributed large-scale natural graph factorization
This work proposes a novel factorization technique that relies on partitioning a graph so as to minimize the number of neighboring vertices, rather than edges, across partitions; the decomposition is based on a streaming algorithm.
Relational learning via latent social dimensions
This work proposes to extract latent social dimensions based on network information and then utilize them as features for discriminative learning; the approach outperforms representative relational learning methods based on collective inference, especially when few labeled data are available.
Learning Deep Representations for Graph Clustering
This work proposes a simple method that first learns a nonlinear embedding of the original graph with a stacked autoencoder and then runs the k-means algorithm on the embedding to obtain a clustering result, significantly outperforming conventional spectral clustering.
ArnetMiner: extraction and mining of academic social networks
The architecture and main features of the ArnetMiner system, which aims at extracting and mining academic social networks, are described, and a unified modeling approach to simultaneously model topical aspects of papers, authors, and publication venues is proposed.
GloVe: Global Vectors for Word Representation
A new global log-bilinear regression model that combines the advantages of the two major model families in the literature (global matrix factorization and local context-window methods) and produces a vector space with meaningful substructure.
Visualizing Data using t-SNE
We present a new technique called "t-SNE" that visualizes high-dimensional data by giving each datapoint a location in a two- or three-dimensional map. The technique is a variation of Stochastic [...]
Distributed Representations of Words and Phrases and their Compositionality
This paper presents a simple method for finding phrases in text, shows that learning good vector representations for millions of phrases is possible, and describes a simple alternative to the hierarchical softmax called negative sampling.
Neural Word Embedding as Implicit Matrix Factorization
It is shown that using a sparse Shifted Positive PMI word-context matrix to represent words improves results on two word-similarity tasks and one of two analogy tasks; the authors conjecture that this stems from the weighted nature of SGNS's factorization.
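The shifted positive PMI construction that the summary refers to can be written out directly: clip `PMI - log k` at zero, where `k` is the negative-sampling count. This is a minimal sketch; the function name and the toy count matrix are illustrative.

```python
import numpy as np

def sppmi(C, k=1.0):
    """Shifted Positive PMI matrix from a word-context count matrix C:
    max(PMI(w, c) - log k, 0), mirroring the SGNS-as-factorization view."""
    total = C.sum()
    pw = C.sum(axis=1, keepdims=True) / total    # word marginals
    pc = C.sum(axis=0, keepdims=True) / total    # context marginals
    with np.errstate(divide="ignore"):
        pmi = np.log((C / total) / (pw * pc))    # pointwise mutual information
    pmi[np.isneginf(pmi)] = 0.0                  # zero counts contribute nothing
    return np.maximum(pmi - np.log(k), 0.0)      # shift by log k, keep positives

# toy 2x2 word-context count matrix
C = np.array([[4.0, 1.0], [1.0, 4.0]])
M = sppmi(C, k=1.0)
print(M)
```

Factorizing `M` with SVD then yields word vectors, which is the bridge the paper draws between skip-gram with negative sampling and matrix factorization.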