# GraRep: Learning Graph Representations with Global Structural Information

```bibtex
@article{Cao2015GraRepLG,
  title={GraRep: Learning Graph Representations with Global Structural Information},
  author={Shaosheng Cao and Wei Lu and Qiongkai Xu},
  journal={Proceedings of the 24th ACM International Conference on Information and Knowledge Management},
  year={2015}
}
```

In this paper, we present GraRep, a novel model for learning vertex representations of weighted graphs. […] Empirical results demonstrate that our representation significantly outperforms other state-of-the-art methods in such tasks.
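The full paper's pipeline builds k-step transition matrices, applies a positive log transformation, factorizes each with SVD, and concatenates the per-step embeddings. A minimal numpy sketch of that general approach follows; it is a simplified illustration, not the paper's exact formulation, and the function and parameter names are ours:

```python
import numpy as np

def grarep_embed(S, k_max=3, d=2, beta=1.0):
    """Simplified GraRep-style embedding (illustrative sketch): for each
    step size k, build the k-step transition matrix, apply a positive log
    transformation, factorize with SVD, and concatenate over k."""
    n = S.shape[0]
    A = S / S.sum(axis=1, keepdims=True)        # row-normalized 1-step transitions
    Ak = np.eye(n)
    parts = []
    for k in range(1, k_max + 1):
        Ak = Ak @ A                              # k-step transition probabilities
        X = np.log(np.maximum(Ak, 1e-12)) - np.log(beta / n)
        X[X < 0] = 0.0                           # keep only positive log entries
        U, sigma, _ = np.linalg.svd(X)
        parts.append(U[:, :d] * np.sqrt(sigma[:d]))  # W_k = U_d * Sigma_d^(1/2)
    return np.hstack(parts)                      # concatenate representations over k
```

Concatenating across k is what lets the final representation carry the "global structural information" of the title: each block encodes a different order of neighborhood.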

## 1,304 Citations

### Representation Learning of Graphs Using Graph Convolutional Multilayer Networks Based on Motifs

- Computer Science, Neurocomputing
- 2021

### Learning Structural Node Representations on Directed Graphs

- Computer Science, COMPLEX NETWORKS
- 2018

Although struc2vec++ is in most cases outperformed by the competing algorithm, experiments in a variety of scenarios demonstrate that it is much more memory-efficient and better captures structural roles in the presence of noise.

### SERL: Semantic-Path Biased Representation Learning of Heterogeneous Information Network

- Computer Science, KSEM
- 2018

The SERL method formalizes how different semantic paths are fused during the random-walk procedure when exploring the neighborhood of the corresponding node, and then leverages a heterogeneous skip-gram model to learn node embeddings.

### Learning Graph Representation: A Comparative Study

- Computer Science, 2018 International Arab Conference on Information Technology (ACIT)
- 2018

This paper summarizes recent techniques and methods for graph representation learning and compares them, motivating the need to evaluate existing methods in terms of methodology and technique.

### GraphSAD: Learning Graph Representations

- Computer Science
- 2020

This paper proposes to disentangle graph structure and node attributes into two distinct sets of representations, and such disentanglement can be done in either the input or the embedding space.

### A Time-Aware Inductive Representation Learning Strategy for Heterogeneous Graphs

- Computer Science
- 2019

This paper extends and improves existing models by enabling an edge-based transformation procedure to learn embeddings for different relations in heterogeneous graphs, and shows that temporal dynamics in social networks can be captured by incorporating a sequential model that learns more expressive representations.

### A Structural Graph Representation Learning Framework

- Computer Science, WSDM
- 2020

This work formulates higher-order network representation learning and describes a general framework, HONE, for learning such structural node embeddings from networks via the subgraph patterns (network motifs, graphlet orbits/positions) in a node's neighborhood.

### Walklets: Multiscale Graph Embeddings for Interpretable Network Classification

- Computer Science, ArXiv
- 2016

These representations clearly encode multiscale vertex relationships in a continuous vector space suitable for multi-label classification problems; they outperform recent methods based on neural matrix factorization and scale to graphs with millions of vertices and edges.
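The multiscale idea behind Walklets can be illustrated with a tiny hypothetical helper (not from the paper's code) that pairs each node in a random walk with the node a fixed number of steps later, which approximates sampling from that power of the transition matrix:

```python
def walklet_pairs(walk, scale):
    """Hypothetical helper for Walklets-style multiscale sampling: at a given
    scale, pair each node with the node `scale` steps later in the walk,
    approximating samples from the scale-step transition matrix."""
    return [(walk[i], walk[i + scale]) for i in range(len(walk) - scale)]
```

Training a separate embedding per scale from these pairs is what makes the resulting representations interpretable at different neighborhood sizes.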

### Deep Neural Networks for Learning Graph Representations

- Computer Science, AAAI
- 2016

A novel model for learning graph representations, which generates a low-dimensional vector representation for each vertex by capturing the graph's structural information directly, and which outperforms other state-of-the-art models in such tasks.

### Network representation learning: A macro and micro view

- Computer Science, AI Open
- 2021

## References

Showing 1-10 of 33 references.

### LINE: Large-scale Information Network Embedding

- Computer Science, WWW
- 2015

A novel network embedding method called "LINE," which is suitable for arbitrary types of information networks: undirected, directed, and/or weighted. It optimizes a carefully designed objective function that preserves both the local and global network structures.

### DeepWalk: online learning of social representations

- Computer Science, KDD
- 2014

DeepWalk is an online learning algorithm that builds useful incremental results and is trivially parallelizable, which makes it suitable for a broad class of real-world applications such as network classification and anomaly detection.
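DeepWalk's corpus-generation stage, truncated random walks started from every node, can be sketched as follows; this is a simplified illustration only (the skip-gram training stage that consumes the walks is omitted):

```python
import random

def random_walks(adj, num_walks=10, walk_len=5, seed=0):
    """Sketch of DeepWalk-style corpus generation: truncated random walks
    started from every node. `adj` maps each node to a list of neighbors."""
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in adj:
            walk = [start]
            while len(walk) < walk_len:
                neighbors = adj[walk[-1]]
                if not neighbors:               # dead end: truncate the walk
                    break
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks
```

Because each walk is generated independently, the loop parallelizes trivially, which is the property the summary above highlights.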

### Distributed large-scale natural graph factorization

- Computer Science, Mathematics, WWW
- 2013

This work proposes a novel factorization technique that relies on partitioning a graph so as to minimize the number of neighboring vertices, rather than edges, across partitions; the decomposition is based on a streaming algorithm.

### Learning Deep Representations for Graph Clustering

- Computer Science, AAAI
- 2014

This work proposes a simple method that first learns a nonlinear embedding of the original graph with a stacked autoencoder and then runs the k-means algorithm on the embedding to obtain the clustering result; it significantly outperforms conventional spectral clustering.

### Relational learning via latent social dimensions

- Computer Science, KDD
- 2009

This work proposes to extract latent social dimensions based on network information and then utilize them as features for discriminative learning; the approach outperforms representative relational learning methods based on collective inference, especially when few labeled data are available.

### ArnetMiner: extraction and mining of academic social networks

- Computer Science, KDD
- 2008

The architecture and main features of the ArnetMiner system, which aims at extracting and mining academic social networks, are described and a unified modeling approach to simultaneously model topical aspects of papers, authors, and publication venues is proposed.

### Visualizing Data using t-SNE

- Computer Science
- 2008

A new technique called t-SNE that visualizes high-dimensional data by giving each datapoint a location in a two- or three-dimensional map; it is a variation of Stochastic Neighbor Embedding that is much easier to optimize and produces significantly better visualizations by reducing the tendency to crowd points together in the center of the map.

### Neural Word Embedding as Implicit Matrix Factorization

- Computer Science, NIPS
- 2014

It is shown that using a sparse Shifted Positive PMI word-context matrix to represent words improves results on two word-similarity tasks and one of two analogy tasks, and it is conjectured that this stems from the weighted nature of SGNS's factorization.

### Extracting semantic representations from word co-occurrence statistics: stop-lists, stemming, and SVD

- Computer Science, Behavior Research Methods
- 2012

This article investigates three further factors, namely the application of stop-lists, word stemming, and dimensionality reduction using singular value decomposition (SVD), that have been used to provide improved performance elsewhere; it also introduces an additional semantic task and explores the advantages of using a much larger corpus.

### Normalized cuts and image segmentation

- Computer Science, Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition
- 1997

This work treats image segmentation as a graph partitioning problem and proposes a novel global criterion, the normalized cut, for segmenting the graph, which measures both the total dissimilarity between the different groups and the total similarity within the groups.
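Minimizing the normalized cut leads to the generalized eigenproblem (D - W)y = λDy; a minimal sketch of the standard two-way spectral relaxation, solved here via the symmetric normalized Laplacian (the function name and thresholding at zero are our simplifications):

```python
import numpy as np

def normalized_cut_bipartition(W):
    """Two-way spectral relaxation of the normalized cut: threshold the
    second-smallest generalized eigenvector of (D - W) y = lambda * D * y,
    computed via the symmetric normalized Laplacian."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_sym = np.eye(len(d)) - D_inv_sqrt @ W @ D_inv_sqrt
    _, vecs = np.linalg.eigh(L_sym)              # eigenvalues in ascending order
    fiedler = D_inv_sqrt @ vecs[:, 1]            # second-smallest eigenvector
    return fiedler >= 0                          # sign gives the two groups
```

On a graph with two tightly connected groups joined by a weak edge, thresholding this eigenvector recovers the two groups; recursing on each side extends it to multi-way segmentation.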