Deep Learning for Learning Graph Representations

@article{Zhu2019DeepLF,
  title={Deep Learning for Learning Graph Representations},
  author={Wenwu Zhu and Xin Wang and Peng Cui},
  journal={ArXiv},
  year={2019},
  volume={abs/2001.00293}
}
Mining graph data has become a popular research topic in computer science and has been widely studied in both academia and industry, given the increasing amount of network data in recent years. However, the sheer volume of network data poses great challenges for efficient analysis. This motivates graph representation learning, which maps a graph into a low-dimensional vector space while preserving the original graph structure and supporting graph inference. The investigation on efficient…
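
As a toy illustration of this idea (an illustrative sketch only, not anything taken from the survey itself), the snippet below maps the nodes of a small hand-built graph into a two-dimensional vector space by factorizing the adjacency matrix with truncated SVD; the graph, its size, and the embedding dimension are assumptions made for the example.

# Minimal sketch (illustrative, not the survey's method): map nodes of a toy
# graph to low-dimensional vectors by factorizing the adjacency matrix, so that
# structurally similar nodes end up close in the embedding space.
import numpy as np
from sklearn.decomposition import TruncatedSVD

# Toy graph: two triangles (0-1-2 and 3-4-5) joined by the edge 2-3.
edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3), (2, 3)]
n = 6
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0

# 2-dimensional node embeddings from a truncated SVD of the adjacency matrix.
Z = TruncatedSVD(n_components=2).fit_transform(A)

# Nodes inside the same triangle should be closer than nodes across triangles.
print(np.linalg.norm(Z[0] - Z[1]), np.linalg.norm(Z[0] - Z[4]))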

Citations

Weakly-supervised learning for community detection based on graph convolution in attributed networks
TLDR
A weakly-supervised learning method based on GCN for community detection in attributed networks that integrates GCN with label propagation, where the latter constructs a balanced label set so that underlying community structures can be uncovered from both topology and attribute information.
Unsupervised learning for community detection in attributed networks based on graph convolutional network
TLDR
Inspired by the message-passing mechanism of GCN and the local self-organizing property of community structure, a label sampling model and GCN are integrated into an unsupervised learning framework to uncover underlying community structures by fusing topology and attribute information.
Graph Representation Learning via Contrasting Cluster Assignments
  • Chunyang Zhang, Hongyu Yao, C. L. Philip Chen, Yuena Lin
  • Computer Science
    ArXiv
  • 2021
TLDR
A novel unsupervised graph representation model based on contrasting cluster assignments, called GRCCA, which keeps a balanced aggregation of local and global information and gains insight into the elusive associations between nodes beyond graph topology.
Explainable Automated Graph Representation Learning with Hyperparameter Importance
TLDR
An explainable AutoML approach for graph representation (e-AutoGR) is proposed, which utilizes explainable graph features during performance estimation and learns, through a non-linear decorrelated weighting regression, decorrelated importance weights that capture how different hyperparameters affect model performance.
Understanding in Artificial Intelligence
TLDR
Shows how progress has been made in benchmark development to measure the understanding capabilities of AI methods, and reviews how current methods develop understanding capabilities.
Graph Neural Networks: Taxonomy, Advances and Trends
TLDR
A novel taxonomy for graph neural networks is provided, and up to 400 relevant works are referenced to show the panorama of graph neural networks.

References

Showing 1-10 of 93 references
GraRep: Learning Graph Representations with Global Structural Information
TLDR
A novel model for learning vertex representations of weighted graphs that integrates global structural information of the graph into the learning process and significantly outperforms other state-of-the-art methods in such tasks.
Structural Deep Embedding for Hyper-Networks
TLDR
It is theoretically proven that any linear similarity metric in the embedding space commonly used in existing methods cannot maintain the indecomposability property of hyper-networks, and thus a new deep model is proposed to realize a non-linear tuplewise similarity function while preserving both local and global proximities in the formed embedding space.
LINE: Large-scale Information Network Embedding
TLDR
A novel network embedding method called LINE, which is suitable for arbitrary types of information networks (undirected, directed, and/or weighted) and optimizes a carefully designed objective function that preserves both the local and global network structures.
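
As a hedged sketch of the kind of edge-wise objective such an embedding method optimizes (a first-order-proximity style objective trained with negative sampling; the toy graph, embedding dimension, learning rate, and iteration count are illustrative assumptions, not LINE's reference implementation):

# Hedged sketch of a LINE-style first-order proximity objective: for each edge
# (i, j), push sigma(u_i . u_j) toward 1; for a sampled random node, toward 0.
import numpy as np

rng = np.random.default_rng(0)
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5), (5, 3)]
n, dim, lr = 6, 2, 0.05
U = rng.normal(scale=0.1, size=(n, dim))  # node embedding vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(2000):
    i, j = edges[rng.integers(len(edges))]  # positive sample: an observed edge
    k = rng.integers(n)                     # negative sample: a random node (toy choice)
    ui, uj, uk = U[i].copy(), U[j].copy(), U[k].copy()
    # Gradient step on -log sigma(u_i.u_j) - log sigma(-u_i.u_k)
    g_pos = 1.0 - sigmoid(ui @ uj)
    g_neg = -sigmoid(ui @ uk)
    U[i] += lr * (g_pos * uj + g_neg * uk)
    U[j] += lr * g_pos * ui
    U[k] += lr * g_neg * ui

# Connected nodes should now score higher than unconnected ones.
print(sigmoid(U[0] @ U[1]), sigmoid(U[0] @ U[4]))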
Learning Deep Representations for Graph Clustering
TLDR
This work proposes a simple method that first learns a nonlinear embedding of the original graph with a stacked autoencoder and then runs the k-means algorithm on the embedding to obtain the clustering result, significantly outperforming conventional spectral clustering.
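
A minimal sketch of this two-stage recipe (autoencoder embedding followed by k-means), assuming a tiny synthetic graph with two planted communities; the normalization, layer sizes, and training schedule are illustrative choices, not the authors' code:

# Hedged sketch: learn a nonlinear embedding of a toy graph with an autoencoder,
# then run k-means on the embedding to obtain a clustering.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

# Toy graph: two 5-node cliques joined by a single edge (2 planted communities).
n = 10
A = np.zeros((n, n))
A[:5, :5] = 1.0
A[5:, 5:] = 1.0
A[4, 5] = A[5, 4] = 1.0
np.fill_diagonal(A, 0.0)

# Row-normalized similarity matrix fed to the autoencoder (one common choice).
D_inv = np.diag(1.0 / A.sum(axis=1))
S = torch.tensor(D_inv @ A, dtype=torch.float32)

# Stacked autoencoder: the encoder maps each row of S to a low-dimensional code.
encoder = nn.Sequential(nn.Linear(n, 8), nn.ReLU(), nn.Linear(8, 2))
decoder = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, n))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-2)

for _ in range(500):
    opt.zero_grad()
    Z = encoder(S)
    loss = nn.functional.mse_loss(decoder(Z), S)  # reconstruction objective
    loss.backward()
    opt.step()

# k-means on the learned embedding gives the clustering result.
labels = KMeans(n_clusters=2, n_init=10).fit_predict(encoder(S).detach().numpy())
print(labels)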
Deep Recursive Network Embedding with Regular Equivalence
TLDR
This work proposes a new approach named Deep Recursive Network Embedding (DRNE) to learn network embeddings with regular equivalence, together with a layer-normalized LSTM that represents each node by recursively aggregating the representations of its neighbors.
node2vec: Scalable Feature Learning for Networks
TLDR
In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
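
A hedged sketch of a node2vec-style biased (second-order) random walk follows; the toy graph, the return/in-out parameters p and q, and the walk settings are illustrative assumptions, not the reference implementation. The resulting walks would then be fed to a skip-gram model (e.g., word2vec) to learn the node embeddings.

# Hedged sketch of biased random walks in the spirit of node2vec.
import random
from collections import defaultdict

edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5), (5, 3)]
adj = defaultdict(list)
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

def biased_walk(start, length, p=1.0, q=0.5):
    """Walk where the return parameter p and in-out parameter q bias the next step."""
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        if len(walk) == 1:
            walk.append(random.choice(adj[cur]))
            continue
        prev = walk[-2]
        weights = []
        for nxt in adj[cur]:
            if nxt == prev:            # returning to the previous node
                weights.append(1.0 / p)
            elif nxt in adj[prev]:     # staying close to prev (BFS-like exploration)
                weights.append(1.0)
            else:                      # moving outward (DFS-like exploration)
                weights.append(1.0 / q)
        walk.append(random.choices(adj[cur], weights=weights, k=1)[0])
    return walk

walks = [biased_walk(v, 10) for v in list(adj) for _ in range(5)]
print(walks[0])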
Higher order learning with graphs
TLDR
It is shown that various formulations of the semi-supervised and the unsupervised learning problem on hypergraphs result in the same graph theoretic problem and can be analyzed using existing tools.
DepthLGP: Learning Embeddings of Out-of-Sample Nodes in Dynamic Networks
TLDR
This work designs a high-order Laplacian Gaussian process (hLGP) to encode network properties, which permits fast and scalable inference, and designs a deep neural network to learn a nonlinear transformation from latent states of the hLGP to node embeddings.
Structural Deep Network Embedding
TLDR
This paper proposes a Structural Deep Network Embedding method, namely SDNE: a semi-supervised deep model with multiple layers of non-linear functions that can capture the highly non-linear network structure, and that exploits first-order and second-order proximity jointly to preserve the network structure.
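
A minimal sketch in the spirit of this joint objective, assuming a tiny synthetic graph: an autoencoder reconstructs each node's adjacency row (second-order proximity) while a Laplacian-style term pulls the embeddings of linked nodes together (first-order proximity). The layer sizes and the weights alpha and beta are illustrative assumptions, not SDNE's reference implementation.

# Hedged sketch of a joint first-/second-order proximity objective.
import numpy as np
import torch
import torch.nn as nn

# Toy graph: two 4-node cliques bridged by one edge.
n = 8
A = np.zeros((n, n))
A[:4, :4] = 1.0
A[4:, 4:] = 1.0
A[3, 4] = A[4, 3] = 1.0
np.fill_diagonal(A, 0.0)
X = torch.tensor(A, dtype=torch.float32)

alpha, beta = 0.2, 5.0                      # trade-off and reconstruction penalty
B = torch.where(X > 0, torch.full_like(X, beta), torch.ones_like(X))

encoder = nn.Sequential(nn.Linear(n, 16), nn.ReLU(), nn.Linear(16, 2))
decoder = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, n))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-2)

for _ in range(500):
    opt.zero_grad()
    Y = encoder(X)                                       # node embeddings
    second_order = ((decoder(Y) - X) * B).pow(2).sum()   # weighted reconstruction of adjacency rows
    diff = Y.unsqueeze(0) - Y.unsqueeze(1)               # pairwise embedding differences
    first_order = (X * diff.pow(2).sum(-1)).sum()        # pull linked nodes together
    (second_order + alpha * first_order).backward()
    opt.step()

print(encoder(X).detach().numpy().round(2))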
Cauchy Graph Embedding
TLDR
This paper proposes a novel Cauchy graph embedding which preserves the similarity relationships of the original data in the embedded space via a new objective, and shows the usefulness of this new type of embedding on both synthetic and real-world benchmark data sets.