Attributed Graph Clustering: A Deep Attentional Embedding Approach

@inproceedings{Wang2019AttributedGC,
  title={Attributed Graph Clustering: A Deep Attentional Embedding Approach},
  author={Chun Wang and Shirui Pan and Ruiqi Hu and Guodong Long and Jing Jiang and Chengqi Zhang},
  booktitle={International Joint Conference on Artificial Intelligence},
  year={2019}
}
Graph clustering is a fundamental task which discovers communities or groups in networks. […] By employing an attention network to capture the importance of the neighboring nodes to a target node, our DAEGC algorithm encodes the topological structure and node content in a graph to a compact representation, on which an inner product decoder is trained to reconstruct the graph structure.
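
A minimal, hedged sketch of that encoder-decoder pairing follows (Python/NumPy is chosen purely for illustration; the single attention layer, the tanh scoring, and all shapes are assumptions rather than the paper's exact architecture): a graph attention layer aggregates each node's neighbors with learned importance weights, and an inner-product decoder maps the resulting embeddings back to edge probabilities.

import numpy as np

def neighbour_softmax(scores, adj):
    # normalize attention scores over each node's neighbours (self-loops included)
    mask = (adj + np.eye(len(adj))) > 0
    scores = np.where(mask, scores, -1e9)
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def attention_layer(X, adj, W, a_src, a_dst):
    # X: node content (n x d), adj: adjacency (n x n), W: projection (d x d')
    H = X @ W                                              # transform node content
    logits = H @ a_src[:, None] + (H @ a_dst[:, None]).T   # pairwise importance logits
    alpha = neighbour_softmax(np.tanh(logits), adj)        # importance of neighbours to each target node
    return alpha @ H                                       # attention-weighted aggregation -> embedding Z

def inner_product_decoder(Z):
    # reconstruct edge probabilities A_hat = sigmoid(Z Z^T), trained against the true adjacency
    return 1.0 / (1.0 + np.exp(-Z @ Z.T))

rng = np.random.default_rng(0)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 8))                                # toy node attributes
Z = attention_layer(X, adj, rng.normal(size=(8, 4)) * 0.1,
                    rng.normal(size=4) * 0.1, rng.normal(size=4) * 0.1)
A_hat = inner_product_decoder(Z)
print(A_hat.round(2))

In the full model the encoder and decoder are trained jointly (the paper also optimizes a clustering-oriented objective); the snippet above is only a single untrained forward pass.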

Citations

Attributed graph clustering with multi-task embedding learning.

Graph Learning for Attributed Graph Clustering

This work develops a shallow model that learns a fine-grained graph from smoothed data, sufficiently exploiting both node attributes and topology information; the proposed method is linear in the number of nodes n and surpasses many recent deep learning approaches.

A Deep Graph Structured Clustering Network

A deep graph structured clustering network is proposed that applies deep clustering to graph-structured data and achieves superior performance over state-of-the-art models.

Adaptive Graph Encoder for Attributed Graph Embedding

The proposed Adaptive Graph Encoder (AGE) employs an adaptive encoder that iteratively strengthens the filtered features for better node embeddings; experimental results show that AGE consistently and considerably outperforms state-of-the-art graph embedding methods on node clustering and link prediction tasks.

CAGNN: Cluster-Aware Graph Neural Networks for Unsupervised Graph Representation Learning

A novel cluster-aware graph neural network (CAGNN) model for unsupervised graph representation learning with self-supervised techniques is proposed, gaining over 7% improvement in node clustering accuracy over the state of the art.

Structural Deep Clustering Network

A Structural Deep Clustering Network (SDCN) is proposed to integrate structural information into deep clustering, with a delivery operator that transfers the representations learned by the autoencoder to the corresponding GCN layer, and a dual self-supervised mechanism that unifies these two different deep neural architectures and guides the update of the whole model.
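
For orientation, my recollection of SDCN's delivery operator (treat the exact symbols and the mixing coefficient as assumptions): at every layer the autoencoder representation $H^{(l)}$ is blended into the GCN representation $Z^{(l)}$ before the next graph convolution,

$$\tilde{Z}^{(l)} = (1-\epsilon)\,Z^{(l)} + \epsilon\,H^{(l)}, \qquad Z^{(l+1)} = \phi\!\left(\tilde{D}^{-\frac{1}{2}}\tilde{A}\tilde{D}^{-\frac{1}{2}}\,\tilde{Z}^{(l)}\,W^{(l)}\right),$$

with $\epsilon \approx 0.5$, while the dual self-supervised mechanism points both branches at a shared DEC-style target distribution.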

Deep Adaptive Graph Clustering via von Mises-Fisher Distributions

Deep Adaptive Graph Clustering via von Mises-Fisher distributions (DAGC) is proposed, which can simultaneously improve intra-cluster compactness and inter-cluster heterogeneity.

Deep Graph Clustering via Dual Correlation Reduction

This work proposes a novel self-supervised deep graph clustering method termed Dual Correlation Reduction Network (DCRN), which reduces information correlation in a dual manner and introduces a propagation regularization term that enables the network to capture long-distance information with a shallow network structure.
...

References

SHOWING 1-10 OF 34 REFERENCES

MGAE: Marginalized Graph Autoencoder for Graph Clustering

A marginalized graph convolutional network is proposed that corrupts network node content, allowing node content to interact with network features, and marginalizes the corrupted features in a graph autoencoder context to learn graph feature representations.

Deep Graph Clustering in Social Network

The deep attributes residue graph algorithm (DARG), a novel model for learning deep graph representations that can discover clusters by taking node relevance into consideration, is presented.

Deep Neural Networks for Learning Graph Representations

A novel model for learning graph representations is proposed, which generates a low-dimensional vector representation for each vertex by capturing the graph structural information directly, and outperforms other state-of-the-art models in such tasks.

Learning Deep Representations for Graph Clustering

This work proposes a simple method which first learns a nonlinear embedding of the original graph with a stacked autoencoder and then runs the $k$-means algorithm on the embedding to obtain the clustering result; this approach significantly outperforms conventional spectral clustering.
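
A toy sketch of this two-stage recipe is below (PyTorch and scikit-learn are used purely for illustration; the tiny layer sizes, the end-to-end training, and the row-normalized adjacency input are assumptions of this sketch rather than the paper's exact setup, which, as I recall, trains sparse autoencoder layers one at a time on a normalized similarity matrix).

import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

# toy graph: two triangles joined by a single edge
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
S = A / A.sum(axis=1, keepdims=True)          # row-normalized similarity fed to the autoencoder

model = nn.Sequential(                        # small stacked autoencoder with a 2-d bottleneck
    nn.Linear(6, 4), nn.ReLU(),
    nn.Linear(4, 2),                          # 2-d bottleneck: the nonlinear embedding
    nn.ReLU(),
    nn.Linear(2, 4), nn.ReLU(),
    nn.Linear(4, 6), nn.Sigmoid(),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
x = torch.tensor(S, dtype=torch.float32)
for _ in range(500):                          # reconstruct each node's similarity row
    opt.zero_grad()
    nn.functional.mse_loss(model(x), x).backward()
    opt.step()

with torch.no_grad():
    z = model[:3](x).numpy()                  # encoder half, up to and including the bottleneck
labels = KMeans(n_clusters=2, n_init=10).fit_predict(z)
print(labels)                                 # cluster labels for the six nodes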

Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
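
For reference, a single-head GAT layer computes attention coefficients over a node's neighbourhood $\mathcal{N}_i$ and the updated node feature as

$$e_{ij} = \mathrm{LeakyReLU}\!\left(\mathbf{a}^{\top}\big[\mathbf{W}\vec{h}_i \,\Vert\, \mathbf{W}\vec{h}_j\big]\right), \qquad \alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k\in\mathcal{N}_i}\exp(e_{ik})}, \qquad \vec{h}_i' = \sigma\!\Big(\sum_{j\in\mathcal{N}_i}\alpha_{ij}\,\mathbf{W}\vec{h}_j\Big),$$

with multi-head attention obtained by concatenating (or averaging) several such heads.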

Community Preserving Network Embedding

A novel Modularized Nonnegative Matrix Factorization (M-NMF) model is proposed to incorporate the community structure into network embedding, jointly optimizing an NMF-based representation learning model and a modularity-based community detection model in a unified framework, which enables the learned node representations to preserve both the microscopic and community structures.
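
The community-detection half of M-NMF is built around the standard modularity measure; for a partition assigning nodes $i, j$ to communities $c_i, c_j$,

$$Q = \frac{1}{2m}\sum_{i,j}\left(A_{ij} - \frac{k_i k_j}{2m}\right)\delta(c_i, c_j),$$

where $k_i$ is the degree of node $i$ and $m$ the number of edges; M-NMF maximizes a matrix form of this term jointly with the NMF reconstruction objective so that the embeddings respect community structure.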

Learning Graph Embedding With Adversarial Training Methods

This article presents a novel adversarially regularized framework for graph embedding that employs a graph convolutional network as an encoder to embed the topological information and node content into a vector representation, from which a graph decoder is further built to reconstruct the input graph.
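
The adversarial regularization referred to here is, as far as I recall, the usual GAN-style game played in the embedding space: a discriminator $D$ learns to separate samples drawn from a prior $p(\mathbf{z})$ from encoder outputs $G(\mathbf{x})$, while the encoder learns to fool it,

$$\min_{G}\max_{D}\;\mathbb{E}_{\mathbf{z}\sim p(\mathbf{z})}\big[\log D(\mathbf{z})\big] + \mathbb{E}_{\mathbf{x}}\big[\log\big(1 - D(G(\mathbf{x}))\big)\big],$$

added on top of the graph reconstruction loss so that the learned embedding distribution matches the chosen prior.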

A Comprehensive Survey on Graph Neural Networks

This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides state-of-the-art GNNs into four categories: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.

Improved Deep Embedded Clustering with Local Structure Preservation

The Improved Deep Embedded Clustering (IDEC) algorithm is proposed, which manipulates the feature space to scatter data points using a clustering loss as guidance and can jointly optimize cluster label assignment and learn features that are suitable for clustering with local structure preservation.
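
Concretely, the clustering loss in question is the DEC-style self-training objective (quoted from memory, so treat the exact form as a sketch): soft assignments $q_{ij}$ of embedded point $z_i$ to cluster centre $\mu_j$ are sharpened into a target distribution $p_{ij}$, and IDEC minimizes the KL term jointly with the autoencoder's reconstruction loss,

$$q_{ij} = \frac{\big(1+\lVert z_i-\mu_j\rVert^2\big)^{-1}}{\sum_{j'}\big(1+\lVert z_i-\mu_{j'}\rVert^2\big)^{-1}}, \qquad p_{ij} = \frac{q_{ij}^2\big/\sum_i q_{ij}}{\sum_{j'}\big(q_{ij'}^2\big/\sum_i q_{ij'}\big)}, \qquad L = L_{\mathrm{rec}} + \gamma\,\mathrm{KL}\!\left(P\,\Vert\,Q\right),$$

where keeping the reconstruction term $L_{\mathrm{rec}}$ is what preserves the local structure of the data while the cluster assignments are refined.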

Network Representation Learning with Rich Text Information

By proving that DeepWalk, a state-of-the-art network representation method, is actually equivalent to matrix factorization (MF), this work proposes text-associated DeepWalk (TADW), which incorporates text features of vertices into network representation learning under the framework of matrix factorization.
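
As far as I recall the TADW formulation (so take the exact regularization as an assumption), the text features enter as a factor $T$ in a low-rank factorization of the DeepWalk-derived matrix $M$:

$$\min_{W,\,H}\;\big\lVert M - W^{\top} H\, T\big\rVert_F^2 + \frac{\lambda}{2}\left(\lVert W\rVert_F^2 + \lVert H\rVert_F^2\right),$$

with the final vertex representations read off from $W$ (concatenated, if I remember correctly, with $HT$).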