Corpus ID: 211678297

Ego-based Entropy Measures for Structural Representations

@article{Dasoulas2020EgobasedEM,
  title={Ego-based Entropy Measures for Structural Representations},
  author={George Dasoulas and Giannis Nikolentzos and Kevin Scaman and Aladin Virmaux and Michalis Vazirgiannis},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.00553}
}
In complex networks, nodes that share similar structural characteristics often exhibit similar roles (e.g., the type of users in a social network or the hierarchical position of employees in a company). To leverage this relationship, a growing body of literature has proposed latent representations that identify structurally equivalent nodes. However, most existing methods incur high time and space complexity. In this paper, we propose VNEstruct, a simple approach for generating low-dimensional…
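The abstract's core idea, describing a node by entropies of its ego networks, can be sketched in a few lines. This is a deliberately simplified illustration, not the paper's exact construction: here each node's feature is the Shannon entropy of the degree distribution inside its k-hop ego network, and all function names are my own.

```python
import math
from collections import Counter, deque

def ego_nodes(adj, root, radius):
    """Nodes within `radius` hops of `root`, found by BFS."""
    dist = {root: 0}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        if dist[u] == radius:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return set(dist)

def ego_entropy(adj, root, radius):
    """Shannon entropy of the degree distribution of the ego network."""
    nodes = ego_nodes(adj, root, radius)
    degs = [sum(1 for v in adj[u] if v in nodes) for u in nodes]
    counts = Counter(degs)
    n = len(degs)
    # sort for a deterministic summation order
    return -sum(c / n * math.log2(c / n) for c in sorted(counts.values()))

# A 6-node path graph: endpoints and interior nodes play different roles.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
feat = {u: tuple(ego_entropy(adj, u, r) for r in (1, 2)) for u in adj}
# Structurally equivalent nodes receive identical features:
assert feat[0] == feat[5] and feat[2] == feat[3]
```

On the path graph the two endpoints (and likewise the two central nodes) are structurally equivalent, and they indeed get identical entropy vectors, which is the behaviour a structural (rather than proximity-based) representation should have.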
Bridging the Gap between von Neumann Graph Entropy and Structural Information: Theory and Applications
TLDR
It is proved for the first time that the entropy gap is between 0 and log2 e in any undirected unweighted graph, and it is certified that the structural information is a good approximation of VNGE that achieves provable accuracy, scalability, and interpretability simultaneously.
On the Similarity between von Neumann Graph Entropy and Structural Information: Interpretation, Computation, and Applications
TLDR
It is proved for the first time that the entropy gap is between 0 and log2 e in any undirected unweighted graph, and it is certified that the structural information is a good approximation of the von Neumann graph entropy, achieving provable accuracy, scalability, and interpretability simultaneously.
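The bound stated in these two entries can be checked numerically. The sketch below (my own code, using the standard definitions rather than anything specific to those papers) computes the von Neumann graph entropy from the eigenvalues of the trace-normalized Laplacian, the one-dimensional structural information from the degree distribution, and verifies that their gap falls in [0, log2 e] on two small graphs:

```python
import math
import numpy as np

def entropy_gap(A):
    """Gap between 1-D structural information and von Neumann graph entropy.

    A: symmetric 0/1 adjacency matrix of an undirected, unweighted graph.
    """
    deg = A.sum(axis=1)
    L = np.diag(deg) - A                        # combinatorial Laplacian
    lam = np.linalg.eigvalsh(L / deg.sum())     # eigenvalues sum to 1
    vnge = -sum(x * math.log2(x) for x in lam if x > 1e-12)
    p = deg / deg.sum()                         # degree distribution d_i / 2m
    h1 = -sum(x * math.log2(x) for x in p if x > 0)
    return h1 - vnge

# Complete graph K4 and a 5-node star.
K4 = np.ones((4, 4)) - np.eye(4)
star = np.zeros((5, 5))
star[0, 1:] = star[1:, 0] = 1

for A in (K4, star):
    assert 0 <= entropy_gap(A) <= math.log2(math.e)
```

For K4 the normalized Laplacian spectrum is {0, 1/3, 1/3, 1/3}, so the VNGE is log2 3 ≈ 1.585 while the structural information is log2 4 = 2, giving a gap of about 0.415, well inside the stated interval.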

References

Showing 1–10 of 32 references
Learning Structural Node Embeddings via Diffusion Wavelets
TLDR
GraphWave is developed, a method that represents each node's network neighborhood via a low-dimensional embedding by leveraging heat wavelet diffusion patterns. It is mathematically proven that nodes with similar network neighborhoods have similar GraphWave embeddings even when these nodes reside in very different parts of the network, and the method scales linearly with the number of edges.
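The heat-wavelet construction behind GraphWave can be sketched compactly: each node's wavelet is a column of the heat kernel e^{-sL}, and the embedding samples the empirical characteristic function of that column's entries. This is a minimal numpy sketch under my own simplifications (a single scale s, a fixed grid of sample points), not the authors' implementation:

```python
import numpy as np

def graphwave_embedding(A, s=1.0, t=np.arange(0.0, 2.0, 0.2)):
    """Heat-wavelet characteristic-function embedding (simplified sketch).

    A: symmetric adjacency matrix. The wavelet of node a is column a of
    e^{-s L}; its entries are summarised by phi_a(t) = mean_m exp(i t psi_ma),
    evaluated on the sample grid t.
    """
    deg = A.sum(axis=1)
    L = np.diag(deg) - A
    lam, U = np.linalg.eigh(L)
    Psi = U @ np.diag(np.exp(-s * lam)) @ U.T          # heat kernel e^{-sL}
    phi = np.exp(1j * t[:, None, None] * Psi[None]).mean(axis=1)  # (T, n)
    return np.concatenate([phi.real, phi.imag]).T       # (n, 2T)

# 6-node path graph: mirror-symmetric nodes should embed identically.
n = 6
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1
emb = graphwave_embedding(A)
assert np.allclose(emb[0], emb[5]) and np.allclose(emb[2], emb[3])
```

Because the characteristic function averages over the multiset of wavelet entries, it is invariant to the graph symmetry that swaps the two halves of the path, which is why the mirror-image nodes land on (numerically) identical embeddings.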
struc2vec: Learning Node Representations from Structural Identity
TLDR
Struc2vec, a novel and flexible framework for learning latent representations of the structural identity of nodes, is presented; it improves performance on classification tasks that depend more on structural identity.
Deep Recursive Network Embedding with Regular Equivalence
TLDR
This work proposes a new approach named Deep Recursive Network Embedding (DRNE) to learn network embeddings with regular equivalence, using a layer-normalized LSTM to represent each node by recursively aggregating the representations of its neighborhood.
node2vec: Scalable Feature Learning for Networks
TLDR
In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
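The biased random walk at the heart of node2vec is easy to state concretely. Below is a short sketch of one second-order walk on an unweighted graph (my own code, without the alias-sampling optimisation the paper uses): given the previous step t → v, a candidate next node x is weighted 1/p if it returns to t, 1 if it stays at distance 1 from t, and 1/q otherwise.

```python
import random

def node2vec_walk(adj, start, length, p=1.0, q=1.0, seed=0):
    """One biased second-order random walk on an unweighted graph."""
    rng = random.Random(seed)
    walk = [start]
    while len(walk) < length:
        v = walk[-1]
        nbrs = adj[v]
        if not nbrs:
            break
        if len(walk) == 1:                      # first step: uniform
            walk.append(rng.choice(nbrs))
            continue
        t = walk[-2]                            # previous node
        weights = [1 / p if x == t              # return to t
                   else 1.0 if x in adj[t]      # distance 1 from t
                   else 1 / q                   # move outward
                   for x in nbrs]
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
walk = node2vec_walk(adj, 0, 10, p=0.25, q=4.0)
assert len(walk) == 10 and all(b in adj[a] for a, b in zip(walk, walk[1:]))
```

Small q biases the walk outward (DFS-like, capturing structural context), while small p keeps it local (BFS-like, capturing homophily); the walks are then fed to a skip-gram model, exactly as in DeepWalk.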
Representation Learning on Graphs: Methods and Applications
TLDR
A conceptual review of key advancements in representation learning on graphs, including matrix factorization-based methods, random-walk-based algorithms, and graph neural networks, is provided.
RolX: structural role extraction & mining in large graphs
TLDR
This paper proposes RolX (Role eXtraction), a scalable (linear in the number of edges), unsupervised learning approach for automatically extracting structural roles from general network data, and compares network role discovery with network community discovery.
Hierarchical Graph Representation Learning with Differentiable Pooling
TLDR
DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.
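The DiffPool coarsening step reduces to two matrix products: pooled features X' = Sᵀ Z and coarsened adjacency A' = Sᵀ A S, where S is a learned soft cluster-assignment matrix. A minimal numpy sketch, with a random (untrained) assignment matrix standing in for the learned one:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, d = 6, 2, 3                        # nodes, clusters, feature dim

A = rng.integers(0, 2, size=(n, n)).astype(float)
A = np.triu(A, 1)
A = A + A.T                              # random symmetric adjacency
Z = rng.random((n, d))                   # node embeddings (stand-in for a GNN's output)
S = rng.random((n, k))
S = S / S.sum(axis=1, keepdims=True)     # soft assignments; each row sums to 1

X_coarse = S.T @ Z                       # pooled cluster features, shape (k, d)
A_coarse = S.T @ A @ S                   # coarsened adjacency, shape (k, k)
assert X_coarse.shape == (k, d) and A_coarse.shape == (k, k)
```

In the actual model S is produced by a GNN and trained end-to-end; stacking several such steps yields the hierarchical representation the entry describes.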
How Powerful are Graph Neural Networks?
TLDR
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive in the class of GNNs.
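The expressiveness argument hinges on whether the neighbourhood aggregator is injective on multisets. A two-line illustration (my own toy example, not from the paper): the neighbour multisets {2, 2} and {2} collapse to the same message under mean aggregation but stay apart under sum aggregation, which is the kind of injectivity the GIN construction relies on.

```python
a = [2.0, 2.0]                      # two neighbours, both with feature 2
b = [2.0]                           # one neighbour with feature 2
mean = lambda xs: sum(xs) / len(xs)

assert mean(a) == mean(b)           # mean cannot tell the multisets apart
assert sum(a) != sum(b)             # sum (as in GIN) separates them
```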
Capsule Graph Neural Network
TLDR
The Capsule Graph Neural Network (CapsGNN), which adopts the concept of capsules to address a weakness of existing GNN-based graph embedding algorithms, has a powerful mechanism for capturing macroscopic properties of the whole graph in a data-driven manner.
Weisfeiler-Lehman Graph Kernels
TLDR
A family of efficient kernels for large graphs with discrete node labels based on the Weisfeiler-Lehman test of isomorphism on graphs that outperform state-of-the-art graph kernels on several graph classification benchmark data sets in terms of accuracy and runtime.
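The colour-refinement iteration underlying these kernels fits in a few lines. The sketch below (my own code; a real WL kernel would additionally accumulate colour histograms across all iterations into a feature vector) starts from node degrees and repeatedly replaces each colour with the pair (own colour, sorted multiset of neighbour colours):

```python
from collections import Counter

def wl_histogram(adj, iters=2):
    """Colour histogram after `iters` rounds of Weisfeiler-Lehman refinement.

    Initial colour is the node degree; each round a node's new colour is the
    pair (own colour, sorted multiset of its neighbours' colours).
    """
    col = {u: len(adj[u]) for u in adj}
    for _ in range(iters):
        col = {u: (col[u], tuple(sorted(col[v] for v in adj[u]))) for u in adj}
    return Counter(col.values())

# Two non-isomorphic trees with the same degree sequence: the degree
# histograms (iteration 0) agree, but a single WL round separates them.
A = {1: [2], 2: [1, 3], 3: [2, 4, 6], 4: [3, 5], 5: [4], 6: [3]}
B = {1: [2], 2: [1, 3, 6], 3: [2, 4], 4: [3, 5], 5: [4], 6: [2]}
assert wl_histogram(A, 0) == wl_histogram(B, 0)
assert wl_histogram(A, 1) != wl_histogram(B, 1)
```

Each refinement round costs time proportional to the number of edges, which is what makes the WL kernels efficient on large graphs.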