Corpus ID: 219558774

Node Embeddings and Exact Low-Rank Representations of Complex Networks

@article{Chanpuriya2020NodeEA,
  title={Node Embeddings and Exact Low-Rank Representations of Complex Networks},
  author={Sudhanshu Chanpuriya and Cameron Musco and Konstantinos Sotiropoulos and Charalampos E. Tsourakakis},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.05592}
}
Low-dimensional embeddings, from classical spectral embeddings to modern neural-net-inspired methods, are a cornerstone in the modeling and analysis of complex networks. Recent work by Seshadhri et al. (PNAS 2020) suggests that such embeddings cannot capture local structure arising in complex networks. In particular, they show that any network generated from a natural low-dimensional model cannot be both sparse and have high triangle density (high clustering coefficient), two hallmark… 
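The exact low-rank representations the paper studies are based on a logistic PCA (LPCA) style factorization: learn factors whose inner products, passed through a sigmoid, reproduce the adjacency matrix exactly after rounding. The following is a minimal illustrative sketch of that idea, not the authors' implementation; the 4-cycle graph, rank, step size, and iteration count are arbitrary demo choices.

```python
import numpy as np

# Sketch of an LPCA-style factorization: learn U, V so that
# sigmoid(U @ V.T) approximates the adjacency matrix A, then round
# at 0.5 to attempt an exact low-rank reconstruction.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lpca_fit(A, rank, steps=5000, lr=0.1, seed=0):
    """Gradient descent on the logistic loss over all entries of A."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(steps):
        R = sigmoid(U @ V.T) - A  # residual = gradient of loss w.r.t. logits
        U, V = U - lr * (R @ V), V - lr * (R.T @ U)
    return U, V

# Adjacency matrix of the 4-cycle (sparse, triangle-free demo graph)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

U, V = lpca_fit(A, rank=3)
A_hat = (sigmoid(U @ V.T) >= 0.5).astype(float)
print("exact reconstruction:", np.array_equal(A_hat, A))
```

Rounding at 0.5 means the reconstruction is exact as soon as every logit has the correct sign, which is a much weaker requirement than driving the loss to zero.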

Exact Representation of Sparse Networks with Symmetric Nonnegative Embeddings

A new bound for the LPCA model is proved in terms of arboricity rather than max degree; this greatly increases the bound’s applicability to many sparse real-world networks.

Implications of sparsity and high triangle density for graph representation learning

This work shows that sparse graphs containing many triangles can be reproduced using an infinite-dimensional inner product model in which the node representations lie on a low-dimensional manifold, evidence against the common perception that triangles imply community structure.

An Attract-Repel Decomposition of Undirected Networks

The attract-repel (AR) decomposition is demonstrated on real social networks, where it can be used to measure the amount of latent homophily and heterophily; applied to co-occurrence networks, it discovers roles in teams and finds substitutable ingredients in recipes.

Low-Rank Representations Towards Classification Problem of Complex Networks

This work studies the performance of low-rank representations of real-life networks on a network classification problem using Euclidean embedding of the vertices of the network.

Similarity-based Link Prediction from Modular Compression of Network Flows

The proposed MapSim, a novel information-theoretic approach to assess node similarities based on modular compression of network flows, demonstrates the potential of compression-based approaches in graph representation learning, with promising applications in other graph learning tasks.

A Multi-source Graph Representation of the Movie Domain for Recommendation Dialogues Analysis

An integrated graph-based structure of multiple resources, enriched with the results of applying graph analytics approaches, is proposed to provide an encompassing view of the domain and of how people talk about it during the recommendation task.

A Hierarchical Block Distance Model for Ultra Low-Dimensional Graph Representations

Surprisingly, the proposed HBDM framework outperforms recent scalable approaches in all considered downstream tasks, and shows superior performance even when imposing ultra-low two-dimensional embeddings, facilitating accurate direct and hierarchy-aware network visualization and interpretation.

A Survey on Graph Representation Learning Methods

An overview is provided of non-GNN graph embedding methods, which are based on techniques such as random walks, temporal point processes, and neural network learning, and of GNN-based methods, which apply deep learning to graph data.

Classic Graph Structural Features Outperform Factorization-Based Graph Embedding Methods on Community Labeling

It is formally proved that popular low-dimensional factorization methods either cannot produce community structure, or can only produce “unstable” communities.

References

SHOWING 1-10 OF 76 REFERENCES

The impossibility of low-rank representations for triangle-rich complex networks

It is mathematically proved that low-dimensional embeddings cannot generate graphs with both low average degree and large clustering coefficients, and any embedding that can successfully create these two properties must have a rank that is nearly linear in the number of vertices.
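The two quantities at stake in this impossibility result, average degree (sparsity) and global clustering coefficient (triangle density), can both be read off the adjacency matrix directly. A small illustrative check (the graph choice is an arbitrary demo):

```python
import numpy as np

# Global clustering coefficient = 3 * (#triangles) / (#wedges), where
# triangles come from trace(A^3)/6 (each triangle yields 6 closed walks)
# and wedges are length-2 paths centered at each node: sum of C(d_i, 2).

def clustering_and_avg_degree(A):
    d = A.sum(axis=1)                      # node degrees
    triangles = np.trace(A @ A @ A) / 6.0
    wedges = (d * (d - 1) / 2.0).sum()
    return 3.0 * triangles / wedges, d.mean()

# Complete graph K4: every pair adjacent, so every wedge closes a triangle
A = np.ones((4, 4)) - np.eye(4)
cc, avg_deg = clustering_and_avg_degree(A)
print(cc, avg_deg)  # 1.0 3.0
```

The impossibility result says these two numbers cannot simultaneously be small (low average degree) and large (high clustering) for graphs generated by a natural low-rank model.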

InfiniteWalk: Deep Network Embeddings as Laplacian Embeddings with a Nonlinearity

Surprisingly, it is found that even simple binary thresholding of the Laplacian pseudoinverse is often competitive, suggesting that the core advancement of recent methods is a nonlinearity on top of the classical spectral embedding approach.
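The thresholding baseline described here can be illustrated in a few lines. This is a sketch of the idea, not the paper's exact pipeline; the path graph and the global-mean threshold are arbitrary demo choices.

```python
import numpy as np

# Sketch of binary thresholding of the Laplacian pseudoinverse: form
# L = D - A, take its Moore-Penrose pseudoinverse, and binarize entries
# against a global threshold (here the matrix mean, which is 0 since the
# rows of the pseudoinverse sum to zero).

A = np.array([[0, 1, 0, 0],   # path graph 0-1-2-3
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
L_pinv = np.linalg.pinv(L)                # Laplacian pseudoinverse
B = (L_pinv > L_pinv.mean()).astype(int)  # binary thresholding
print(B)
```

Even this crude binarization groups the path's two halves together (nodes {0,1} and {2,3}), hinting at why the thresholded pseudoinverse remains competitive as a similarity signal.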

Inductive Representation Learning on Large Graphs

GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.

Deep Neural Networks for Learning Graph Representations

A novel model for learning graph representations, which generates a low-dimensional vector representation for each vertex by capturing the graph structural information directly, and which outperforms other state-of-the-art models in such tasks.

LINE: Large-scale Information Network Embedding

A novel network embedding method called the ``LINE,'' which is suitable for arbitrary types of information networks: undirected, directed, and/or weighted, and optimizes a carefully designed objective function that preserves both the local and global network structures.

Network Embedding as Matrix Factorization: Unifying DeepWalk, LINE, PTE, and node2vec

The NetMF method offers significant improvements over DeepWalk and LINE for conventional network mining tasks and provides the theoretical connections between skip-gram based network embedding algorithms and the theory of graph Laplacian.

Random Graphs and Complex Networks

  • R. van der Hofstad
  • Computer Science
    Cambridge Series in Statistical and Probabilistic Mathematics
  • 2016
This chapter explains why many real-world networks are small worlds and have large fluctuations in their degrees, and why probability theory offers a highly effective way to deal with the complexity of networks, leading us to consider random graphs.

Structural Deep Network Embedding

This paper proposes Structural Deep Network Embedding (SDNE), a semi-supervised deep model with multiple layers of non-linear functions that can capture the highly non-linear network structure, exploiting first-order and second-order proximity jointly to preserve the network structure.

node2vec: Scalable Feature Learning for Networks

In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
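The biased second-order walk at the heart of node2vec can be sketched compactly. This is an illustrative toy implementation, not the reference code; the graph, walk length, and the values of the return parameter p and in-out parameter q are demo choices.

```python
import random

# Sketch of node2vec's second-order biased random walk. Given the previous
# node `prev` and current node `cur`, the unnormalized weight of moving to
# neighbor `nxt` is: 1/p if nxt == prev (return), 1 if nxt is also a
# neighbor of prev (stay close), and 1/q otherwise (explore outward).

def node2vec_walk(adj, start, length, p=1.0, q=0.5, seed=0):
    rng = random.Random(seed)
    walk = [start, rng.choice(sorted(adj[start]))]  # first step is uniform
    while len(walk) < length:
        prev, cur = walk[-2], walk[-1]
        nbrs = sorted(adj[cur])
        weights = [1.0 / p if n == prev else
                   1.0 if n in adj[prev] else
                   1.0 / q for n in nbrs]
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk

# Toy undirected graph: a square with one diagonal (arbitrary demo)
adj = {0: {1, 3}, 1: {0, 2, 3}, 2: {1, 3}, 3: {0, 1, 2}}
walk = node2vec_walk(adj, start=0, length=8)
print(walk)
```

With q < 1 the walk favors moving outward (DFS-like exploration); with q > 1 it stays close to the previous node (BFS-like), which is the flexibility the summary refers to.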

GraRep: Learning Graph Representations with Global Structural Information

A novel model for learning vertex representations of weighted graphs that integrates global structural information of the graph into the learning process and significantly outperforms other state-of-the-art methods in such tasks.
...