Corpus ID: 235349079

Embedding Heterogeneous Networks into Hyperbolic Space Without Meta-path

@inproceedings{Wang2021EmbeddingHN,
  title={Embedding Heterogeneous Networks into Hyperbolic Space Without Meta-path},
  author={Lili Wang and Chongyang Gao and Chenghan Huang and Ruibo Liu and Weicheng Ma and Soroush Vosoughi},
  booktitle={AAAI},
  year={2021}
}
Networks found in the real world are numerous and varied. A common type of network is the heterogeneous network, where the nodes (and edges) can be of different types. Accordingly, there have been efforts to learn representations of these heterogeneous networks in low-dimensional space. However, most of the existing heterogeneous network embedding methods suffer from the following two drawbacks: (1) the target space is usually Euclidean. Conversely, many recent works have shown that complex…
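
The excerpt cuts off before the model details, but as background on the hyperbolic target space named in the title: hyperbolic embeddings commonly use the Poincaré ball model, where two points $u, v$ with $\lVert u \rVert, \lVert v \rVert < 1$ are compared with the distance

    $d(u, v) = \operatorname{arccosh}\left(1 + 2\,\frac{\lVert u - v \rVert^2}{(1 - \lVert u \rVert^2)(1 - \lVert v \rVert^2)}\right)$

This distance grows very quickly near the boundary of the ball, which is why hyperbolic space fits hierarchical, tree-like structures better than Euclidean space of the same dimension; this is a common formulation in prior hyperbolic embedding work, not necessarily the exact one adopted in this paper.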

Citations

Dynamic Structural Role Node Embedding for User Modeling in Evolving Networks
TLDR
This article proposes a novel method, called HR2vec, that tracks historical topology information in dynamic networks to learn dynamic structural role embeddings and shows that this method outperforms other well-known methods in tasks where structural equivalence and historical information both play important roles.
Graph Embedding via Diffusion-Wavelets-Based Node Feature Distribution Characterization
TLDR
This paper proposes a novel unsupervised whole-graph embedding method that uses spectral graph wavelets to capture topological similarities between nodes on each k-hop sub-graph and uses them to learn embeddings for the whole graph.
DA-HGT: Domain Adaptive Heterogeneous Graph Transformer
TLDR
This paper investigates Heterogeneous Information Networks (HINs) with partially shared node types and proposes a novel domain-adaptive heterogeneous graph transformer (DA-HGT) to handle the domain shift between them; DA-HGT outperforms state-of-the-art methods in various domain adaptation tasks across heterogeneous networks.
Embedding Node Structural Role Identity Using Stress Majorization
TLDR
A novel and flexible framework based on stress majorization is proposed to transform high-dimensional role identities in networks directly (without approximation or indirect modeling) into a low-dimensional embedding space; it achieves better results than existing methods in learning node role representations.
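
For context, the stress-majorization objective referenced above is the standard weighted stress over node coordinates $x_i$ with target (here, role-identity) distances $d_{ij}$:

    $\mathrm{stress}(X) = \sum_{i<j} w_{ij}\,\bigl(\lVert x_i - x_j \rVert - d_{ij}\bigr)^2$

Majorization minimizes this non-convex function by iteratively minimizing a convex quadratic upper bound; how the paper defines $d_{ij}$ and $w_{ij}$ for role identities is not shown in this excerpt.
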
Directed Graph Contrastive Learning
TLDR
A directed graph data augmentation method called Laplacian perturbation is designed, with a theoretical analysis of how it provides contrastive information without changing the directed graph structure, and a directed graph contrastive learning framework is presented that retains more structural features of directed graphs than other GCL models thanks to its ability to provide complete contrastive information.

References

Showing 1-10 of 35 references
Hyperbolic Heterogeneous Information Network Embedding
TLDR
This paper proposes a novel hyperbolic heterogeneous information network embedding model that not only has superior performance on network reconstruction and link prediction tasks but also shows its ability to capture hierarchical structure in HINs via visualization.
HEAT: Hyperbolic Embedding of Attributed Networks
TLDR
HEAT (Hyperbolic Embedding of ATtributed networks), the first method for embedding attributed networks into a hyperbolic space, is introduced; by leveraging node attributes, HEAT can outperform a state-of-the-art hyperbolic embedding algorithm on several downstream tasks.
Are Meta-Paths Necessary?: Revisiting Heterogeneous Graph Embeddings
TLDR
JUST, a heterogeneous graph embedding technique that uses random walks with JUmp and STay strategies to overcome the aforementioned bias in a more efficient manner, is proposed; it can not only gracefully balance between homogeneous and heterogeneous edges but also balance the node distribution over different domains.
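
A minimal Python sketch of the jump/stay idea (the decay rule, alpha, and the graph/type dictionaries are illustrative assumptions, not JUST's exact sampling procedure): the walker keeps to the current node domain with a probability that decays the longer it has stayed there, and otherwise jumps to another domain.

    import random

    def jump_stay_walk(graph, node_type, start, walk_length=20, alpha=0.5):
        """Heterogeneous random walk with JUmp/STay-style transitions.

        graph[v]     -> list of neighbors of v
        node_type[v] -> domain/type label of v
        alpha        -> base stay probability, decayed as alpha**stay_count
        """
        walk = [start]
        current = start
        stay_count = 0  # consecutive steps spent in the current domain
        for _ in range(walk_length - 1):
            neighbors = graph.get(current, [])
            if not neighbors:
                break
            same = [v for v in neighbors if node_type[v] == node_type[current]]
            other = [v for v in neighbors if node_type[v] != node_type[current]]
            # Stay in the current domain with decaying probability, else jump.
            if same and (not other or random.random() < alpha ** stay_count):
                current = random.choice(same)
                stay_count += 1
            else:
                current = random.choice(other)
                stay_count = 0
            walk.append(current)
        return walk
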
LINE: Large-scale Information Network Embedding
TLDR
A novel network embedding method called LINE is proposed, which is suitable for arbitrary types of information networks (undirected, directed, and/or weighted) and optimizes a carefully designed objective function that preserves both the local and global network structures.
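
For reference, the two proximities LINE optimizes are usually written as follows (standard formulations, summarized here rather than quoted from this page): first-order proximity between connected nodes $v_i, v_j$,

    $p_1(v_i, v_j) = \frac{1}{1 + \exp(-\,\vec{u}_i^{\top} \vec{u}_j)}$,

and second-order proximity through shared neighborhoods, with $\vec{u}_k'$ denoting node $v_k$ in its "context" role,

    $p_2(v_j \mid v_i) = \frac{\exp(\vec{u}_j'^{\top} \vec{u}_i)}{\sum_{k=1}^{|V|} \exp(\vec{u}_k'^{\top} \vec{u}_i)}$.

Each is fit by minimizing the KL divergence to its empirical, edge-weight-based counterpart, in practice with negative sampling.
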
metapath2vec: Scalable Representation Learning for Heterogeneous Networks
TLDR
Two scalable representation learning models, namely metapath2vec and metapath2vec++, are developed; they are able not only to outperform state-of-the-art embedding models in various heterogeneous network mining tasks but also to discern the structural and semantic correlations between diverse network objects.
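
A simplified Python sketch of a meta-path-guided walk in the spirit of metapath2vec (the graph/node_type dictionaries and the cyclic-pattern handling are assumptions for illustration, not the authors' code): at every step the walker may only move to neighbors whose type matches the next symbol of the meta-path, e.g. Author-Paper-Author.

    import random

    def metapath_walk(graph, node_type, start, metapath=("A", "P", "A"), walk_length=20):
        """Random walk constrained to follow a (symmetric) meta-path.

        graph[v]     -> list of neighbors of v
        node_type[v] -> type label of v, e.g. "A" (author) or "P" (paper)
        """
        assert node_type[start] == metapath[0], "start node must match the meta-path"
        # For a symmetric meta-path like A-P-A the repeating unit is A-P.
        pattern = metapath[:-1] if metapath[0] == metapath[-1] else metapath
        walk = [start]
        current = start
        for step in range(1, walk_length):
            wanted = pattern[step % len(pattern)]  # next required node type
            candidates = [v for v in graph.get(current, []) if node_type[v] == wanted]
            if not candidates:
                break
            current = random.choice(candidates)
            walk.append(current)
        return walk
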
HeteSpaceyWalk: A Heterogeneous Spacey Random Walk for Heterogeneous Information Network Embedding
TLDR
This paper systematically formalizes the meta-path guided random walk as a higher-order Markov chain process, and presents a heterogeneous personalized spacey random walk to efficiently and effectively attain the expected stationary distribution among nodes.
node2vec: Scalable Feature Learning for Networks
TLDR
In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
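
As a reminder of how that bias works (the standard node2vec formulation, not text from this page): if the walk has just moved from node $t$ to node $v$, the unnormalized probability of stepping from $v$ to a neighbor $x$ is the edge weight multiplied by $\alpha_{pq}(t, x)$, where $\alpha_{pq}(t, x) = 1/p$ if $d_{tx} = 0$ (returning to $t$), $1$ if $d_{tx} = 1$, and $1/q$ if $d_{tx} = 2$, with $d_{tx}$ the shortest-path distance between $t$ and $x$. A small $p$ keeps the walk local (BFS-like sampling), while a small $q$ pushes it outward (DFS-like sampling).
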
HIN2Vec: Explore Meta-paths in Heterogeneous Information Networks for Representation Learning
TLDR
Empirical results show that HIN2Vec soundly outperforms state-of-the-art representation learning models for network data, including DeepWalk, LINE, node2vec, PTE, HINE and ESim, by 6.6% to 23.8% of micro-$F_1$ in multi-label node classification and by 5% to 70.8% in link prediction.
Heterogeneous Graph Neural Network
TLDR
HetGNN, a heterogeneous graph neural network model, is proposed that outperforms state-of-the-art baselines in various graph mining tasks, i.e., link prediction, recommendation, node classification and clustering, and inductive node classification and clustering.
Scalable Graph Embedding for Asymmetric Proximity
TLDR
This paper proposes an asymmetric proximity preserving (APP) graph embedding method based on random walk with restart, which captures both asymmetric and high-order similarities between node pairs, and gives a theoretical analysis showing that the method implicitly preserves the Rooted PageRank score for any two vertices.
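
A rough Python sketch of pair sampling by random walk with restart in the spirit of APP (the restart_prob value and the exact stopping rule are illustrative assumptions, not the paper's procedure): walks start only from the source node, so the sampled proximity is directional.

    import random

    def sample_app_pair(graph, source, restart_prob=0.15, max_steps=100):
        """Sample one (source, target) pair via random walk with restart.

        graph[v] -> list of neighbors of v
        """
        current = source
        for _ in range(max_steps):
            neighbors = graph.get(current, [])
            if not neighbors:
                break
            current = random.choice(neighbors)
            # With probability restart_prob the walk ends here; the node it
            # ends on becomes the target of the (source, target) pair.
            if random.random() < restart_prob:
                break
        return source, current

Such pairs can then be used to train separate source and target vectors per node with a skip-gram-style objective, which is what makes the learned proximity asymmetric.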