TGE-PS: Text-driven Graph Embedding with Pairs Sampling

@article{Chen2018TGEPSTG,
  title={TGE-PS: Text-driven Graph Embedding with Pairs Sampling},
  author={Liheng Chen and Yanru Qu and Zhenghui Wang and Lin Qiu and Weinan Zhang and Ken Chen and Shaodian Zhang and Yong Yu},
  journal={ArXiv},
  year={2018},
  volume={abs/1809.04234}
}
In graphs with rich text information, constructing expressive graph representations requires combining textual information with structural information. Graph embedding models are becoming increasingly popular for representing graphs, yet they face two issues: sampling efficiency and text utilization. Through analyzing existing models, we find their training objectives are composed of pairwise proximities, and there are large amounts of redundant node pairs in Random Walk-based…
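The redundancy of random-walk pair sampling that the abstract points to can be illustrated with a toy sketch (not the paper's code; the example graph, walk counts, and window size are arbitrary assumptions):

```python
import random
from collections import Counter

# Toy 4-node undirected graph as an adjacency list (an assumption for
# illustration, not data from the paper).
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

def random_walk(graph, start, length, rng):
    """Uniform random walk of the given length starting at `start`."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(rng.choice(graph[walk[-1]]))
    return walk

def skipgram_pairs(walk, window):
    """Emit (node, context) pairs within a sliding window, DeepWalk-style."""
    pairs = []
    for i, u in enumerate(walk):
        for j in range(max(0, i - window), min(len(walk), i + window + 1)):
            if j != i:
                pairs.append((u, walk[j]))
    return pairs

rng = random.Random(0)
pairs = []
for node in graph:
    for _ in range(10):                       # 10 walks per start node
        walk = random_walk(graph, node, length=8, rng=rng)
        pairs += skipgram_pairs(walk, window=2)

counts = Counter(pairs)
print(f"{len(pairs)} sampled pairs, {len(counts)} distinct")
```

Even on this 4-node graph, the walks emit hundreds of (node, context) pairs while only a handful are distinct — the kind of redundancy in pairwise training objectives that the abstract describes.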

References

Showing 1–10 of 31 references
Graph Embedding Techniques, Applications, and Performance: A Survey
A comprehensive and structured analysis of various graph embedding techniques proposed in the literature, accompanied by the open-source Python library GEM (Graph Embedding Methods, available at https://github.com/palash1992/GEM), which provides all presented algorithms within a unified interface to foster and facilitate research on the topic.
A Comprehensive Survey of Graph Embedding: Problems, Techniques, and Applications
This survey conducts a comprehensive review of the literature in graph embedding and proposes two taxonomies of graph embedding, corresponding to what challenges exist in different graph embedding problem settings and how existing work addresses these challenges in its solutions.
CANE: Context-Aware Network Embedding for Relation Modeling
Presents Context-Aware Network Embedding (CANE), a novel NE model that learns context-aware embeddings for vertices with a mutual attention mechanism, modeling the semantic relationships between vertices more precisely.
Content to Node: Self-Translation Network Embedding
A novel sequence-to-sequence NE framework, referred to as the Self-Translation Network Embedding (STNE) model, which fuses content and structure information seamlessly from the raw input and outperforms state-of-the-art NE approaches.
Paper2vec: Combining Graph and Text Information for Scientific Paper Representation
Presents Paper2vec, a novel neural network embedding based approach for creating scientific paper representations that makes use of both textual and graph-based information, and demonstrates the efficacy of the representations on three real-world academic datasets.
Network Representation Learning with Rich Text Information
By proving that DeepWalk, a state-of-the-art network representation method, is actually equivalent to matrix factorization (MF), this work proposes text-associated DeepWalk (TADW), which incorporates text features of vertices into network representation learning under the framework of matrix factorization.
LINE: Large-scale Information Network Embedding
A novel network embedding method called LINE, suitable for arbitrary types of information networks — undirected, directed, and/or weighted — which optimizes a carefully designed objective function that preserves both the local and global network structures.
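The pairwise objective LINE optimizes can be sketched in a few lines (a minimal illustration, not the authors' implementation; the toy edge list and embedding size are assumptions). LINE's first-order proximity maximizes w_ij · log σ(u_i · u_j) over edges, i.e. minimizes its negation:

```python
import numpy as np

# Toy weighted edge list and random node embeddings (assumptions for
# illustration only).
rng = np.random.default_rng(0)
edges = [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 1.0)]   # (i, j, weight)
emb = rng.normal(scale=0.1, size=(4, 8))           # 4 nodes, dimension 8

def first_order_loss(emb, edges):
    """Negative log-likelihood of LINE's first-order proximity:
    -sum_ij w_ij * log sigmoid(u_i . u_j)."""
    loss = 0.0
    for i, j, w in edges:
        score = emb[i] @ emb[j]
        loss -= w * np.log(1.0 / (1.0 + np.exp(-score)))
    return loss

print(first_order_loss(emb, edges))
```

In practice this objective is minimized by stochastic gradient descent with edge sampling and negative sampling; the sketch above only shows the loss being scored.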
Deep Neural Networks for Learning Graph Representations
A novel model for learning graph representations, which generates a low-dimensional vector representation for each vertex by capturing the graph structural information directly, and which outperforms other state-of-the-art models in such tasks.
GraRep: Learning Graph Representations with Global Structural Information
A novel model for learning vertex representations of weighted graphs that integrates global structural information of the graph into the learning process and significantly outperforms other state-of-the-art methods in such tasks.
Asymmetric Transitivity Preserving Graph Embedding
A novel graph embedding algorithm, High-Order Proximity preserved Embedding (HOPE for short), which is scalable to preserve high-order proximities of large-scale graphs and capable of capturing the asymmetric transitivity.