Kernel Node Embeddings

@inproceedings{celikkanat2019kernel,
  title={Kernel Node Embeddings},
  author={Abdulkadir Çelikkanat and Fragkiskos D. Malliaros},
  booktitle={2019 IEEE Global Conference on Signal and Information Processing (GlobalSIP)},
  year={2019}
}
Learning representations of nodes in a low-dimensional space is a crucial task with many interesting applications in network analysis, including link prediction and node classification. Two popular approaches for this problem include matrix factorization and random walk-based models. In this paper, we aim to bring together the best of both worlds, towards learning latent node representations. In particular, we propose a weighted matrix factorization model which encodes random walk-based…
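The combination the abstract describes can be illustrated with a toy sketch: build a co-occurrence matrix from uniform random walks, then factorize it with per-cell confidence weights. The graph, walk parameters, and the plain gradient loop below are illustrative assumptions, not the paper's actual model.

```python
import random
import numpy as np

def random_walks(adj, num_walks=10, walk_len=20, seed=0):
    """Generate uniform random walks from every node in the adjacency list."""
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in adj:
            walk = [start]
            for _ in range(walk_len - 1):
                walk.append(rng.choice(adj[walk[-1]]))
            walks.append(walk)
    return walks

def cooccurrence(walks, n, window=2):
    """Count how often node pairs co-occur within a sliding window."""
    C = np.zeros((n, n))
    for walk in walks:
        for i, u in enumerate(walk):
            for j in range(max(0, i - window), min(len(walk), i + window + 1)):
                if j != i:
                    C[u, walk[j]] += 1
    return C

def weighted_mf(C, dim=4, iters=200, lr=0.01, seed=0):
    """Factorize log1p co-occurrences, weighting each cell by its count."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    W = C / C.max()                 # per-cell confidence weights in [0, 1]
    M = np.log1p(C)                 # target matrix
    U = 0.1 * rng.standard_normal((n, dim))
    V = 0.1 * rng.standard_normal((n, dim))
    for _ in range(iters):
        E = W * (U @ V.T - M)       # weighted reconstruction error
        U, V = U - lr * E @ V, V - lr * E.T @ U
    return U

# Toy graph: two triangles joined by a bridge (2-3).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
walks = random_walks(adj)
C = cooccurrence(walks, n=6)
U = weighted_mf(C)
```

The weighting makes frequently co-occurring pairs dominate the loss, which is the usual motivation for weighted (rather than plain) factorization of walk statistics.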
Exponential Family Graph Embeddings
This paper introduces the generic exponential family graph embedding model, which generalizes random walk-based network representation learning techniques to exponential family conditional distributions, and demonstrates that the proposed techniques outperform well-known baseline methods in two downstream machine learning tasks.
Topic-aware latent models for representation learning on networks
TNE is introduced, a generic framework to enhance node embeddings acquired by means of random walk-based approaches with topic-based information, and is shown to outperform widely known baseline NRL models.


TNE: A Latent Model for Representation Learning on Networks
This paper introduces a general framework to enhance node embeddings acquired by means of random walk-based approaches: each vertex is assigned to a topic with the help of various statistical models and community detection methods, and enhanced community representations are then generated.
Is a Single Embedding Enough? Learning Node Representations that Capture Multiple Social Contexts
This work proposes a method for learning multiple representations of the nodes in a graph based on a principled decomposition of the ego-network, and shows that these embeddings allow for effective visual analysis of the learned community structure.
BiasedWalk: Biased Sampling for Representation Learning on Graphs
BiasedWalk is proposed, a scalable, unsupervised feature learning algorithm based on biased random walks that sample context information about each node in the network; it outperforms the baseline methods in most of the tasks and datasets.
Representation Learning on Graphs: Methods and Applications
A conceptual review of key advancements in representation learning on graphs is provided, covering matrix factorization-based methods, random walk-based algorithms, and graph neural networks.
node2vec: Scalable Feature Learning for Networks
In node2vec, an algorithmic framework for learning continuous feature representations for nodes in networks, a flexible notion of a node's network neighborhood is defined and a biased random walk procedure is designed, which efficiently explores diverse neighborhoods.
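The biased walk can be sketched as follows. The return parameter p and in-out parameter q follow the paper's definitions, but this unnormalized-weight implementation is a simplification (the paper precomputes alias tables for efficient sampling).

```python
import random

def biased_walk(adj, start, walk_len, p=1.0, q=1.0, seed=0):
    """Second-order walk: bias each step by distance back to the previous node."""
    rng = random.Random(seed)
    walk = [start]
    while len(walk) < walk_len:
        cur = walk[-1]
        nbrs = adj[cur]
        if len(walk) == 1:
            walk.append(rng.choice(nbrs))
            continue
        prev = walk[-2]
        weights = []
        for x in nbrs:
            if x == prev:
                weights.append(1.0 / p)   # return step (back to previous node)
            elif x in adj[prev]:
                weights.append(1.0)       # BFS-like step (distance 1 from prev)
            else:
                weights.append(1.0 / q)   # DFS-like step (distance 2 from prev)
        walk.append(rng.choices(nbrs, weights=weights)[0])
    return walk

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
# q < 1 biases the walk outward, toward DFS-like exploration.
walk = biased_walk(adj, start=0, walk_len=10, q=0.5)
```

Interpolating between BFS-like (low q favors distance-1 steps? no: high q) and DFS-like behavior by tuning p and q is what makes the sampled neighborhoods "flexible" in the sense of the summary above.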
Don't Walk, Skip!: Online Learning of Multi-scale Network Embeddings
The results show that WALKLETS outperforms new methods based on neural matrix factorization, and outperforms DeepWalk by up to 10% and LINE by 58% Micro-F1 on challenging multi-label classification tasks.
NetSMF: Large-Scale Network Embedding as Sparse Matrix Factorization
NetSMF leverages theories from spectral sparsification to efficiently sparsify the dense matrix that arises when network embedding is formulated as matrix factorization, enabling significantly improved efficiency in embedding learning; it is the only method that achieves both high efficiency and effectiveness.
GraRep: Learning Graph Representations with Global Structural Information
A novel model for learning vertex representations of weighted graphs that integrates global structural information of the graph into the learning process and significantly outperforms other state-of-the-art methods in such tasks.
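The use of global structural information can be sketched roughly as follows: take powers of the transition matrix to capture k-step relations, factor each log-transformed matrix, and concatenate the results. The log shift, clipping, and concatenation details here are simplified assumptions, not GraRep's exact formulation.

```python
import numpy as np

def kstep_embed(A, dim=2, K=3):
    """Concatenate SVD embeddings of log k-step transition matrices."""
    P = np.diag(1.0 / A.sum(axis=1)) @ A   # 1-step transition matrix
    Pk = np.eye(A.shape[0])
    parts = []
    for _ in range(K):
        Pk = Pk @ P                        # k-step transition probabilities
        # Shifted log, keeping only positive entries (a common simplification).
        M = np.log(np.maximum(Pk, 1e-12)) - np.log(1.0 / A.shape[0])
        M = np.maximum(M, 0)
        U, s, _ = np.linalg.svd(M)
        parts.append(U[:, :dim] * np.sqrt(s[:dim]))
    return np.hstack(parts)

# Toy symmetric weighted adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
emb = kstep_embed(A)
```

Each block of the concatenated embedding captures relations at a different hop count, which is how global (multi-step) structure enters the representation.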
Node embedding for network community discovery
  • Christy Lin, P. Ishwar, W. Ding
  • 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
This work empirically attains the information-theoretic limits of community recovery under the benchmark Stochastic Block Models and exhibits better stability and accuracy than the best known algorithms.
LINE: Large-scale Information Network Embedding
A novel network embedding method called "LINE," which is suitable for arbitrary types of information networks: undirected, directed, and/or weighted, and optimizes a carefully designed objective function that preserves both the local and global network structures.
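The local (first-order) part of such an objective can be sketched with a minimal SGD loop: embeddings of connected nodes are pushed toward a high sigmoid dot product, random nodes away. The tiny loop and uniform negative sampling below are simplifications of LINE's actual edge-sampling scheme and second-order objective.

```python
import random
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def first_order_embed(edges, n, dim=4, iters=2000, lr=0.05, seed=0):
    """Train embeddings so edge endpoints get a high sigmoid dot product."""
    rng = random.Random(seed)
    U = 0.1 * np.random.default_rng(seed).standard_normal((n, dim))
    for _ in range(iters):
        u, v = rng.choice(edges)      # positive sample: an observed edge
        w = rng.randrange(n)          # negative sample: a uniformly random node
        g_pos = 1.0 - sigmoid(U[u] @ U[v])   # gradient pushing u.v up
        g_neg = -sigmoid(U[u] @ U[w])        # gradient pushing u.w down
        du = g_pos * U[v] + g_neg * U[w]     # compute before updating v, w
        U[v] += lr * g_pos * U[u]
        U[w] += lr * g_neg * U[u]
        U[u] += lr * du
    return U

edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]
U = first_order_embed(edges, n=6)
```

Preserving global structure additionally requires the second-order term (shared-neighborhood similarity), which this sketch omits.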