DeepWalk: Online Learning of Social Representations

@inproceedings{Perozzi2014DeepWalkOL,
  title={DeepWalk: Online Learning of Social Representations},
  author={Bryan Perozzi and Rami Al-Rfou and Steven Skiena},
  booktitle={Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining},
  year={2014}
}
We present DeepWalk, a novel approach for learning latent representations of vertices in a network. [...] DeepWalk uses local information obtained from truncated random walks to learn latent representations by treating walks as the equivalent of sentences. We demonstrate DeepWalk's latent representations on several multi-label network classification tasks for social networks such as BlogCatalog, Flickr, and YouTube. Our results show that DeepWalk outperforms challenging baselines which are allowed a [...]
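To make the pipeline described in the abstract concrete, here is a minimal sketch of the DeepWalk idea: truncated random walks are generated from every vertex and treated as sentences for a skip-gram model. The use of networkx and gensim, as well as the walk length, number of walks, and embedding dimensionality shown, are illustrative assumptions rather than the paper's exact settings.

```python
# Minimal sketch of the DeepWalk idea: truncated random walks are treated as
# "sentences" and fed to a skip-gram model (here gensim's Word2Vec).
# Walk length, number of walks, and dimensionality are illustrative only.
import random

import networkx as nx
from gensim.models import Word2Vec


def random_walk(graph, start, walk_length):
    """Generate one truncated random walk starting at `start`."""
    walk = [start]
    for _ in range(walk_length - 1):
        neighbors = list(graph.neighbors(walk[-1]))
        if not neighbors:
            break
        walk.append(random.choice(neighbors))
    return [str(node) for node in walk]  # Word2Vec expects string tokens


def deepwalk_embeddings(graph, num_walks=10, walk_length=40, dimensions=128):
    """Learn vertex representations from a corpus of random walks."""
    corpus = []
    nodes = list(graph.nodes())
    for _ in range(num_walks):
        random.shuffle(nodes)  # visit vertices in a random order each pass
        for node in nodes:
            corpus.append(random_walk(graph, node, walk_length))
    # Skip-gram (sg=1) with hierarchical softmax (hs=1)
    model = Word2Vec(corpus, vector_size=dimensions, window=5, sg=1, hs=1, min_count=0)
    return {node: model.wv[str(node)] for node in graph.nodes()}


if __name__ == "__main__":
    G = nx.karate_club_graph()
    embeddings = deepwalk_embeddings(G)
    print(len(embeddings), "vertices embedded in", len(next(iter(embeddings.values()))), "dimensions")
```

The learned vectors can then be handed to any off-the-shelf classifier for multi-label tasks such as the BlogCatalog, Flickr, and YouTube benchmarks mentioned above.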
Don't Walk, Skip!: Online Learning of Multi-scale Network Embeddings
TLDR
The results show that WALKLETS outperforms recent methods based on neural matrix factorization, and outperforms DeepWalk by up to 10% and LINE by 58% Micro-F1 on challenging multi-label classification tasks.
Max-Margin DeepWalk: Discriminative Learning of Network Representation
TLDR
MMDW is a unified network representation learning (NRL) framework that jointly optimizes a max-margin classifier and the underlying representation learning model. The learned representations are more discriminative than those of unsupervised methods, and experiments demonstrate a significant improvement over other state-of-the-art methods.
Discriminative Deep Random Walk for Network Classification
TLDR
This paper presents Discriminative Deep Random Walk (DDRW), a novel method for relational network classification that significantly outperforms DeepWalk on multi-label network classification tasks, while retaining the topological structure in the latent space.
subgraph2vec: Learning Distributed Representations of Rooted Sub-graphs from Large Graphs
TLDR
The subgraph vectors learned by this approach can be used in conjunction with classifiers such as CNNs and SVMs, as well as relational data clustering algorithms, to achieve significantly superior accuracy on both supervised and unsupervised learning tasks.
Learning Edge Representations via Low-Rank Asymmetric Projections
TLDR
This work proposes a new method for embedding graphs that preserves directed edge information: it explicitly models an edge as a function of node embeddings and introduces a novel objective, the graph likelihood, which contrasts information from sampled random walks with non-existent edges.
Learning distributed representations for large-scale dynamic social networks
TLDR
This work proposes Dnps, a novel node embedding approach for acquiring distributed representations of large-scale dynamic social networks, and develops a damping-based positive sampling (DpS) algorithm to learn the hierarchical structure of social networks.
DeepInf: Social Influence Prediction with Deep Learning
TLDR
Inspired by the recent success of deep neural networks in a wide range of computing applications, an end-to-end framework is designed to learn users' latent feature representations for predicting social influence, suggesting the effectiveness of representation learning for social applications.
Efficient Estimation of Node Representations in Large Graphs using Linear Contexts
TLDR
This paper proposes a simple alternative method that is as effective as previous approaches but much faster at learning node representations; it employs a restricted number of permutations over the immediate neighborhood of a node as the context used to generate its representation.
TNE: A Latent Model for Representation Learning on Networks
TLDR
This paper introduces a general framework for enhancing node embeddings acquired by random walk-based approaches: each vertex is assigned to a topic with the help of various statistical models and community detection methods, and enhanced community representations are then generated.
Fast Node Embeddings: Learning Ego-Centric Representations
TLDR
An effective and efficient method for generating node embeddings in graphs is proposed; it employs a restricted number of permutations over a node's immediate neighborhood as the context used to generate its representation, yielding ego-centric representations.
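The two entries above describe building skip-gram contexts from permutations of a node's immediate neighborhood rather than from long random walks. The following is a hypothetical sketch of that idea under stated assumptions, not the authors' implementation; the function names, number of permutations, and model parameters are illustrative.

```python
# Hypothetical sketch of the permuted-neighborhood idea summarized above:
# each node contributes a few "sentences" consisting of the node followed by
# a random permutation of its immediate neighbors; a skip-gram model learns
# node representations from this corpus. Parameters are illustrative.
import random

import networkx as nx
from gensim.models import Word2Vec


def neighborhood_sentences(graph, num_permutations=3):
    sentences = []
    for node in graph.nodes():
        neighbors = list(graph.neighbors(node))
        for _ in range(num_permutations):
            random.shuffle(neighbors)
            sentences.append([str(node)] + [str(n) for n in neighbors])
    return sentences


G = nx.karate_club_graph()
model = Word2Vec(neighborhood_sentences(G), vector_size=64, window=5, sg=1, min_count=0)
print(model.wv[str(0)].shape)
```

Because each context is bounded by a node's degree rather than a walk length, far less text has to be generated per node, which is the source of the claimed speedup.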

References

Showing 1-10 of 57 references
Relational learning via latent social dimensions
TLDR
This work proposes to extract latent social dimensions from network information and then use them as features for discriminative learning; the approach outperforms representative relational learning methods based on collective inference, especially when little labeled data is available.
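As a rough illustration of the latent-social-dimensions recipe summarized above, one common instantiation extracts the top eigenvectors of the modularity matrix as node features and trains a one-vs-rest discriminative classifier on them. The dimensionality, classifier choice, and toy dataset below are assumptions for illustration, not necessarily the paper's exact variant.

```python
# Hedged sketch of the "latent social dimensions" idea: use the top
# eigenvectors of the modularity matrix as node features and train a
# one-vs-rest classifier on them. k, the classifier, and the toy labels
# are illustrative assumptions.
import numpy as np
import networkx as nx
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier


def social_dimension_features(graph, k=16):
    """Top-k eigenvectors of the modularity matrix, one row per node."""
    B = np.asarray(nx.modularity_matrix(graph))   # dense, symmetric
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1]              # largest eigenvalues first
    return eigvecs[:, order[:k]]


G = nx.karate_club_graph()
X = social_dimension_features(G, k=8)
y = [G.nodes[n]["club"] for n in G.nodes()]        # toy labels shipped with the dataset
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, y)
print(clf.score(X, y))
```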
Scalable learning of collective behavior based on sparse social dimensions
TLDR
This work proposes an edge-centric clustering scheme to extract sparse social dimensions; it can efficiently handle networks of millions of actors while demonstrating prediction performance comparable to other, non-scalable methods.
Using ghost edges for classification in sparsely labeled networks
TLDR
This paper proposes a novel approach to within-network classification that combines aspects of statistical relational learning and semi-supervised learning to improve classification performance in sparse networks, and demonstrates that this approach performs well across a range of conditions where existing approaches fail.
Leveraging social media networks for classification
TLDR
The proposed framework, SocioDim, first extracts social dimensions based on the network structure to accurately capture prominent interaction patterns between actors, then learns a discriminative classifier to select relevant social dimensions.
Leveraging Label-Independent Features for Classification in Sparsely Labeled Networks: An Empirical Study
TLDR
This work explores a complementary approach to within-network classification based on the use of label-independent (LI) features, i.e., features calculated without using the values of class labels, and shows that, in many cases, a combination of a few diverse network-based structural characteristics is most informative.
It's who you know: graph mining using recursive structural features
TLDR
ReFeX (Recursive Feature eXtraction) is proposed, a novel algorithm that recursively combines local features with neighborhood features and outputs regional features capturing "behavioral" information in large graphs.
Leveraging relational autocorrelation with latent group models
TLDR
A latent group model (LGM) is proposed for relational data; it discovers and exploits the hidden structures responsible for the observed autocorrelation among class labels, improving model performance, increasing inference efficiency, and enhancing understanding of the datasets.
Multi-label relational neighbor classification using social context features
TLDR
This paper proposes a multi-label iterative relational neighbor classifier that employs social context features (SCRN); it incorporates a class-propagation probability distribution obtained from instances' social features, which are in turn extracted from the network topology.
Representation Learning: A Review and New Perspectives
TLDR
Recent work in the area of unsupervised feature learning and deep learning is reviewed, covering advances in probabilistic models, autoencoders, manifold learning, and deep networks.
Semi-Supervised Classification of Network Data Using Very Few Labels
  • Frank Lin, William W. Cohen
  • Computer Science
  • 2010 International Conference on Advances in Social Networks Analysis and Mining
  • 2010
TLDR
A simple and intuitive semi-supervised learning method based on random graph walks is presented; it outperforms wvRN by a large margin on several benchmark datasets when very few labels are available, and dramatically reduces the amount of labeled data required to achieve the same classification accuracy.
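As a hedged sketch of the random-graph-walk idea summarized above (not the authors' exact algorithm), one can seed a personalized PageRank walk from each class's few labeled nodes and assign every node to the class whose walk scores it highest. The seed nodes, teleport parameter, and toy dataset below are illustrative assumptions.

```python
# Hedged sketch of a random-walk-based semi-supervised classifier in the
# spirit of the method summarized above: run a personalized PageRank seeded
# by each class's few labeled nodes, then label every node with the class
# whose walk gives it the highest score. Seeds and alpha are illustrative.
import networkx as nx

G = nx.karate_club_graph()
# Pretend only two nodes are labeled (one per class).
seeds = {"Mr. Hi": [0], "Officer": [33]}

scores = {}
for label, labeled_nodes in seeds.items():
    personalization = {n: 1.0 if n in labeled_nodes else 0.0 for n in G.nodes()}
    scores[label] = nx.pagerank(G, alpha=0.85, personalization=personalization)

predictions = {n: max(scores, key=lambda label: scores[label][n]) for n in G.nodes()}
truth = nx.get_node_attributes(G, "club")
accuracy = sum(predictions[n] == truth[n] for n in G.nodes()) / G.number_of_nodes()
print(f"accuracy with two labeled nodes: {accuracy:.2f}")
```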