Multi-Level Network Embedding with Boosted Low-Rank Matrix Approximation

@inproceedings{Li2019MultiLevelNE,
  title={Multi-Level Network Embedding with Boosted Low-Rank Matrix Approximation},
  author={Jundong Li and Liang Wu and Huan Liu},
  booktitle={2019 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM)},
  year={2019},
  pages={49--56}
}
  • Jundong Li, Liang Wu, Huan Liu
  • Published 26 August 2018
  • Computer Science
  • 2019 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM)
As opposed to manual feature engineering, which is tedious and difficult to scale, network embedding has attracted a surge of research interest as it automates feature learning on graphs. […] The proposed BoostNE method is also in line with the successful gradient boosting method in ensemble learning. We demonstrate the superiority of the proposed BoostNE framework by comparing it with existing state-of-the-art network embedding methods on various datasets.
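The residual-boosting idea described above can be sketched as follows: repeatedly factorize the (non-negative) residual of a connectivity matrix and concatenate the per-level factors into one multi-level embedding. This is an illustrative reconstruction, not the authors' code; the multiplicative-update NMF, the rank per level, and the number of levels are assumptions.

```python
import numpy as np

def nmf(R, d, iters=200, seed=0):
    """Rank-d non-negative factorization R ~= U @ V via multiplicative updates."""
    rng = np.random.default_rng(seed)
    n, m = R.shape
    U = rng.random((n, d))
    V = rng.random((d, m))
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        U *= (R @ V.T) / (U @ V @ V.T + eps)
        V *= (U.T @ R) / (U.T @ U @ V + eps)
    return U, V

def boosted_embedding(S, levels=4, d=4):
    """Concatenate factors learned from successive non-negative residuals."""
    R = S.astype(float).copy()
    parts = []
    for _ in range(levels):
        U, V = nmf(R, d)
        parts.append(U)
        R = np.maximum(R - U @ V, 0.0)  # boost: refit on what is left over
    return np.hstack(parts)             # n x (levels * d) embedding
```

Each level fits the part of the matrix the previous levels failed to explain, mirroring how gradient boosting fits successive weak learners to residuals.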

Figures and Tables from this paper

Multi-Stage Network Embedding for Exploring Heterogeneous Edges

This article proposes a multi-stage non-negative matrix factorization (MNMF) model, committed to utilizing abundant information in multiple views to learn robust network representations, and demonstrates that the model outperforms three types of baselines in practical applications.

Network Embedding via Motifs

This work presents an algorithm for Learning Embeddings by leveraging Motifs Of Networks (LEMON), which aims to learn embeddings for vertices and various motifs, and finds that LEMON achieves significant improvements in downstream tasks.

Sequential Semi-Orthogonal Multi-Level NMF with Negative Residual Reduction for Network Embedding

  • Riku Hashimoto, Hiroyuki Kasai
  • Computer Science
    ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2020
Numerical evaluations conducted using several real-world datasets demonstrate the effectiveness of the proposed SSO-NRR-NMF, which reduces the negative residuals to be discarded, and avoids redundant bases with a semi-orthogonal constraint.

Modeling Heterogeneous Edges to Represent Networks with Graph Auto-Encoder

A regularized graph auto-encoder (RGAE) model is proposed, committed to utilizing abundant information in multiple views to learn robust network representations; it outperforms state-of-the-art baselines in practical applications.

Research on the Link Prediction Model of Dynamic Multiplex Social Network Based on Improved Graph Representation Learning

Dynamic graph representation learning is studied in order to develop an improved link prediction model for dynamic social networks that performs better than traditional graph representation learning methods.

Embedding Methods or Link-based Similarity Measures, Which is Better for Link Prediction?

  • M. Hamedani, Sang-Wook Kim
  • Computer Science
    2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)
  • 2021
This paper extensively investigates the effectiveness of embedding methods and similarity measures (i.e., both non-recursive and recursive ones) in link prediction and demonstrates that recursive similarity measures are not more beneficial for this task than non-recursive ones.

Graph Embeddings for Abusive Language Detection

This paper proposes to use recent graph embedding approaches to automatically learn representations of conversational graphs depicting message exchanges, and compares two categories: node vs. whole-graph embeddings.

On Investigating Both Effectiveness and Efficiency of Embedding Methods in Task of Similarity Computation of Nodes in Graphs

This paper investigates both the effectiveness and the efficiency of embedding methods in the task of computing node similarity by comparing them with similarity measures, and finds that on all but one dataset, embedding methods are less effective than similarity measures.

On the Use of Unrealistic Predictions in Hundreds of Papers Evaluating Graph Representations

It is pointed out that such an inappropriate setting is now ubiquitous in this research area and indicates that with unrealistic information, the performance is likely over-estimated.

Karate Club: An API Oriented Open-Source Python Framework for Unsupervised Learning on Graphs

Karate Club is a Python framework combining more than 30 state-of-the-art graph mining algorithms that make it easy to identify and represent common graph features; its learning performance and efficiency are demonstrated on a wide range of real-world clustering and classification tasks.

References

SHOWING 1-10 OF 55 REFERENCES

Accelerated Attributed Network Embedding

An accelerated attributed network embedding algorithm AANE is proposed, which enables the joint learning process to be done in a distributed manner by decomposing the complex modeling and optimization into many sub-problems.

Community Preserving Network Embedding

A novel Modularized Nonnegative Matrix Factorization (M-NMF) model is proposed to incorporate the community structure into network embedding and jointly optimize NMF based representation learning model and modularity based community detection model in a unified framework, which enables the learned representations of nodes to preserve both of the microscopic and community structures.
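The community-preserving term in models of this kind is typically built on the modularity matrix B = A - d d^T / 2m, where d is the degree vector and m the edge count. A minimal sketch of that building block (an illustration of the standard definition, not M-NMF itself):

```python
import numpy as np

def modularity_matrix(A):
    """Modularity matrix B = A - d d^T / (2m) for an undirected graph.

    A: symmetric adjacency matrix; d: degree vector; 2m = total degree.
    Entry B[i, j] compares the observed edge weight to the weight expected
    under a random graph with the same degree sequence.
    """
    d = A.sum(axis=1)
    two_m = A.sum()
    return A - np.outer(d, d) / two_m
```

Rows of B sum to zero by construction, which is a quick sanity check when wiring it into a factorization objective.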

MILE: A Multi-Level Framework for Scalable Graph Embedding

Experimental results on five large-scale datasets demonstrate that MILE significantly boosts the speed (order of magnitude) of graph embedding while generating embeddings of better quality, for the task of node classification.

Attributed Network Embedding for Learning in a Dynamic Environment

DANE first provides an offline method for a consensus embedding and then leverages matrix perturbation theory to maintain the freshness of the end embedding results in an online manner, and performs extensive experiments to corroborate the effectiveness and efficiency of the proposed framework.

Deep Neural Networks for Learning Graph Representations

A novel model for learning graph representations, which generates a low-dimensional vector representation for each vertex by capturing the graph structural information directly, and which outperforms other state-of-the-art models on such tasks.

Label Informed Attributed Network Embedding

A novel Label informed Attributed Network Embedding (LANE) framework that can smoothly incorporate label information into the attributed network embedding while preserving their correlations is proposed and achieves significantly better performance compared with the state-of-the-art embedding algorithms.

Fast Network Embedding Enhancement via High Order Proximity Approximation

Most existing NRL methods are summarized into a unified two-step framework, comprising proximity matrix construction and dimension reduction, and the Network Embedding Update (NEU) algorithm is proposed, which implicitly approximates higher-order proximities with a theoretical approximation bound.
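The enhancement step behind NEU mixes higher-order proximity into a base embedding with a single linear update. A sketch, assuming a row-normalized adjacency A; the default weights below follow the values reported for NEU, but treat the routine as illustrative rather than the reference implementation:

```python
import numpy as np

def neu_update(E, A, lam1=0.5, lam2=0.25):
    """One NEU-style enhancement step.

    E: base node embedding (n x d), e.g. from DeepWalk or LINE.
    A: row-normalized adjacency matrix (n x n).
    Adds first- and second-order neighborhood averages of the embedding,
    implicitly pushing E toward a higher-order proximity factorization.
    """
    AE = A @ E
    return E + lam1 * AE + lam2 * (A @ AE)
```

The update is cheap (two sparse matrix-vector-style products), which is why it can be bolted onto any existing embedding as a post-processing pass.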

Representation Learning on Graphs: Methods and Applications

A conceptual review of key advancements in this area of representation learning on graphs, including matrix factorization-based methods, random-walk-based algorithms, and graph neural networks, is provided.

Structural Deep Network Embedding

This paper proposes a Structural Deep Network Embedding method, namely SDNE: a semi-supervised deep model with multiple layers of non-linear functions, thereby able to capture the highly non-linear network structure, which jointly exploits first-order and second-order proximity to preserve the network structure.

Network Embedding as Matrix Factorization: Unifying DeepWalk, LINE, PTE, and node2vec

The NetMF method offers significant improvements over DeepWalk and LINE on conventional network mining tasks and provides theoretical connections between skip-gram-based network embedding algorithms and the theory of graph Laplacians.
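The small-window NetMF construction factorizes the truncated logarithm of a closed-form DeepWalk matrix built from powers of the random-walk transition matrix. A sketch for a small dense graph; parameter names and the dense SVD are illustrative choices for readability, not the paper's scalable implementation:

```python
import numpy as np

def netmf_embedding(A, dim=2, window=3, neg=1):
    """Small-window NetMF sketch.

    A: symmetric adjacency matrix (dense, small graph).
    window: skip-gram context size T; neg: negative samples b.
    Builds M = vol(G)/(T*b) * sum_{r=1..T} (D^-1 A)^r D^-1,
    then factorizes log(max(M, 1)) with a truncated SVD.
    """
    vol = A.sum()
    d = A.sum(axis=1)
    Dinv = np.diag(1.0 / d)
    P = Dinv @ A                        # random-walk transition matrix
    S = np.zeros_like(A, dtype=float)
    Pr = np.eye(len(A))
    for _ in range(window):
        Pr = Pr @ P                     # accumulate P^1 ... P^T
        S += Pr
    M = (vol / (window * neg)) * S @ Dinv
    logM = np.log(np.maximum(M, 1.0))   # truncated logarithm, as in NetMF
    U, sig, _ = np.linalg.svd(logM)
    return U[:, :dim] * np.sqrt(sig[:dim])
```

Truncating at 1 before the log keeps the matrix well defined where entries of M would otherwise drive the logarithm to minus infinity.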
...