Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution

@article{Yang2022TimeawareDG,
  title={Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution},
  author={Yu Yang and Hongzhi Yin and Jiannong Cao and Tong Chen and Quoc Viet Hung Nguyen and Xiaofang Zhou and Lei Chen},
  journal={ArXiv},
  year={2022},
  volume={abs/2207.00594}
}
Dynamic graphs refer to graphs whose structure dynamically changes over time. Despite the benefits of learning vertex representations (i.e., embeddings) for dynamic graphs, existing works merely view a dynamic graph as a sequence of changes within the vertex connections, neglecting the crucial asynchronous nature of such dynamics where the evolution of each local structure starts at different times and lasts for various durations. To maintain asynchronous structural evolutions within the graph…
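
The abstract's distinction between a synchronous snapshot view and asynchronous local evolutions can be made concrete with a small sketch. The following Python snippet is purely illustrative (the field names and toy numbers are assumptions, not the paper's formulation): each edge carries its own start time and duration, so different parts of the graph evolve on their own clocks.

# Illustrative sketch only: a dynamic graph as a set of edge evolutions, each
# with its own start time and duration, rather than a sequence of synchronous
# snapshots. Field names and values are assumed for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class EdgeEvolution:
    u: int           # source vertex
    v: int           # target vertex
    start: float     # time at which the connection appears
    duration: float  # how long the connection lasts (its timespan)

events = [
    EdgeEvolution(0, 1, start=0.0, duration=5.0),  # begins early, lasts long
    EdgeEvolution(1, 2, start=2.0, duration=1.0),  # begins later, short-lived
    EdgeEvolution(0, 2, start=2.5, duration=3.5),
]

def active_edges(events, t):
    """Edges whose evolution is ongoing at time t."""
    return [(e.u, e.v) for e in events if e.start <= t < e.start + e.duration]

print(active_edges(events, 2.6))  # [(0, 1), (1, 2), (0, 2)]
print(active_edges(events, 4.0))  # [(0, 1), (0, 2)]

A snapshot-based model would discretize this timeline into a few synchronized graphs and lose the fact that the three connections start and end at different moments.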

References

Showing 1–10 of 37 references

Time-capturing Dynamic Graph Embedding for Temporal Linkage Evolution

TLDR
This work models dynamic graphs as a sequence of snapshot graphs annotated with the respective timespans of edges (ToE); a linear regressor is co-trained to embed ToE while a matrix-factorization-based model infers a common latent space for all snapshot graphs to embed the vertices' dynamic connection changes.
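
As a rough illustration of this co-training idea (a hedged sketch with assumed toy data and equal loss weights, not the paper's model), one can jointly optimize a shared embedding matrix that reconstructs every snapshot and a linear regressor that predicts each edge's ToE from the endpoint embeddings:

# Hedged sketch: joint matrix factorization over snapshots plus a linear ToE
# regressor on edge embeddings. All names and toy data are assumptions.
import torch

n, d, T = 6, 4, 3                                    # vertices, dims, snapshots
A = torch.randint(0, 2, (T, n, n)).float()           # toy snapshot adjacencies
toe_edges = [(0, 1, 2.0), (1, 2, 0.5), (3, 4, 1.5)]  # (u, v, timespan) samples

U = torch.randn(n, d, requires_grad=True)            # shared vertex embeddings
w = torch.randn(d, requires_grad=True)               # linear ToE regressor
b = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([U, w, b], lr=0.05)

for step in range(200):
    opt.zero_grad()
    recon = torch.stack([U @ U.t()] * T)             # one latent space, all snapshots
    loss_mf = ((recon - A) ** 2).mean()
    loss_toe = torch.stack([((U[u] * U[v]) @ w + b - t).pow(2).squeeze()
                            for u, v, t in toe_edges]).mean()
    (loss_mf + loss_toe).backward()                  # co-training of both objectives
    opt.step()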

TemporalGAT: Attention-Based Dynamic Graph Representation Learning

TLDR
A deep attention model is proposed to learn low-dimensional feature representations that preserve the graph structure and features across a series of graph snapshots over time; it is competitive against various state-of-the-art methods.

Temporal Graph Networks for Deep Learning on Dynamic Graphs

TLDR
This paper presents Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events; TGNs significantly outperform previous approaches while being more computationally efficient.
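
The core idea of operating on a sequence of timed events can be sketched as follows. This is a deliberately simplified, hedged illustration (the message construction and the single GRU cell are stand-ins chosen here, not the authors' exact modules): each node keeps a memory vector that is updated whenever an interaction event involving it arrives.

# Simplified sketch of event-driven memory updates (not the TGN implementation).
import torch
import torch.nn as nn

d = 8
memory = torch.zeros(100, d)                        # one memory vector per node
updater = nn.GRUCell(input_size=2 * d + 1, hidden_size=d)

events = [(3, 7, 0.5), (7, 12, 1.2), (3, 12, 2.0)]  # (src, dst, timestamp)

with torch.no_grad():                               # gradient bookkeeping omitted
    for src, dst, t in events:
        # message: both endpoint memories plus the event time
        msg = torch.cat([memory[src], memory[dst], torch.tensor([t])]).unsqueeze(0)
        memory[src] = updater(msg, memory[src].unsqueeze(0)).squeeze(0)
        memory[dst] = updater(msg, memory[dst].unsqueeze(0)).squeeze(0)

# memory[i] now summarizes node i's interaction history up to the last event.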

A Data-Driven Graph Generative Model for Temporal Interaction Networks

TLDR
This work proposes an end-to-end deep generative framework named TagGen, which outperforms all baselines on the temporal interaction network generation problem and significantly boosts the performance of prediction models on anomaly detection and link prediction tasks.

Continuous-Time Dynamic Network Embeddings

TLDR
The proposed framework gives rise to methods for learning time-respecting embeddings from continuous-time dynamic networks; the results indicate that modeling temporal dependencies in graphs is important for learning appropriate and meaningful network representations.
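
The time-respecting constraint behind such embeddings can be shown with a small temporal random walk (a hedged sketch on assumed toy data, not the paper's code): each step may only traverse an edge whose timestamp is not earlier than the previous one.

# Time-respecting (temporal) random walk over a timestamped edge list.
import random

edges = [(0, 1, 1.0), (1, 2, 2.0), (1, 3, 1.5), (2, 3, 3.0), (3, 0, 0.5)]  # (u, v, timestamp)

def temporal_walk(start, length, edges):
    walk, t = [start], float("-inf")
    for _ in range(length):
        # candidate moves: edges leaving the current node no earlier than time t
        nxt = [(v, ts) for u, v, ts in edges if u == walk[-1] and ts >= t]
        if not nxt:
            break
        v, t = random.choice(nxt)
        walk.append(v)
    return walk

print(temporal_walk(0, 4, edges))  # e.g. [0, 1, 2, 3]

Walks generated this way can then be fed to a skip-gram style objective, as in static random-walk embedding methods, so that the learned representations respect the ordering of interactions.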

Dynamic Network Embedding by Modeling Triadic Closure Process

TLDR
This paper presents a novel representation learning approach, DynamicTriad, to preserve both structural information and evolution patterns of a given network, and demonstrates that, compared with several state-of-the-art techniques, this approach achieves substantial gains in several application scenarios.
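
The triadic closure process that DynamicTriad models can be illustrated with a toy check for open triads that close between consecutive snapshots (a hedged sketch with assumed data, not the paper's code):

# An open triad (u, v) shares a neighbour w but lacks the edge (u, v); the
# triad "closes" if that edge appears in the next snapshot.
from itertools import combinations

snap_t = {(0, 1), (1, 2), (2, 3)}   # undirected edges at time t
snap_t1 = snap_t | {(0, 2)}         # edge (0, 2) appears at time t+1

def has_edge(edges, a, b):
    return (a, b) in edges or (b, a) in edges

def open_triads(edges, nodes):
    """Pairs (u, v) with at least one common neighbour but no edge (u, v)."""
    triads = []
    for u, v in combinations(nodes, 2):
        if has_edge(edges, u, v):
            continue
        common = [w for w in nodes if has_edge(edges, u, w) and has_edge(edges, v, w)]
        if common:
            triads.append((u, v, common))
    return triads

for u, v, ws in open_triads(snap_t, range(4)):
    print(f"open triad ({u}, {v}) via {ws} -> closed at t+1: {has_edge(snap_t1, u, v)}")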

DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks

TLDR
DySAT is a novel neural architecture that learns node representations capturing dynamic graph structural evolution; an ablation study validates the effectiveness of jointly modeling structural and temporal self-attention.
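
The temporal half of that joint attention can be sketched compactly (a hedged illustration with assumed dimensions; the structural attention within each snapshot is omitted and this is not the DySAT code): a node's per-snapshot embeddings attend over earlier snapshots only.

# Temporal self-attention over one node's sequence of snapshot embeddings.
import torch
import torch.nn as nn

T, d = 5, 16                             # number of snapshots, embedding dim
x = torch.randn(1, T, d)                 # one node's embedding per snapshot
attn = nn.MultiheadAttention(embed_dim=d, num_heads=4, batch_first=True)

# causal mask: snapshot t may only attend to snapshots <= t
mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
out, _ = attn(x, x, x, attn_mask=mask)   # (1, T, d) time-aware embeddings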

EPNE: Evolutionary Pattern Preserving Network Embedding

TLDR
This paper proposes EPNE, a temporal network embedding model that uses causal convolutions to preserve the evolutionary patterns of nodes' local structure, together with a temporal objective function that is optimized jointly with proximity objectives so that both temporal and structural information are preserved.
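
A causal convolution over a node's embedding history is the basic building block referred to here; the following is a hedged sketch with assumed shapes (not the EPNE implementation), showing how left padding keeps each time step from seeing the future.

# Causal 1-D convolution over one node's embeddings across T time steps.
import torch
import torch.nn as nn
import torch.nn.functional as F

d, T, k = 8, 10, 3                       # channels, time steps, kernel size
history = torch.randn(1, d, T)           # one node's embedding history
conv = nn.Conv1d(in_channels=d, out_channels=d, kernel_size=k)

out = conv(F.pad(history, (k - 1, 0)))   # left padding: position t sees only <= t
print(out.shape)                         # torch.Size([1, 8, 10])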

DynGraphGAN: Dynamic Graph Embedding via Generative Adversarial Networks

TLDR
A Generative Adversarial Network (GAN)-based model, named DynGraphGAN, is proposed to learn robust feature representations that preserve spatial structure together with its temporal dependency; it demonstrates substantial gains over several baseline models on link prediction and reconstruction tasks on real-world datasets.
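
The adversarial setup can be sketched in a few lines (a hedged toy illustration with assumed architectures and dimensions, not DynGraphGAN itself): a generator proposes an adjacency snapshot conditioned on a time step, and a discriminator judges whether a time-stamped snapshot looks real.

# Toy generator/discriminator pair over flattened adjacency snapshots.
import torch
import torch.nn as nn

n, z_dim = 8, 16                                       # nodes, noise dimension
G = nn.Sequential(nn.Linear(z_dim + 1, 64), nn.ReLU(),
                  nn.Linear(64, n * n), nn.Sigmoid())  # noise + time -> edge probs
D = nn.Sequential(nn.Linear(n * n + 1, 64), nn.ReLU(),
                  nn.Linear(64, 1), nn.Sigmoid())      # snapshot + time -> real?

t = torch.tensor([[0.3]])                              # normalized time step
fake_adj = G(torch.cat([torch.randn(1, z_dim), t], dim=1))
score = D(torch.cat([fake_adj, t], dim=1))             # discriminator's verdict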