• Corpus ID: 235166384

Hypergraph Pre-training with Graph Neural Networks

  Boxin Du, Changhe Yuan, Robert A. Barton, Tal Neiman, Hanghang Tong
Despite the prevalence of hypergraphs in a variety of high-impact applications, there are relatively few works on hypergraph representation learning; most focus primarily on hyperlink prediction and are often restricted to the transductive learning setting. Among others, a major hurdle for effective hypergraph representation learning lies in the label scarcity of nodes and/or hyperedges. To address this issue, this paper presents an end-to-end, bi-level pre-training strategy with Graph… 

X-GOAL: Multiplex Heterogeneous Graph Prototypical Contrastive Learning

A novel multiplex heterogeneous graph prototypical contrastive learning (X-GOAL) framework to extract node embeddings is proposed, comprised of two components: the GOAL framework, which learns node embeddings for each homogeneous graph layer, and an alignment regularization, which jointly models different layers by aligning layer-specific node embeddings.

Augmentations in Hypergraph Contrastive Learning: Fabricated and Generative

Among the fabricated augmentations in HyperGCL, augmenting hyperedges provides the largest numerical gains, implying that higher-order structural information is usually more downstream-relevant. HyperGCL also boosts robustness and fairness in hypergraph representation learning.

SuGeR: A Subgraph-based Graph Convolutional Network Method for Bundle Recommendation

A subgraph-based Graph Neural Network model, SuGeR, is proposed for bundle recommendation to handle limitations of graph-level GNN methods, and significantly outperforms the state-of-the-art baselines in the basic and the transfer bundle recommendation tasks.

Graph Sanitation with Application to Node Classification

This paper formulates the graph sanitation problem as a bilevel optimization problem and instantiates it with semi-supervised node classification, together with an effective solver named GaSoliNe, which brings up to a 25% performance improvement over existing robust graph neural network methods.

Deep Hyperedges: a Framework for Transductive and Inductive Learning on Hypergraphs

This work proposes Deep Hyperedges (DHE), a modular framework that jointly uses contextual and permutation-invariant vertex membership properties of hyperedges in hypergraphs to perform classification and regression in transductive and inductive learning settings.

Hyper-SAGNN: a self-attention based graph neural network for hypergraphs

This work develops a new self-attention based graph neural network called Hyper-SAGNN applicable to homogeneous and heterogeneous hypergraphs with variable hyperedge sizes that significantly outperforms the state-of-the-art methods on traditional tasks while also achieving great performance on a new task called outsider identification.

Dynamic Hypergraph Neural Networks

A dynamic hypergraph neural network framework (DHGNN) composed of stacked layers of two modules, dynamic hypergraph construction (DHG) and hypergraph convolution (HGC), which outperforms state-of-the-art methods.

HyperGCN: A New Method For Training Graph Convolutional Networks on Hypergraphs

This work proposes HyperGCN, a novel GCN for SSL on attributed hypergraphs, and shows how it can be used as a learning-based approach for combinatorial optimisation on NP-hard hypergraph problems.

Structural Deep Embedding for Hyper-Networks

It is theoretically proved that any linear similarity metric in embedding space commonly used in existing methods cannot maintain the indecomposability property in hyper-networks; thus a new deep model is proposed to realize a non-linear tuplewise similarity function while preserving both local and global proximities in the formed embedding space.
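The core idea above, scoring a whole tuple of nodes with a non-linear function rather than a linear combination of pairwise similarities, can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the MLP shapes, the tanh/sigmoid choices, and the fixed tuple size are assumptions for illustration only.

```python
import numpy as np

def tuplewise_similarity(embeddings, W1, b1, w2, b2):
    """Non-linear tuplewise similarity for a candidate hyperedge (sketch).

    embeddings: list of 1-D node embedding vectors forming one fixed-size tuple.
    W1, b1:     hidden-layer parameters of a small MLP (assumed shapes).
    w2, b2:     output-layer parameters.
    Returns a scalar in (0, 1): the score that the tuple forms a hyperedge.
    """
    z = np.concatenate(embeddings)   # tuplewise input: all members at once
    h = np.tanh(W1 @ z + b1)         # the non-linearity is the point: a purely
                                     # linear score cannot preserve the
                                     # indecomposability of hyperedges
    logit = w2 @ h + b2
    return float(1.0 / (1.0 + np.exp(-logit)))
```

A linear replacement for the hidden layer would make the score decompose into a sum of per-node terms, which is exactly the degenerate case the paper rules out.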

Hypergraph Neural Networks

A hypergraph neural network framework for data representation learning that encodes high-order data correlation in a hypergraph structure using a hyperedge convolution operation, and outperforms recent state-of-the-art methods.
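The hyperedge convolution in HGNN propagates node features through the incidence matrix, node-to-hyperedge and back, with degree normalization: X' = σ(D_v^{-1/2} H W D_e^{-1} Hᵀ D_v^{-1/2} X Θ). A minimal numpy sketch of one such layer follows; identity edge weights and a ReLU activation are assumptions here, not prescribed by the listing above.

```python
import numpy as np

def hyperedge_convolution(X, H, Theta, edge_weights=None):
    """One hyperedge-convolution layer in the style of HGNN (sketch).

    X:     (n_nodes, d_in) node feature matrix
    H:     (n_nodes, n_edges) incidence matrix, H[v, e] = 1 if node v is in e
    Theta: (d_in, d_out) learnable weight matrix
    """
    n_nodes, n_edges = H.shape
    W = np.ones(n_edges) if edge_weights is None else edge_weights
    Dv = (H * W).sum(axis=1)                  # weighted node degrees
    De = H.sum(axis=0)                        # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))  # assumes every node is in an edge
    De_inv = np.diag(1.0 / De)
    # normalized node -> hyperedge -> node propagation operator
    A = Dv_inv_sqrt @ H @ np.diag(W) @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)     # ReLU activation
```

Because hyperedges can join any number of nodes, this single operation mixes information among all co-members of a hyperedge, which is how the high-order correlation is captured.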

Self-supervised Learning on Graphs: Deep Insights and New Direction

Inspired by deep insights from the empirical studies, a new direction SelfTask is proposed to build advanced pretext tasks that are able to achieve state-of-the-art performance on various real-world datasets.

Strategies for Pre-training Graph Neural Networks

A new strategy and self-supervised methods for pre-training Graph Neural Networks (GNNs) that avoids negative transfer and improves generalization significantly across downstream tasks, leading up to 9.4% absolute improvements in ROC-AUC over non-pre-trained models and achieving state-of-the-art performance for molecular property prediction and protein function prediction.

Hypergraph Link Prediction: Learning Drug Interaction Networks Embeddings

  • M. Vaida, Kevin Purcell
  • Computer Science
    2019 18th IEEE International Conference On Machine Learning And Applications (ICMLA)
  • 2019
This paper introduces Hypergraph Link Prediction (HLP), a novel approach to encoding the multilink structure of graphs that allows pooling operations to incorporate a 360-degree overview of a node's interaction profile by learning local neighborhood and global hypergraph structure simultaneously.

Learning to Pre-train Graph Neural Networks

L2P-GNN is proposed, a self-supervised pre-training strategy for GNNs that attempts to learn how to fine-tune during the pre-training process in the form of transferable prior knowledge.