Corpus ID: 235166384

Hypergraph Pre-training with Graph Neural Networks

@article{Du2021HypergraphPW,
  title={Hypergraph Pre-training with Graph Neural Networks},
  author={Boxin Du and Changhe Yuan and Robert A. Barton and Tal Neiman and Hanghang Tong},
  journal={ArXiv},
  year={2021},
  volume={abs/2105.10862}
}
Despite the prevalence of hypergraphs in a variety of high-impact applications, there are relatively few works on hypergraph representation learning; most of them focus primarily on hyperlink prediction and are often restricted to the transductive learning setting. Among others, a major hurdle for effective hypergraph representation learning lies in the label scarcity of nodes and/or hyperedges. To address this issue, this paper presents an end-to-end, bi-level pre-training strategy with Graph…

X-GOAL: Multiplex Heterogeneous Graph Prototypical Contrastive Learning

TLDR
A novel multiplex heterogeneous graph prototypical contrastive learning (X-GOAL) framework to extract node embeddings is proposed, comprised of two components: the GOAL framework, which learns node embeddings for each homogeneous graph layer, and an alignment regularization, which jointly models different layers by aligning layer-specific node embeddings.

Graph Sanitation with Application to Node Classification

TLDR
This paper formulates the graph sanitation problem as a bilevel optimization problem and further instantiates it by semi-supervised node classification, together with an effective solver named GaSoliNe, which brings up to a 25% performance improvement over existing robust graph neural network methods.

References

SHOWING 1-10 OF 39 REFERENCES

Deep Hyperedges: a Framework for Transductive and Inductive Learning on Hypergraphs

TLDR
This work proposes Deep Hyperedges (DHE), a modular framework that jointly uses contextual and permutation-invariant vertex membership properties of hyperedges in hypergraphs to perform classification and regression in transductive and inductive learning settings.

Hyper-SAGNN: a self-attention based graph neural network for hypergraphs

TLDR
This work develops a new self-attention based graph neural network called Hyper-SAGNN applicable to homogeneous and heterogeneous hypergraphs with variable hyperedge sizes that significantly outperforms the state-of-the-art methods on traditional tasks while also achieving great performance on a new task called outsider identification.

Dynamic Hypergraph Neural Networks

TLDR
A dynamic hypergraph neural networks framework (DHGNN), which is composed of stacked layers of two modules: dynamic hypergraph construction (DHG) and hypergraph convolution (HGC), and which outperforms state-of-the-art methods.

HyperGCN: A New Method For Training Graph Convolutional Networks on Hypergraphs

TLDR
This work proposes HyperGCN, a novel GCN for SSL on attributed hypergraphs, and shows how it can be used as a learning-based approach for combinatorial optimisation on NP-hard hypergraph problems.

Hypergraph Neural Networks

TLDR
A hypergraph neural networks framework for data representation learning, which can encode high-order data correlation in a hypergraph structure using a hyperedge convolution operation, and which outperforms recent state-of-the-art methods.
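The hyperedge convolution summarized above can be written as X' = σ(Dv^{-1/2} H W De^{-1} Hᵀ Dv^{-1/2} X Θ), where H is the node–hyperedge incidence matrix. A minimal NumPy sketch of one such layer follows; the incidence matrix, weights, and function name are illustrative toy values, not the paper's reference implementation.

```python
import numpy as np

def hgnn_conv(H, X, Theta, w=None):
    """One hyperedge-convolution layer (sketch).

    H:     (n_nodes, n_edges) incidence matrix; H[v, e] = 1 if node v is in hyperedge e
    X:     (n_nodes, in_dim) node features
    Theta: (in_dim, out_dim) layer weights
    w:     (n_edges,) hyperedge weights (defaults to all ones)
    """
    n, m = H.shape
    if w is None:
        w = np.ones(m)
    W = np.diag(w)
    Dv = H @ w                          # weighted node degrees
    De = H.sum(axis=0)                  # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    # Symmetrically normalized hypergraph propagation operator
    A = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)   # ReLU nonlinearity

# Toy hypergraph: 4 nodes, 2 hyperedges {0, 1, 2} and {2, 3}
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)
X = np.eye(4)                           # one-hot node features
Theta = np.full((4, 2), 0.5)            # fixed toy weights
out = hgnn_conv(H, X, Theta)
print(out.shape)                        # (4, 2)
```

The propagation step smooths node features across all nodes sharing a hyperedge, which is what lets the layer capture high-order (beyond pairwise) correlations.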

Self-supervised Learning on Graphs: Deep Insights and New Direction

TLDR
Inspired by deep insights from the empirical studies, a new direction SelfTask is proposed to build advanced pretext tasks that are able to achieve state-of-the-art performance on various real-world datasets.

Strategies for Pre-training Graph Neural Networks

TLDR
A new strategy and self-supervised methods for pre-training Graph Neural Networks (GNNs) that avoids negative transfer and improves generalization significantly across downstream tasks, leading up to 9.4% absolute improvements in ROC-AUC over non-pre-trained models and achieving state-of-the-art performance for molecular property prediction and protein function prediction.

Hypergraph Link Prediction: Learning Drug Interaction Networks Embeddings

  • M. Vaida, Kevin Purcell
  • Computer Science
    2019 18th IEEE International Conference On Machine Learning And Applications (ICMLA)
  • 2019
TLDR
This paper introduces Hypergraph Link Prediction (HLP), a novel approach to encoding the multilink structure of graphs that allows pooling operations to incorporate a 360-degree overview of a node's interaction profile, by learning local neighborhood and global hypergraph structure simultaneously.

How Much and When Do We Need Higher-order Information in Hypergraphs? A Case Study on Hyperedge Prediction

TLDR
This work proposes a method of incrementally representing group interactions using a notion of n-projected graph whose accumulation contains information on up to n-way interactions, and quantifies the accuracy of solving a task as n grows for various datasets.
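The simplest member of the n-projected family described above is the 2-projected graph, which keeps only the pairwise interactions induced by each hyperedge (i.e., a clique expansion). A small sketch, with illustrative names, assuming hyperedges are given as sets of node ids:

```python
from itertools import combinations

def two_projected_graph(hyperedges):
    """Return the set of pairwise edges induced by a list of hyperedges.

    Each hyperedge on k nodes contributes all C(k, 2) node pairs,
    discarding any information about 3-way and higher interactions.
    """
    edges = set()
    for he in hyperedges:
        for u, v in combinations(sorted(he), 2):
            edges.add((u, v))
    return edges

hyperedges = [{0, 1, 2}, {2, 3}]
print(sorted(two_projected_graph(hyperedges)))
# [(0, 1), (0, 2), (1, 2), (2, 3)]
```

Higher n-projected graphs would analogously accumulate up-to-n-way interactions; the point of the study is to quantify how much task accuracy improves as n grows.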

Beyond Link Prediction: Predicting Hyperlinks in Adjacency Space

TLDR
A new algorithm called Coordinated Matrix Minimization (CMM) is proposed, which alternately performs nonnegative matrix factorization and least square matching in the vertex adjacency space of the hypernetwork, in order to infer a subset of candidate hyperlinks that are most suitable to fill the training hypernetwork.