Corpus ID: 232404318

A nonlinear diffusion method for semi-supervised learning on hypergraphs

Francesco Tudisco, Konstantin Prokopchik, Austin R. Benson
Hypergraphs are a common model for multiway relationships in data, and hypergraph semi-supervised learning is the problem of assigning labels to all nodes in a hypergraph, given labels on just a few nodes. Diffusions and label spreading are classical techniques for semi-supervised learning in the graph setting, and there are some standard ways to extend them to hypergraphs. However, these methods are linear models, and do not offer an obvious way of incorporating node features for making…
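To make the classical baseline concrete, here is a minimal sketch of linear graph label spreading (the iteration F ← αSF + (1−α)Y with S the symmetrically normalized adjacency), which the abstract contrasts with the paper's nonlinear hypergraph method. The toy graph, seed labels, and α below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def label_spreading(A, Y, alpha=0.9, iters=100):
    """Iterate F <- alpha * S @ F + (1 - alpha) * Y, where
    S = D^{-1/2} A D^{-1/2} is the normalized adjacency."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    S = A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    F = Y.astype(float).copy()
    for _ in range(iters):
        F = alpha * S @ F + (1 - alpha) * Y
    return F.argmax(axis=1)  # predicted class per node

# Toy example: two triangles joined by a single edge, one seed per cluster.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
Y = np.zeros((6, 2))
Y[0, 0] = 1.0  # node 0 seeded with class 0
Y[5, 1] = 1.0  # node 5 seeded with class 1
labels = label_spreading(A, Y)  # nodes 0-2 get class 0, nodes 3-5 class 1
```

The point of contrast with the paper: this update is linear in F, so it cannot exploit node features or nonlinear mixing across hyperedges.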


Equivariant Hypergraph Diffusion Neural Operators

This work proposes a new HNN architecture named ED-HNN, which provably approximates any continuous equivariant hypergraph diffusion operator, can model a wide range of higher-order relations, and shows strong performance on heterophilic hypergraphs and in constructing deep models.

Information Limits for Community Detection in Hypergraph with Label Information

This work investigates the effect of label information on the exact recovery of communities in an m-uniform Hypergraph Stochastic Block Model (HSBM) and derives sharp boundaries for exact recovery under both scenarios from an information-theoretic point of view.

Core-periphery detection in hypergraphs

This work proposes a core-periphery model for higher-order networks modeled as hypergraphs, along with a method for computing a core-score vector that quantifies how close each node is to the core.

A flexible PageRank-based graph embedding framework closely related to spectral eigenvector embeddings

The general nature of this embedding strategy opens up many emerging applications, where eigenvector and spectral techniques may not be well established, to their PageRank-based relatives; for instance, similar techniques can be used on PageRank vectors from hypergraphs to obtain "spectral-like" embeddings.

You are AllSet: A Multiset Function Framework for Hypergraph Neural Networks

AllSet is proposed, a new hypergraph neural network paradigm that represents a highly general framework for (hyper)graph neural networks and, for the first time, implements hypergraph neural network layers as compositions of two multiset functions that can be efficiently learned for each task and each dataset.

A Survey on Hyperlink Prediction

A new taxonomy is proposed to classify existing hyperlink prediction methods into four categories: similarity-based, probability-based, matrix-optimization-based, and deep-learning-based methods.

Strongly Local Hypergraph Diffusions for Clustering and Semi-supervised Learning

A new diffusion-based hypergraph clustering algorithm is proposed that solves a quadratic hypergraph-cut-based objective, a hypergraph analog of Andersen-Chung-Lang personalized PageRank clustering for graphs. It is proved that, for graphs with fixed maximum hyperedge size, this method is strongly local, meaning that its runtime depends only on the size of the output instead of the size of the hypergraph, and it is highly scalable.

Using Local Spectral Methods to Robustify Graph-Based Learning Algorithms

This work studies robustness with respect to the details of graph constructions, errors in node labeling, degree variability, and a variety of other real-world heterogeneities, analyzing these methods through a precise relationship with mincut problems.

Nonlinear Diffusion for Community Detection and Semi-Supervised Learning

This paper illustrates a class of nonlinear graph diffusions that are competitive with state-of-the-art embedding techniques and outperform classic diffusions, and demonstrates the benefits of these techniques on a variety of synthetic and real-world data.

HyperGCN: A New Method For Training Graph Convolutional Networks on Hypergraphs

This work proposes HyperGCN, a novel GCN for SSL on attributed hypergraphs, and shows how it can be used as a learning-based approach for combinatorial optimisation on NP-hard hypergraph problems.

Learning over Families of Sets - Hypergraph Representation Learning for Higher Order Tasks

This work exploits the incidence structure to develop a hypergraph neural network that learns provably expressive representations of variable-sized hyperedges, which preserve local isomorphism in the line graph of the hypergraph while also being invariant to permutations of their constituent vertices.

Nonlinear Higher-Order Label Spreading

This work proves convergence of the nonlinear higher-order label spreading algorithm to the global solution of an interpretable semi-supervised loss function and demonstrates the efficiency and efficacy of the approach on a variety of point cloud and network datasets.

A Unifying Generative Model for Graph Learning Algorithms: Label Propagation, Graph Convolutions, and Combinations

A Markov random field model is developed for the data generation process of node attributes, based on correlations of attributes on and between vertices, that motivates and unifies these algorithmic approaches to semi-supervised learning on graphs.

Minimizing Localized Ratio Cut Objectives in Hypergraphs

This work presents a framework for local hypergraph clustering based on minimizing localized ratio cut objectives and uses it to effectively identify clusters in hypergraphs of real-world data with millions of nodes, millions of hyperedges, and large average hyperedge size with runtimes ranging between a few seconds and a few minutes.

Learning with Hypergraphs: Clustering, Classification, and Embedding

This paper generalizes the powerful methodology of spectral clustering which originally operates on undirected graphs to hypergraphs, and further develop algorithms for hypergraph embedding and transductive classification on the basis of the spectral hypergraph clustering approach.
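The spectral generalization described above rests on a normalized hypergraph Laplacian of the form L = I − Dv^{-1/2} H W De^{-1} Hᵀ Dv^{-1/2}, with H the incidence matrix, W hyperedge weights, and Dv, De the node and hyperedge degree matrices. Below is a hedged sketch of that construction; the tiny two-hyperedge example is an illustrative assumption, not data from the paper.

```python
import numpy as np

def hypergraph_laplacian(H, w):
    """Normalized hypergraph Laplacian L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}.

    H: n x m incidence matrix (H[v, e] = 1 if node v belongs to hyperedge e).
    w: length-m vector of hyperedge weights.
    """
    dv = H @ w                # weighted node degrees
    de = H.sum(axis=0)        # hyperedge sizes
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    Theta = Dv_inv_sqrt @ H @ np.diag(w) @ np.diag(1.0 / de) @ H.T @ Dv_inv_sqrt
    return np.eye(H.shape[0]) - Theta

# Two overlapping hyperedges {0,1,2} and {2,3,4} with unit weights.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1],
              [0, 1]], dtype=float)
L = hypergraph_laplacian(H, np.ones(2))
vals, vecs = np.linalg.eigh(L)
# The second eigenvector (Fiedler-like) separates the two hyperedges,
# placing the shared node 2 at zero.
```

Spectral hypergraph clustering then embeds nodes with the bottom eigenvectors of L and runs a standard clustering (e.g. k-means) on the embedding.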

Hypergraph p-Laplacian: A Differential Geometry View

The proposed p-Laplacian, formalized as an analogue of the Dirichlet problem, is shown to outperform standard hypergraph Laplacians in experiments on hypergraph semi-supervised learning and normalized cut settings.