Corpus ID: 239998097

Graph Posterior Network: Bayesian Predictive Uncertainty for Node Classification

@article{Stadler2021GraphPN,
  title={Graph Posterior Network: Bayesian Predictive Uncertainty for Node Classification},
  author={Maximilian Stadler and Bertrand Charpentier and Simon Geisler and Daniel Z{\"u}gner and Stephan G{\"u}nnemann},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.14012}
}
The interdependence between nodes in graphs is key to improving class predictions on nodes and is utilized in approaches like Label Propagation (LP) or Graph Neural Networks (GNNs). Nonetheless, uncertainty estimation for non-independent node-level predictions is under-explored. In this work, we explore uncertainty quantification for node classification in three ways: (1) We derive three axioms explicitly characterizing the expected predictive uncertainty behavior in homophilic attributed graphs…
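As a concrete illustration of the node interdependence the abstract refers to, here is a minimal sketch of Label Propagation with a symmetrically normalized adjacency matrix. This is an illustrative implementation, not the paper's method; all function and variable names are hypothetical.

```python
import numpy as np

def label_propagation(adj, labels, mask, alpha=0.9, num_iters=50):
    """Minimal Label Propagation sketch (hypothetical helper, not the paper's code).

    adj:    (n, n) symmetric adjacency matrix
    labels: (n, c) one-hot labels; rows for unlabeled nodes are zero
    mask:   (n,) boolean, True where the label is known
    alpha:  propagation weight; (1 - alpha) retains the seed labels
    """
    deg = adj.sum(axis=1)
    deg[deg == 0] = 1.0
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    # Symmetrically normalized adjacency D^{-1/2} A D^{-1/2}
    a_norm = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    z = labels.astype(float).copy()
    for _ in range(num_iters):
        z = alpha * (a_norm @ z) + (1 - alpha) * labels
        z[mask] = labels[mask]  # clamp the known labels each step
    return z
```

Each unlabeled node's score is a weighted average of its neighbors' scores, which is exactly the kind of non-independent prediction whose uncertainty the paper studies.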

References

Showing 1–10 of 111 references
Node classification in uncertain graphs
This paper proposes two techniques based on a Bayes model, shows the benefits of incorporating uncertainty into the classification process as a first-class citizen, and demonstrates the effectiveness of these approaches experimentally.
Bayesian Graph Convolutional Neural Networks Using Non-Parametric Graph Learning
This paper proposes a non-parametric generative model for graphs, incorporates it within the BGCN framework, and effectively uses the node features and training labels in the posterior inference of graphs, attaining superior or comparable performance on benchmark node classification tasks.
Bayesian graph convolutional neural networks for semi-supervised classification
A Bayesian GCNN framework is presented and an iterative learning procedure for the case of assortative mixed-membership stochastic block models is developed, demonstrating that the Bayesian formulation can provide better performance when there are very few labels available during the training process.
The Power of Certainty: A Dirichlet-Multinomial Model for Belief Propagation
This work formalizes axioms that any node classification algorithm should obey and proposes NetConf, which satisfies these axioms and handles arbitrary network effects (homophily/heterophily) at scale.
Bayesian Graph Convolutional Neural Networks using Node Copying
This paper introduces an alternative generative model for graphs based on copying nodes, incorporates it within the BGCN framework, and shows that the proposed algorithm compares favorably to the state of the art on benchmark node classification tasks.
Deep Gaussian Embedding of Graphs: Unsupervised Inductive Learning via Ranking
Graph2Gauss is proposed, an approach that can efficiently learn versatile node embeddings on large-scale (attributed) graphs; the embeddings show strong performance on tasks such as link prediction and node classification, and the benefits of modeling uncertainty are demonstrated.
Predict then Propagate: Graph Neural Networks meet Personalized PageRank
This paper uses the relationship between graph convolutional networks (GCN) and PageRank to derive an improved propagation scheme based on personalized PageRank, and constructs a simple model, personalized propagation of neural predictions (PPNP), and its fast approximation, APPNP.
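The propagation scheme summarized above can be sketched as a power iteration of personalized PageRank applied to a base model's predictions. This is a hedged illustration of the APPNP-style update, not the authors' implementation; the function name and signature are assumptions.

```python
import numpy as np

def appnp_propagate(a_norm, h, alpha=0.1, num_iters=10):
    """APPNP-style propagation sketch (hypothetical helper).

    a_norm: (n, n) symmetrically normalized adjacency (with self-loops)
    h:      (n, c) per-node predictions from any base model f_theta(x)
    alpha:  teleport probability back to the original predictions
    """
    z = h.copy()
    for _ in range(num_iters):
        # Z <- (1 - alpha) * A_norm @ Z + alpha * H
        z = (1 - alpha) * (a_norm @ z) + alpha * h
    return z
```

Decoupling prediction (the base model producing `h`) from propagation is the design choice the summary highlights: the graph is only used in this post-hoc diffusion step.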
Uncertainty Aware Graph Gaussian Process for Semi-Supervised Learning
An Uncertainty-aware Graph Gaussian Process approach (UaGGP) for graph-based semi-supervised learning is proposed, in which prediction uncertainty and label-smoothness regularization guide each other during learning.
Combining Label Propagation and Simple Models Out-performs Graph Neural Networks
This work shows that for many standard transductive node classification benchmarks, it can exceed or match the performance of state-of-the-art GNNs by combining shallow models that ignore the graph structure with two simple post-processing steps that exploit correlation in the label structure.
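The two post-processing steps this summary mentions can be sketched as graph diffusions of, first, the training-set residual errors and, second, the corrected predictions. This is a rough illustration under assumed names; the paper's exact scaling of the correction step differs.

```python
import numpy as np

def smooth(a_norm, signal, alpha=0.8, num_iters=20):
    """Diffuse a per-node signal over the graph (hypothetical helper)."""
    out = signal.copy()
    for _ in range(num_iters):
        out = alpha * (a_norm @ out) + (1 - alpha) * signal
    return out

def correct_and_smooth(a_norm, base_preds, labels, train_mask):
    """Sketch of the two post-processing steps described above."""
    # Step 1 ("correct"): propagate the residual errors on the training nodes.
    residual = np.zeros_like(base_preds)
    residual[train_mask] = labels[train_mask] - base_preds[train_mask]
    corrected = base_preds + smooth(a_norm, residual)
    # Step 2 ("smooth"): clamp ground truth on train nodes, then diffuse.
    corrected[train_mask] = labels[train_mask]
    return smooth(a_norm, corrected)
```

The shallow base model never sees the graph; all structural information enters through these two cheap diffusion passes.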
BayesGrad: Explaining Predictions of Graph Convolutional Networks
It is demonstrated that BayesGrad successfully visualizes the substructures responsible for the label prediction in the artificial experiment, even when the sample size is small.