Quantifying the Alignment of Graph and Features in Deep Learning

@article{Qian2022QuantifyingTA,
  title={Quantifying the Alignment of Graph and Features in Deep Learning},
  author={Yifan Qian and Paul Expert and Tom Rieu and Pietro Panzarasa and Mauricio Barahona},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  year={2022},
  volume={33},
  pages={1663-1672}
}
We show that the classification performance of graph convolutional networks (GCNs) is related to the alignment between features, graph, and ground truth, which we quantify using a subspace alignment measure (SAM) corresponding to the Frobenius norm of the matrix of pairwise chordal distances between three subspaces associated with features, graph, and ground truth. The proposed measure is based on the principal angles between subspaces and has both spectral and geometrical interpretations. We… 
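The measure described in the abstract lends itself to a short computation; the sketch below is a minimal illustration, not the authors' code. It computes chordal distances from principal angles (via scipy.linalg.subspace_angles) and takes the Frobenius norm of the 3x3 pairwise-distance matrix. How each subspace is constructed here (leading singular vectors of the feature matrix, leading eigenvectors of the adjacency matrix, the one-hot label matrix) is an assumption for illustration only, not the paper's exact protocol.

```python
# Minimal sketch of a SAM-style computation (assumed subspace construction, see note above).
import numpy as np
from scipy.linalg import subspace_angles

def chordal_distance(A, B):
    """sqrt(sum_i sin^2(theta_i)) over the principal angles between the column spaces of A and B."""
    theta = subspace_angles(A, B)
    return np.sqrt(np.sum(np.sin(theta) ** 2))

def sam(feat_sub, graph_sub, truth_sub):
    """Frobenius norm of the pairwise chordal-distance matrix of three subspaces."""
    subs = [feat_sub, graph_sub, truth_sub]
    D = np.zeros((3, 3))
    for i in range(3):
        for j in range(3):
            if i != j:
                D[i, j] = chordal_distance(subs[i], subs[j])
    return np.linalg.norm(D, "fro")

# Toy usage with hypothetical subspace choices.
rng = np.random.default_rng(0)
n, d, k = 100, 20, 4
X = rng.normal(size=(n, d))                          # node features
A = (rng.random((n, n)) < 0.05).astype(float)
A = np.triu(A, 1); A = A + A.T                       # symmetric adjacency
Y = np.eye(k)[rng.integers(0, k, size=n)]            # one-hot ground truth

feat_sub = np.linalg.svd(X, full_matrices=False)[0][:, :k]   # top-k left singular vectors
graph_sub = np.linalg.eigh(A)[1][:, -k:]                     # top-k adjacency eigenvectors
print("SAM:", sam(feat_sub, graph_sub, Y))
```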

Citations

Graph convolutional and attention models for entity classification in multilayer networks

TLDR
This work instantiates a GNN framework for representation learning and semi-supervised classification in multilayer networks with attributed entities, an arbitrary number of layers, and intra-layer and inter-layer connections between nodes, and shows how these methods are able to take advantage of the presence of real attributes for the entities.

Graph convolutional networks fusing motif-structure information

TLDR
This model fuses the motif-structure information of nodes to learn the feature aggregation weights, which enables nodes to aggregate higher-order network information and thus improves the capability of the GCN model.

Understanding deep learning via decision boundary

TLDR
It is discovered that a neural network with lower decision boundary (DB) variability has better generalizability, and an upper bound of order O(1/√m + ε + η log(1/η)) based on data DB variability is proved.

Graph-based representation for identifying individual travel activities with spatiotemporal trajectories and POI data

TLDR
A graph-based representation of spatiotemporal trajectories and point-of-interest (POI) data for travel activity type identification, termed Gstp2Vec, which significantly reduces feature engineering efforts, enhances model generalizability, and obtains better efficiency and robustness.

DGMP: Identifying Cancer Driver Genes by Jointing DGCN and MLP from Multi-Omics Genomic Data

TLDR
DGMP is a novel method to identify cancer driver genes by jointly using a Directed Graph Convolution Network (DGCN) and a Multilayer Perceptron (MLP): it learns the multi-omics features of genes as well as the topological structure features in the GRN with the DGCN, and uses the MLP to place more weight on gene features, mitigating the bias toward graph topological features in the DGCN learning process.

Characterising contact in disease outbreaks via a network model of spatial-temporal proximity

TLDR
The StEP model reveals missing contacts that connect seemingly separate outbreaks, improving the characterisation of disease transmission, and highlights how the StEP framework can inform effective strategies for infection control and prevention.

Semi-supervised classification on graphs using explicit diffusion dynamics

TLDR
It is shown that appending graph diffusion to feature-based learning as an a posteriori refinement achieves state-of-the-art classification accuracy.
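A minimal sketch of that idea under assumptions of my own: take the soft predictions of a classifier that ignores the graph, then refine them a posteriori with a graph diffusion. The heat kernel expm(-tL) and the diffusion time t below are illustrative choices, not necessarily the dynamics used in the cited work.

```python
# Minimal sketch: a posteriori diffusion of classifier outputs (assumed operator and time).
import numpy as np
from scipy.linalg import expm

def diffuse_predictions(A, P, t=0.5):
    """Smooth per-node class probabilities P with the heat kernel exp(-t L)."""
    L = np.diag(A.sum(axis=1)) - A        # combinatorial graph Laplacian
    return expm(-t * L) @ P

# Usage: P stands in for the soft-max outputs of a feature-only model.
rng = np.random.default_rng(3)
n = 60
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.triu(A, 1); A = A + A.T
P = rng.dirichlet(np.ones(3), size=n)
labels = diffuse_predictions(A, P).argmax(axis=1)   # refined class assignments
print(labels[:10])
```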

Analysis of Nomadic Civilization in Northern Grassland in Plastic Arts Based on Deep Learning

In this era of rapid development, exchanges between countries are increasing quickly, leading to the blending of multiple cultures, which impacts local culture and dilutes it.

References

SHOWING 1-10 OF 36 REFERENCES

Semi-Supervised Classification with Graph Convolutional Networks

TLDR
A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operate directly on graphs, which outperforms related methods by a significant margin.
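For concreteness, here is a minimal numpy sketch of the layer-wise propagation rule introduced in that paper, H' = ReLU(D^{-1/2}(A+I)D^{-1/2} H W); the random weights and the two-layer wiring are illustrative assumptions rather than the reference implementation.

```python
# Minimal sketch of GCN propagation (illustrative weights, no training loop).
import numpy as np

def normalise_adjacency(A):
    """Symmetrically normalised adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_layer(S, H, W, activation=True):
    """One propagation step: ReLU(S @ H @ W); the last layer is typically linear."""
    Z = S @ H @ W
    return np.maximum(Z, 0.0) if activation else Z

# Toy two-layer forward pass for node classification.
rng = np.random.default_rng(1)
n, d, h, c = 50, 16, 8, 3
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.triu(A, 1); A = A + A.T
X = rng.normal(size=(n, d))
W1, W2 = 0.1 * rng.normal(size=(d, h)), 0.1 * rng.normal(size=(h, c))

S = normalise_adjacency(A)
logits = gcn_layer(S, gcn_layer(S, X, W1), W2, activation=False)
print(logits.shape)  # (n, c) per-node class scores
```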

Deep Convolutional Networks on Graph-Structured Data

TLDR
This paper develops an extension of Spectral Networks that incorporates a Graph Estimation procedure, tested on large-scale classification problems, matching or improving over Dropout Networks with far fewer parameters to estimate.

How Powerful are Graph Neural Networks?

TLDR
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among this class of GNNs.
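A minimal numpy sketch in the spirit of that provably expressive architecture (a GIN-style layer): neighbour features are summed, the node's own features are added scaled by (1 + eps), and the result passes through an MLP. The MLP widths, weights, and eps value are illustrative assumptions.

```python
# Minimal sketch of a GIN-style layer (illustrative sizes and weights).
import numpy as np

def gin_layer(A, H, W1, b1, W2, b2, eps=0.0):
    """H' = MLP((1 + eps) * H + A @ H), with a one-hidden-layer ReLU MLP."""
    agg = (1.0 + eps) * H + A @ H          # injective sum aggregation
    hidden = np.maximum(agg @ W1 + b1, 0.0)
    return hidden @ W2 + b2

rng = np.random.default_rng(4)
n, d, h = 30, 8, 16
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T
H = rng.normal(size=(n, d))
W1, b1 = 0.1 * rng.normal(size=(d, h)), np.zeros(h)
W2, b2 = 0.1 * rng.normal(size=(h, d)), np.zeros(d)
print(gin_layer(A, H, W1, b1, W2, b2).shape)   # (n, d) updated node features
```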

Simplifying Graph Convolutional Networks

TLDR
This paper successively removes nonlinearities and collapses weight matrices between consecutive layers, theoretically analyzes the resulting linear model, and shows that it corresponds to a fixed low-pass filter followed by a linear classifier.
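A minimal sketch of that simplification: with the nonlinearities removed, a K-layer GCN collapses to the fixed low-pass filter S^K applied to the features, followed by a single linear classifier. Using scikit-learn's LogisticRegression and K=2 here are illustrative assumptions.

```python
# Minimal sketch of the collapsed (SGC-style) model: S^K X then a linear classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression

def smoothed_features(A, X, K=2):
    """Apply the normalised adjacency with self-loops K times to the features."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    S = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    H = X.copy()
    for _ in range(K):                     # fixed low-pass filtering, no learned weights
        H = S @ H
    return H

# Usage: smooth once, then fit a plain linear classifier on the labelled nodes.
rng = np.random.default_rng(2)
n, d = 80, 10
A = (rng.random((n, n)) < 0.08).astype(float)
A = np.triu(A, 1); A = A + A.T
X = rng.normal(size=(n, d))
y = rng.integers(0, 3, size=n)
train = np.arange(40)

H = smoothed_features(A, X, K=2)
clf = LogisticRegression(max_iter=1000).fit(H[train], y[train])
print("train accuracy:", clf.score(H[train], y[train]))
```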

A Comprehensive Survey on Graph Neural Networks

TLDR
This article provides a comprehensive overview of graph neural networks (GNNs) in data mining and machine learning fields and proposes a new taxonomy to divide the state-of-the-art GNNs into four categories, namely, recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial–temporal GNNs.

Learning Convolutional Neural Networks for Graphs

TLDR
This work proposes a framework for learning convolutional neural networks for arbitrary graphs that operate on locally connected regions of the input and demonstrates that the learned feature representations are competitive with state-of-the-art graph kernels and that their computation is highly efficient.

Spectral Networks and Locally Connected Networks on Graphs

TLDR
This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.
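A minimal sketch of the spectral construction mentioned here: a filter is parameterised by free coefficients over the graph Laplacian's spectrum and applied to a signal in the Laplacian eigenbasis. The low-pass coefficients below are an illustrative choice; in the cited work they are learned.

```python
# Minimal sketch of filtering a graph signal in the Laplacian eigenbasis.
import numpy as np

def spectral_filter(A, x, theta):
    """y = U diag(theta) U^T x, with U the eigenvectors of the graph Laplacian."""
    L = np.diag(A.sum(axis=1)) - A
    _, U = np.linalg.eigh(L)
    return U @ (theta * (U.T @ x))

rng = np.random.default_rng(7)
n = 30
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T
x = rng.normal(size=n)
theta = np.exp(-0.5 * np.arange(n))     # damp high-frequency modes (low-pass)
print(spectral_filter(A, x, theta)[:5])
```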

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

TLDR
This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
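A minimal sketch of the localized filtering that TLDR refers to: a K-order Chebyshev polynomial of the rescaled normalised Laplacian applied to a graph signal, so no eigendecomposition is needed. The coefficients theta are illustrative; in the cited work they are learned.

```python
# Minimal sketch of Chebyshev-polynomial graph filtering (illustrative coefficients).
import numpy as np

def chebyshev_filter(A, x, theta):
    """Apply sum_k theta[k] * T_k(L_tilde) x using the recurrence T_k = 2 L_tilde T_{k-1} - T_{k-2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    L = np.eye(len(d)) - A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    L_tilde = L - np.eye(len(d))          # spectrum of L lies in [0, 2] -> rescale to [-1, 1]

    T_prev, T_curr = x, L_tilde @ x
    out = theta[0] * T_prev + theta[1] * T_curr
    for k in range(2, len(theta)):
        T_prev, T_curr = T_curr, 2.0 * (L_tilde @ T_curr) - T_prev
        out = out + theta[k] * T_curr
    return out

rng = np.random.default_rng(5)
n = 40
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.triu(A, 1); A = A + A.T
x = rng.normal(size=n)
print(chebyshev_filter(A, x, theta=[0.5, 0.3, 0.2])[:5])
```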

A new model for learning in graph domains

TLDR
A new neural model, called graph neural network (GNN), capable of directly processing graphs, which extends recursive neural networks and can be applied to most of the practically useful kinds of graphs, including directed, undirected, labelled and cyclic graphs.

Inductive Representation Learning on Large Graphs

TLDR
GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
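A minimal numpy sketch of that inductive scheme: each node samples a fixed number of neighbours, aggregates their features (a mean aggregator here), concatenates the result with its own features, and applies a learned linear map with ReLU. The sample size and random weights are illustrative assumptions.

```python
# Minimal sketch of a GraphSAGE-style layer with neighbour sampling and mean aggregation.
import numpy as np

def sage_layer(A, H, W, num_samples=5, rng=None):
    """h_v' = ReLU([h_v ; mean of sampled neighbour features] @ W)."""
    if rng is None:
        rng = np.random.default_rng(6)
    n, d = H.shape
    out = np.zeros((n, W.shape[1]))
    for v in range(n):
        nbrs = np.flatnonzero(A[v])
        if len(nbrs) > 0:
            sampled = rng.choice(nbrs, size=min(num_samples, len(nbrs)), replace=False)
            agg = H[sampled].mean(axis=0)
        else:
            agg = np.zeros(d)              # isolated node: zero neighbourhood vector
        out[v] = np.maximum(np.concatenate([H[v], agg]) @ W, 0.0)
    return out

rng = np.random.default_rng(6)
n, d, h = 30, 8, 16
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T
H = rng.normal(size=(n, d))
W = 0.1 * rng.normal(size=(2 * d, h))
print(sage_layer(A, H, W, rng=rng).shape)   # (n, h) embeddings, applicable to unseen nodes
```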