Corpus ID: 141460877

MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing

@inproceedings{AbuElHaija2019MixHopHG,
  title={MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing},
  author={Sami Abu-El-Haija and Bryan Perozzi and Amol Kapoor and Hrayr Harutyunyan and Nazanin Alipourfard and Kristina Lerman and Greg Ver Steeg and A. G. Galstyan},
  booktitle={ICML},
  year={2019}
}
Existing popular methods for semi-supervised learning with Graph Neural Networks (such as the Graph Convolutional Network) provably cannot learn a general class of neighborhood mixing relationships. To address this weakness, we propose a new model, MixHop, that can learn these relationships, including difference operators, by repeatedly mixing feature representations of neighbors at various distances. MixHop requires no additional memory or computational complexity, and outperforms on…
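The neighborhood mixing described in the abstract can be illustrated with a minimal NumPy sketch of a single MixHop-style layer. This is not the authors' implementation; the function names, toy graph, and layer sizes are illustrative. For each adjacency power j, features are propagated j hops with a separate weight matrix, and the per-power outputs are concatenated:

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def mixhop_layer(A_norm, H, weights, powers=(0, 1, 2)):
    """One MixHop-style layer: mix neighborhoods at several hop distances.

    For each power j, propagate features with A_norm^j, apply that power's
    own weight matrix and a ReLU, then concatenate the results column-wise.
    """
    outputs = []
    for j, W in zip(powers, weights):
        P = np.linalg.matrix_power(A_norm, j)   # j-hop propagation (j=0 is identity)
        outputs.append(np.maximum(P @ H @ W, 0.0))
    return np.concatenate(outputs, axis=1)

# Toy example: 4-node path graph, 3 input features, 2 output units per power
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.randn(4, 3)
weights = [np.random.randn(3, 2) for _ in range(3)]
out = mixhop_layer(normalize_adj(A), H, weights)
print(out.shape)  # (4, 6): 2 units for each of the powers 0, 1, 2
```

Because the per-power representations are kept separate until the final concatenation, a downstream layer can learn contrasts between hop distances (e.g. difference operators), which a single-power GCN layer cannot express.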
Stacked Graph Filter
TLDR
The treatment here relaxes the low-frequency (or, equivalently, high-homophily) assumptions in existing vertex classification models, resulting in a more broadly applicable solution in terms of spectral properties and achieving strong results on most benchmark datasets across the frequency spectrum.
Simple Spectral Graph Convolution
TLDR
This paper uses a modified Markov Diffusion Kernel to derive a variant of GCN called Simple Spectral Graph Convolution (S2GC), and spectral analysis shows that the simple spectral graph convolution used in S2GC is a trade-off of low- and high-pass filter bands which capture the global and local contexts of each node.
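The S2GC propagation rule can be sketched as an average over the first K diffusion steps with a residual term on the input features. This is a hedged sketch based on the summary above; the constants, defaults, and toy graph are illustrative, not taken from the paper's code:

```python
import numpy as np

def s2gc_features(A, X, K=4, alpha=0.05):
    """Sketch of S2GC-style propagation: average the first K powers of the
    normalized adjacency applied to X, mixing in a residual term weighted
    by alpha at every step (alpha and K values here are illustrative)."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    T = D_inv_sqrt @ A_hat @ D_inv_sqrt   # normalized adjacency with self-loops
    out = np.zeros_like(X)
    P = X.copy()
    for _ in range(K):
        P = T @ P                          # one more diffusion step
        out += (1 - alpha) * P + alpha * X # diffused signal plus residual
    return out / K

# Toy usage: 4-node path graph, 3 features per node
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 3)
out = s2gc_features(A, X)
```

Averaging over diffusion depths (rather than stacking layers) is what gives the low-pass/high-pass trade-off noted above: shallow steps preserve local detail while deeper steps smooth toward global structure.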
Higher-Order Graph Convolutional Networks With Multi-Scale Neighborhood Pooling for Semi-Supervised Node Classification
Existing popular methods for semi-supervised node classification with high-order convolution improve the learning ability of graph convolutional networks (GCNs) by capturing the feature information…
Semi-supervised learning with mixed-order graph convolutional networks
  • Jie Wang, Jianqing Liang, Junbiao Cui, Jiye Liang
  • Computer Science
  • Inf. Sci.
  • 2021
TLDR
A novel end-to-end ensemble framework named mixed-order graph convolutional networks (MOGCN), which employs a novel ensemble module in which the pseudo-labels of unlabeled nodes from various GCN learners are used to augment the diversity among the learners.
MulStepNET: stronger multi-step graph convolutional networks via multi-power adjacency matrix combination
TLDR
This work proposes MulStepNET, a stronger multi-step graph convolutional network architecture that can capture more global information by simultaneously combining multi-step neighborhood information, and achieves better performance in terms of accuracy and stability compared to other baselines.
N-GCN: Multi-scale Graph Convolution for Semi-supervised Node Classification
TLDR
The proposed N-GCN model improves state-of-the-art baselines on all of the challenging node classification tasks the authors consider: Cora, Citeseer, Pubmed, and PPI, and has other desirable properties, including generalization to recently proposed semi-supervised learning methods such as GraphSAGE, and resilience to adversarial input perturbations.
Robust Graph Representation Learning via Neural Sparsification
TLDR
This paper presents NeuralSparse, a supervised graph sparsification technique that improves generalization power by learning to remove potentially task-irrelevant edges from input graphs and takes both structural and non-structural information as input.
Data Augmentation for Graph Convolutional Network on Semi-Supervised Classification
TLDR
This paper proposes an attentional integrating model that computes a weighted sum of the hidden node embeddings encoded by these GCNs into the final node embeddings, and conducts cosine-similarity-based cross operations on the original features to create new graph features, including new node attributes and new graph topologies.
Revisiting Graph Neural Networks: All We Have is Low-Pass Filters
TLDR
The results indicate that graph neural networks only perform low-pass filtering on feature vectors and do not have the non-linear manifold learning property, and some insights on GCN-based graph neural network design are proposed.
Graphs, Entities, and Step Mixture
TLDR
A new graph neural network that considers both edge-based neighborhood relationships and node-based entity features, i.e. Graph Entities with Step Mixture via random walk (GESM), which achieves state-of-the-art or comparable performance on eight benchmark graph datasets comprising transductive and inductive learning tasks.

References

Showing 1–10 of 29 references
Hierarchical Graph Representation Learning with Differentiable Pooling
TLDR
DiffPool is proposed, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion.
N-GCN: Multi-scale Graph Convolution for Semi-supervised Node Classification
TLDR
The proposed N-GCN model improves state-of-the-art baselines on all of the challenging node classification tasks the authors consider: Cora, Citeseer, Pubmed, and PPI, and has other desirable properties, including generalization to recently proposed semi-supervised learning methods such as GraphSAGE, and resilience to adversarial input perturbations.
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
TLDR
This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
Higher-order Graph Convolutional Networks
TLDR
This work proposes a motif-based graph attention model, called Motif Convolutional Networks (MCNs), which generalizes past approaches by using weighted multi-hop motif adjacency matrices to capture higher-order neighborhoods.
Spectral Networks and Locally Connected Networks on Graphs
TLDR
This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior…
Semi-Supervised Classification with Graph Convolutional Networks
TLDR
A scalable approach for semi-supervised learning on graph-structured data that is based on an efficient variant of convolutional neural networks which operate directly on graphs and outperforms related methods by a significant margin.
MorphNet: Fast & Simple Resource-Constrained Structure Learning of Deep Networks
TLDR
MorphNet iteratively shrinks and expands a network, shrinking via a resource-weighted sparsifying regularizer on activations and expanding via a uniform multiplicative factor on all layers; it is scalable to large networks, adaptable to specific resource constraints, and capable of increasing the network's performance.
Diffusion-Convolutional Neural Networks
TLDR
Through the introduction of a diffusion-convolution operation, it is shown how diffusion-based representations can be learned from graph-structured data and used as an effective basis for node classification.
Watch Your Step: Learning Node Embeddings via Graph Attention
TLDR
This paper proposes a novel attention model on the power series of the transition matrix, which guides the random walk to optimize an upstream objective and improves state-of-the-art results on a comprehensive suite of real-world graph datasets including social, collaboration, and biological networks.
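The "attention on the power series of the transition matrix" idea can be sketched in a few lines of NumPy. This is a hedged illustration of the mechanism, not the paper's implementation: in the paper the attention logits are trained against an embedding objective, whereas here they are a fixed vector:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_walk_context(A, q_logits):
    """Attention over walk lengths: a learnable (here fixed) distribution q
    over powers of the random-walk transition matrix T yields an expected
    context distribution C = sum_k q_k T^{k+1} for each node."""
    T = A / A.sum(axis=1, keepdims=True)      # row-stochastic transition matrix
    q = softmax(q_logits)                     # attention weights over walk lengths
    C = sum(qk * np.linalg.matrix_power(T, k + 1)
            for k, qk in enumerate(q))
    return C

# Toy usage: 3-node path graph, attention over walk lengths 1..3
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
C = attention_walk_context(A, np.array([0.0, 1.0, 2.0]))
```

Since C is a convex combination of row-stochastic matrices, each row of C is itself a probability distribution over context nodes, and the learned q controls how much weight short versus long walks receive.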