GraphHD: Efficient graph classification using hyperdimensional computing

@article{Nunes2022GraphHDEG,
  title={GraphHD: Efficient graph classification using hyperdimensional computing},
  author={Igor O. Nunes and Mike Heddes and Tony Givargis and Alexandru Nicolau and Alexander V. Veidenbaum},
  journal={2022 Design, Automation \& Test in Europe Conference \& Exhibition (DATE)},
  year={2022},
  pages={1485-1490}
}
Hyperdimensional Computing (HDC), developed by Kanerva, is a computational model for machine learning inspired by neuroscience. HDC exploits characteristics of biological neural systems such as high dimensionality, randomness, and a holographic representation of information to achieve a good balance between accuracy, efficiency, and robustness. HDC models have already been proven to be useful in different learning applications, especially in resource-limited settings such as the increasingly…
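
As a concrete illustration of these ideas, here is a minimal NumPy sketch of generic HDC primitives (random bipolar hypervectors, binding, bundling, cosine similarity) applied to a toy key-value record. This is not GraphHD's graph encoding; the dimension and the record are arbitrary choices made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # high dimensionality makes random hypervectors quasi-orthogonal

def random_hv():
    # random bipolar hypervector with i.i.d. entries from {-1, +1}
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    # binding (elementwise product): associates two hypervectors;
    # the result is dissimilar to both inputs, and bind is its own inverse
    return a * b

def bundle(*hvs):
    # bundling (elementwise sign of the sum): superimposes inputs;
    # the result stays similar to each input (holographic representation)
    return np.sign(np.sum(hvs, axis=0))

def cos_sim(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# encode a toy record {color: red, shape: round} and query it back
color, shape = random_hv(), random_hv()  # role hypervectors
red, round_ = random_hv(), random_hv()   # filler hypervectors
record = bundle(bind(color, red), bind(shape, round_))

# unbinding the role recovers a noisy copy of the bound filler
print(cos_sim(bind(record, color), red))     # high, ~0.7
print(cos_sim(bind(record, color), round_))  # near 0
```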

Citations

An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing
TLDR
This work proposes an improvement for level-hypervectors, used to encode real numbers, and introduces a method to learn from circular data, an important type of information never before addressed in machine learning with HDC.
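
For context, here is a minimal sketch of plain (linear) level-hypervectors, the structure the cited paper improves on: quantization levels share progressively flipped components, so similarity decays with the distance between levels. The circular variant proposed in the paper would additionally make this similarity profile wrap around; all parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000

def level_hvs(num_levels):
    # linear level-hypervectors: start from one random endpoint and flip a
    # fixed block of fresh components per step, so similarity decays
    # with level distance and the two endpoints end up near-orthogonal
    flips = D // (2 * (num_levels - 1))
    order = rng.permutation(D)
    hvs = [rng.choice([-1, 1], size=D)]
    for i in range(1, num_levels):
        hv = hvs[-1].copy()
        hv[order[(i - 1) * flips : i * flips]] *= -1
        hvs.append(hv)
    return np.stack(hvs)

levels = level_hvs(11)  # e.g. 11 levels quantizing a real interval
sim = lambda a, b: (a @ b) / D
print(sim(levels[0], levels[1]), sim(levels[0], levels[10]))  # ~0.9, ~0.0
```
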
Torchhd: An Open-Source Python Library to Support Hyperdimensional Computing Research
Hyperdimensional Computing (HDC) is a neuro-inspired computing framework that exploits high-dimensional random vector spaces. HDC uses extremely parallelizable arithmetic to provide computational…
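
A small usage sketch in the library's style, assuming the top-level helpers random, bind, multiset, and cosine_similarity found in recent Torchhd releases (names may differ across versions):

```python
import torchhd

# three random 10,000-dimensional hypervectors (default MAP/bipolar model)
keys = torchhd.random(3, 10_000)
values = torchhd.random(3, 10_000)

# bind each key to its value, then bundle the pairs into one record
record = torchhd.multiset(torchhd.bind(keys, values))

# unbind a key to retrieve (a noisy copy of) its value from the record
query = torchhd.bind(record, keys[0])
print(torchhd.cosine_similarity(query, values))  # highest for values[0]
```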

References

SHOWING 1-10 OF 44 REFERENCES
A Comprehensive Survey on Graph Neural Networks
TLDR
This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides state-of-the-art GNNs into four categories: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.
Representation Learning on Graphs with Jumping Knowledge Networks
TLDR
This work explores an architecture -- jumping knowledge (JK) networks -- that flexibly leverages, for each node, different neighborhood ranges to enable better structure-aware representation in graphs.
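
A rough sketch of the jumping-knowledge idea under simplifying assumptions (shared layer width, tanh nonlinearity, a precomputed normalized adjacency A_hat); the paper also studies an LSTM-attention aggregator not shown here.

```python
import numpy as np

def layer_representations(A_hat, X, weights):
    # run a simple GNN: each layer aggregates over one more hop
    reprs, H = [], X
    for W in weights:
        H = np.tanh(A_hat @ H @ W)  # A_hat: normalized adjacency w/ self-loops
        reprs.append(H)
    return reprs

def jumping_knowledge(layer_reprs, mode="max"):
    # JK aggregation: let every node combine all neighborhood ranges
    if mode == "max":     # elementwise max over layers (adaptive per feature)
        return np.maximum.reduce(layer_reprs)
    if mode == "concat":  # keep all layer representations side by side
        return np.concatenate(layer_reprs, axis=1)
    raise ValueError(f"unknown mode: {mode}")
```
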
A new model for learning in graph domains
TLDR
A new neural model, called the graph neural network (GNN), is proposed that directly processes graphs; it extends recursive neural networks and can be applied to most practically useful kinds of graphs, including directed, undirected, labelled, and cyclic graphs.
TUDataset: A collection of benchmark datasets for learning with graphs
TLDR
The TUDataset for graph classification and regression is introduced, which consists of over 120 datasets of varying sizes from a wide range of applications and provides Python-based data loaders, kernel and graph neural network baseline implementations, and evaluation tools.
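
For illustration, one common way to load a TUDataset benchmark is through PyTorch Geometric's built-in loader (the collection also ships its own Python data loaders):

```python
from torch_geometric.datasets import TUDataset
from torch_geometric.loader import DataLoader

# MUTAG is one of the classic TUDataset graph-classification benchmarks
dataset = TUDataset(root="data/TUDataset", name="MUTAG")
print(dataset.num_classes, dataset.num_node_features)

loader = DataLoader(dataset, batch_size=32, shuffle=True)
for batch in loader:
    # batch.x: node features, batch.edge_index: COO connectivity,
    # batch.batch: graph id of each node, batch.y: graph labels
    print(batch.num_graphs, batch.x.shape, batch.y.shape)
    break
```
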
Weisfeiler-Lehman Graph Kernels
TLDR
A family of efficient kernels for large graphs with discrete node labels based on the Weisfeiler-Lehman test of isomorphism on graphs that outperform state-of-the-art graph kernels on several graph classification benchmark data sets in terms of accuracy and runtime.
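
An illustrative sketch of the WL refinement and the resulting subtree kernel: each node's label is repeatedly combined with the sorted multiset of its neighbors' labels, and two graphs are compared via the dot product of their label-count histograms. Real implementations use an explicit label-compression dictionary rather than Python's hash, which can collide.

```python
from collections import Counter

def wl_features(adj, labels, iterations=3):
    # adj: node -> list of neighbors; labels: node -> initial discrete label.
    # Each iteration relabels a node from its own label plus the sorted
    # multiset of its neighbors' labels (one WL refinement step).
    feats = Counter(labels.values())
    for _ in range(iterations):
        labels = {
            v: hash((labels[v], tuple(sorted(labels[u] for u in adj[v]))))
            for v in adj
        }
        feats.update(labels.values())
    return feats

def wl_subtree_kernel(g1, g2, iterations=3):
    # kernel value: dot product of the two label-count histograms
    f1 = wl_features(*g1, iterations)
    f2 = wl_features(*g2, iterations)
    return sum(f1[k] * f2[k] for k in f1.keys() & f2.keys())

# usage: graphs given as (adjacency, labels) pairs
triangle = ({0: [1, 2], 1: [0, 2], 2: [0, 1]}, {0: "C", 1: "C", 2: "C"})
path = ({0: [1], 1: [0, 2], 2: [1]}, {0: "C", 1: "C", 2: "C"})
print(wl_subtree_kernel(triangle, triangle), wl_subtree_kernel(triangle, path))
```
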
The Graph Neural Network Model
TLDR
A new neural network model, called the graph neural network (GNN) model, that extends existing neural network methods for processing data represented in graph domains and implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space.
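
Schematically, the model iterates a state equation to a fixed point and reads τ(G, n) off the converged states; in the paper, f_w and g_w are learned networks constrained to make the update a contraction. A minimal sketch with those networks left abstract:

```python
import numpy as np

def gnn_states(adj, node_labels, f_w, d=16, iters=100, tol=1e-6):
    # iterate the state equation x_v = f_w(l_v, [(x_u, l_u) for neighbors u])
    # to an (approximate) fixed point; convergence requires f_w to be a
    # contraction map, which the GNN model enforces during learning
    x = {v: np.zeros(d) for v in adj}
    for _ in range(iters):
        x_new = {
            v: f_w(node_labels[v], [(x[u], node_labels[u]) for u in adj[v]])
            for v in adj
        }
        if max(np.linalg.norm(x_new[v] - x[v]) for v in adj) < tol:
            return x_new
        x = x_new
    # the output tau(G, n) = g_w(x[n], l_n) is computed from these states
    return x
```
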
Performance Analysis of Hyperdimensional Computing for Character Recognition
TLDR
The effect of dimensionality (and therefore energy) on the performance of HDC was explored through a character recognition application; results show that a 4,000-bit one-shot learning system yields an average accuracy close to that of its 12,000-bit counterpart at 0% distortion.
A survey on graph kernels
TLDR
This survey gives a comprehensive overview of techniques for kernel-based graph classification developed in the past 15 years and describes and categorizes graph kernels based on properties inherent to their design, such as the nature of their extracted graph features, their method of computation and their applicability to problems in practice.
A Binary Learning Framework for Hyperdimensional Computing
TLDR
BinHD encodes data points to binary hypervectors and provides a framework that enables HDC to perform training with a significantly lower resource and memory footprint; in inference, BinHD binarizes the model and simplifies the costly cosine similarity used in existing HDC algorithms to a hardware-friendly Hamming distance metric.
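
A toy illustration of the binary pipeline described here (not BinHD's actual encoder or retraining scheme): accumulate class hypervectors, binarize by majority rule, and classify with a cheap XOR/popcount Hamming score.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000

def binarize(acc):
    # majority rule: collapse an accumulated (integer) hypervector to bits
    return (acc > 0).astype(np.uint8)

def hamming_score(a, b):
    # count matching bits; XOR + popcount is far cheaper in hardware
    # than the cosine similarity used by non-binary HDC models
    return D - np.count_nonzero(a ^ b)

# toy training: accumulate bipolar sample encodings per class, then binarize
samples = {c: rng.choice([-1, 1], size=(20, D)) for c in range(3)}
model = {c: binarize(s.sum(axis=0)) for c, s in samples.items()}

query = binarize(samples[1][0])  # re-classify one training sample
print(max(model, key=lambda c: hamming_score(query, model[c])))  # -> 1
```
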
Fast Random Walk Graph Kernel
TLDR
Ark is a set of fast algorithms for random walk graph kernel computation, based on the observation that real graphs have much lower intrinsic ranks than their orders; Ark exploits this low-rank structure to compute random walk graph kernels in O(n) or O(m) time.
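
A sketch of the low-rank idea (not Ark's exact algorithm): approximate each adjacency matrix by its top-r eigenpairs, after which the Woodbury identity shrinks the Kronecker-sized inverse in the geometric random walk kernel to an r²-sized diagonal one. The decay c must be small enough for the underlying series to converge.

```python
import numpy as np

def low_rank_rw_kernel(A1, A2, c=0.05, r=6):
    # geometric random walk kernel k = q^T (I - c * A1 (x) A2)^(-1) p with
    # uniform start/stop distributions q = p, computed from rank-r
    # eigendecompositions: Woodbury turns the (n1*n2)-sized inverse into an
    # r^2-sized diagonal one; requires c < 1 / (max|eig A1| * max|eig A2|)
    w1, U1 = np.linalg.eigh(A1)  # adjacency matrices assumed symmetric
    w2, U2 = np.linalg.eigh(A2)
    keep1 = np.argsort(-np.abs(w1))[:r]
    keep2 = np.argsort(-np.abs(w2))[:r]
    w1, U1 = w1[keep1], U1[:, keep1]
    w2, U2 = w2[keep2], U2[:, keep2]

    p1 = np.full(len(A1), 1.0 / len(A1))
    p2 = np.full(len(A2), 1.0 / len(A2))
    lam = np.kron(w1, w2)          # top eigenvalues of A1 (x) A2
    s = np.kron(p1 @ U1, p2 @ U2)  # p projected into the r^2 eigenspace
    # Woodbury: (I - c U L U^T)^(-1) = I + c U (L^(-1) - c I)^(-1) U^T
    return (p1 @ p1) * (p2 @ p2) + c * (s / (1.0 / lam - c)) @ s
```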