An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing

@article{Nunes2022AnET,
  title={An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing},
  author={Igor O. Nunes and Mike Heddes and Tony Givargis and Alexandru Nicolau},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.07920}
}
Hyperdimensional Computing (HDC) is a computation framework based on properties of high-dimensional random spaces. It is particularly useful for machine learning in resource-constrained environments, such as embedded systems and IoT, as it achieves a good balance between accuracy, efficiency and robustness. The mapping of information to the hyperspace, named encoding, is the most important stage in HDC. At its heart are basis-hypervectors, responsible for representing the smallest units of…
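As context for the abstract, the sketch below illustrates the basis-hypervector sets it refers to: random hypervectors for categorical symbols, level-hypervectors for linearly ordered values, and a circular variant in the spirit of the paper's title, in which similarity depends only on the circular distance between levels. This is a minimal NumPy illustration under assumed parameters (bipolar elements, D = 10,000), not the authors' exact construction.

```python
import numpy as np

rng = np.random.default_rng(42)
D = 10_000  # dimensionality; a typical HDC choice, assumed here

def random_basis(n):
    """n i.i.d. bipolar hypervectors: pairwise quasi-orthogonal,
    suited to unrelated categorical symbols."""
    return rng.choice([-1, 1], size=(n, D))

def level_basis(n):
    """n hypervectors for linearly ordered values: level i flips the
    first i*D/(n-1) coordinates (in a fixed random order) of a base
    vector, so similarity decays linearly with level distance."""
    base = rng.choice([-1, 1], size=D)
    order = rng.permutation(D)
    levels = np.tile(base, (n, 1))
    for i in range(n):
        levels[i, order[: i * D // (n - 1)]] *= -1
    return levels

def circular_basis(n):
    """Circular variant (n even): flip fresh blocks of coordinates for
    the first n/2 steps, then re-flip the same blocks in order on the
    way back; the walk closes into a circle, so similarity depends only
    on circular distance (months, angles, hours of the day, ...)."""
    base = rng.choice([-1, 1], size=D)
    order = rng.permutation(D)
    block = 2 * D // n  # coordinates flipped per step
    hv = np.empty((n, D), dtype=base.dtype)
    v = base
    for i in range(n):
        hv[i] = v
        j = i if i < n // 2 else i - n // 2  # reuse blocks on the way back
        v = v.copy()
        v[order[j * block : (j + 1) * block]] *= -1
    return hv

months = circular_basis(12)
print(months @ months[0] / D)  # ~1.0 at index 0, ~-1.0 at index 6, symmetric on both sides
```

With a circular basis, a value such as a calendar month can be encoded so that December and February are equally similar to January, a relation that level-hypervectors, whose two endpoints are maximally dissimilar, cannot express.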
1 Citation

Torchhd: An Open-Source Python Library to Support Hyperdimensional Computing Research
Hyperdimensional Computing (HDC) is a neuro-inspired computing framework that exploits high-dimensional random vector spaces. HDC uses extremely parallelizable arithmetic to provide computational…
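The circular basis sketched above is also exposed by the Torchhd library described in this citing paper. A minimal usage sketch follows, assuming a recent Torchhd release; function names and signatures may differ between versions, so treat it as an approximation of the documented API rather than a definitive example.

```python
import torchhd

d = 10_000
months = torchhd.circular(12, d)  # 12 circular basis-hypervectors
sims = torchhd.cosine_similarity(months[0], months)
print(sims)  # should peak at index 0 and be lowest near index 6
```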

References

Showing 1-10 of 41 references
Classification Using Hyperdimensional Computing: A Review
TLDR
Evaluations indicate that HD computing shows great potential for addressing problems with data in the form of letters, signals and images, and significant promise as a lightweight classifier to replace machine learning algorithms in the Internet of Things (IoT).
Theoretical Foundations of Hyperdimensional Computing
TLDR
This work presents a unified treatment of the theoretical foundations of HD computing, with a focus on the suitability of representations for learning; it provides useful guidance for practitioners and lays out important open questions warranting further study.
GraphHD: Efficient graph classification using hyperdimensional computing
TLDR
The results show that, when compared to state-of-the-art Graph Neural Networks (GNNs), the proposed model achieves comparable accuracy, while training and inference times are on average 14.6× and 2.0× faster, respectively.
Performance Analysis of Hyperdimensional Computing for Character Recognition
TLDR
The effect of dimensionality (and therefore energy) on the performance of HDC is explored through a character recognition application; results show that a 4,000-bit one-shot learning system yields an average accuracy close to that of its 12,000-bit counterpart at 0% distortion.
Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors
  • P. Kanerva
  • Computer Science
    Cognitive Computation
  • 2009
TLDR
The thesis of the paper is that hyperdimensional representation has much to offer to students of cognitive science, theoretical neuroscience, computer science and engineering, and mathematics.
A Robust and Energy-Efficient Classifier Using Brain-Inspired Hyperdimensional Computing
TLDR
A hardware architecture for a hypervector-based classifier is described that tolerates an 8.8-fold increase in the probability of memory-cell failure while maintaining 94% accuracy, and that can significantly improve energy efficiency.
A random walk in Hamming space
  • D. Smith, P. Stanford
  • Mathematics, Computer Science
    1990 IJCNN International Joint Conference on Neural Networks
  • 1990
TLDR
The scatter code is described: a new technique for mapping sensor data into a form suitable for processing by associative neural networks, offering a practically limitless number of points and control over the radius of association in the mapped sensor data.
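The scatter code rests on the same intuition as level-hypervectors. Below is a toy sketch of the idea as read from the summary above, not the authors' exact algorithm: ordered sensor values are encoded as a random walk in Hamming space, so nearby values share many bits while distant values look unrelated.

```python
import numpy as np

rng = np.random.default_rng(7)
D, FLIPS_PER_STEP = 1024, 16  # FLIPS_PER_STEP controls the association radius

def scatter_code(n_values):
    """Each unit step flips a few randomly chosen bits, so Hamming
    distance grows with value difference until it saturates near D/2."""
    codes = np.empty((n_values, D), dtype=np.uint8)
    v = rng.integers(0, 2, size=D, dtype=np.uint8)
    for i in range(n_values):
        codes[i] = v
        v = v.copy()
        v[rng.choice(D, size=FLIPS_PER_STEP, replace=False)] ^= 1
    return codes

codes = scatter_code(100)
print(np.count_nonzero(codes[0] ^ codes[1]),   # nearby values: small distance
      np.count_nonzero(codes[0] ^ codes[99]))  # distant values: approaches D/2
```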
Integrating event-based dynamic vision sensors with sparse hyperdimensional computing: a low-power accelerator with online learning capability
TLDR
This work embeds features extracted from event-driven dynamic vision sensors into binary sparse representations in hyperdimensional (HD) space for regression; using the estimates and confidences of an initial model trained on only 25% of the data, it closely matches the accuracy obtained with an oracle model trained on ground-truth labels.
Recent advances in directional statistics
TLDR
This paper reviews the many recent developments in the field since the publication of Mardia and Jupp (1999), still the most comprehensive text on directional statistics, and considers advances in the exploratory analysis of directional data.
Sparse Distributed Memory
TLDR
Pentti Kanerva's Sparse Distributed Memory presents a mathematically elegant theory of human long-term memory whose structure resembles the cortex of the cerebellum, and provides an overall perspective on neural systems.
...