Symbolic Representation and Learning With Hyperdimensional Computing

Anton Mitrokhin, Peter Sutor, Douglas Summers-Stay, Cornelia Fermüller, and Yiannis Aloimonos · Frontiers in Robotics and AI
It has been proposed that machine learning techniques can benefit from symbolic representations and reasoning systems. We describe a method in which the two can be combined in a natural and direct way through the use of hyperdimensional vectors and hyperdimensional computing. By using hashing neural networks to produce binary vector representations of images, we show how hyperdimensional vectors can be constructed such that vector-symbolic inference arises naturally out of their output. We design the…
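The vector-symbolic operations the abstract relies on can be sketched with binary hypervectors: XOR binds a role to a filler, bitwise majority bundles bound pairs into a single record, and Hamming distance recovers fillers by unbinding. A minimal sketch follows; the role and filler names (color, red, etc.) are illustrative assumptions, not from the paper:

```python
import random

D = 10_000  # hyperdimensional: dimensionality in the thousands

def rand_hv(rng):
    """Random dense binary hypervector: each bit is an i.i.d. fair coin flip."""
    return [rng.randint(0, 1) for _ in range(D)]

def bind(a, b):
    """Bind with elementwise XOR; binding is its own inverse: bind(bind(a, b), b) == a."""
    return [x ^ y for x, y in zip(a, b)]

def bundle(vectors):
    """Bundle by bitwise majority vote (ties go to 0); the result stays similar to each input."""
    k = len(vectors)
    return [1 if sum(bits) * 2 > k else 0 for bits in zip(*vectors)]

def hamming(a, b):
    """Normalized Hamming distance: about 0.5 for unrelated hypervectors."""
    return sum(x != y for x, y in zip(a, b)) / D

rng = random.Random(0)
color, shape = rand_hv(rng), rand_hv(rng)
red, square = rand_hv(rng), rand_hv(rng)

# A record {color: red, shape: square} stored as one hypervector.
record = bundle([bind(color, red), bind(shape, square)])

# Unbinding the 'color' role recovers something close to 'red' ...
probe = bind(record, color)
print(hamming(probe, red))     # well below 0.5
print(hamming(probe, square))  # around 0.5 (unrelated)
```

In a full system, the noisy probe would be cleaned up by nearest-neighbor search over an item memory of known hypervectors.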
4 Citations
A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations
This two-part comprehensive survey is devoted to a computing framework most commonly known under the names Hyperdimensional Computing and Vector Symbolic Architectures (HDC/VSA). Both names refer to the same framework.
Spiking Hyperdimensional Network: Neuromorphic Models Integrated with Memory-Inspired Framework
SpikeHD is proposed, the first framework to fundamentally combine spiking neural networks with hyperdimensional computing, yielding a scalable and robust cognitive learning system that better mimics brain functionality.
Robust high-dimensional memory-augmented neural networks
This work proposes a robust architecture that employs a computational memory unit as the explicit memory, performing analog in-memory computation on high-dimensional (HD) vectors while closely matching 32-bit software-equivalent accuracy.
Scalable edge-based hyperdimensional learning system with brain-like neural adaptation
Inspired by studies of human neural regeneration in neuroscience, NeuralHD identifies insignificant dimensions and regenerates them to enhance learning capability and robustness, and presents a scalable learning framework that distributes NeuralHD computation over edge devices in IoT systems.

References
Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors
  • P. Kanerva
  • Computer Science
  • Cognitive Computation
  • 2009
The thesis of the paper is that hyperdimensional representation has much to offer to students of cognitive science, theoretical neuroscience, computer science and engineering, and mathematics.
GloVe: Global Vectors for Word Representation
A new global log-bilinear regression model that combines the advantages of the two major model families in the literature, global matrix factorization and local context-window methods, and produces a vector space with meaningful substructure.
High-Dimensional Computing as a Nanoscalable Paradigm
We outline a model of computing with high-dimensional (HD) vectors—where the dimensionality is in the thousands. It is built on ideas from traditional (symbolic) computing and artificial neural networks.
Representing Sets as Summed Semantic Vectors
It is shown how a technique built to aid sparse vector decomposition allows, in many cases, the exact recovery of the inputs and weights of such a sum, allowing a single vector to represent an entire set of vectors from a dictionary.
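The summed-vector representation of a set can be sketched as below. Note this toy version recovers set members by simple dot-product thresholding against a known dictionary, which is an assumption standing in for the paper's sparse-decomposition technique:

```python
import random

D = 2_000
rng = random.Random(1)

def rand_bipolar():
    """Random bipolar hypervector (+1/-1 entries); distinct draws are nearly orthogonal."""
    return [rng.choice((-1, 1)) for _ in range(D)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# A small dictionary of symbols (names are illustrative).
dictionary = {name: rand_bipolar() for name in "abcdefgh"}

# Represent the set {a, c, f} as the elementwise sum of its members' vectors.
members = {"a", "c", "f"}
set_vec = [sum(dictionary[m][i] for m in members) for i in range(D)]

# Recover membership: members score near D, non-members near 0,
# so thresholding at D/2 separates them with high probability.
recovered = {name for name, v in dictionary.items() if dot(set_vec, v) > D / 2}
```

Because random bipolar vectors are nearly orthogonal, cross terms in each dot product stay small relative to D, which is what makes a single summed vector decodable at all.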
Classification and Recall With Binary Hyperdimensional Computing: Tradeoffs in Choice of Density and Mapping Characteristics
Tradeoffs in selecting the parameters of binary HD representations applied to pattern recognition tasks are discussed, along with the capacity of representations of various densities.
Symbolic Computation Using Cellular Automata-Based Hyperdimensional Computing
This letter introduces a novel reservoir computing framework capable of both connectionist machine intelligence and symbolic computation, and suggests that binary reservoir feature vectors can be combined using Boolean operations, as in hyperdimensional computing.
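The idea of cellular-automaton reservoirs producing binary feature vectors that compose under Boolean operations can be sketched with elementary Rule 90; the choice of Rule 90 and the seed vectors here are illustrative assumptions, not details from the letter:

```python
def rule90_step(state):
    """One step of elementary CA Rule 90: each cell becomes the XOR
    of its two neighbors (periodic boundary)."""
    n = len(state)
    return [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]

def ca_expand(seed, steps):
    """Evolve a binary seed and concatenate the trajectory into one
    long binary reservoir feature vector."""
    state, history = list(seed), []
    for _ in range(steps):
        state = rule90_step(state)
        history.extend(state)
    return history

# Two symbols as short binary seeds, expanded into reservoir feature vectors ...
a = ca_expand([1, 0, 1, 1, 0, 0, 1, 0], steps=16)
b = ca_expand([0, 1, 1, 0, 1, 0, 0, 1], steps=16)

# ... then combined with a Boolean operation (XOR binding, as in HDC);
# XOR-ing with b again recovers a exactly.
bound = [x ^ y for x, y in zip(a, b)]
```

The CA evolution plays the role of a cheap, deterministic expansion from a short seed into a long pseudo-random vector, after which the usual HDC Boolean algebra applies unchanged.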
Efficient Estimation of Word Representations in Vector Space
Two novel model architectures for computing continuous vector representations of words from very large data sets are proposed, and it is shown that these vectors provide state-of-the-art performance on the authors' test set for measuring syntactic and semantic word similarities.
Hierarchical Hyperdimensional Computing for Energy Efficient Classification
This paper proposes MHD, a multi-encoder hierarchical classifier that enables HD computing to take full advantage of multiple encoders without increasing the cost of classification, and tests the accuracy and efficiency of MHD on a speech recognition application.
Learning Multiple Layers of Features from Tiny Images
It is shown how to train a multi-layer generative model that learns to extract meaningful features which resemble those found in the human visual cortex, using a novel parallelization algorithm to distribute the work among multiple machines connected on a network.
Metaconcepts: Isolating Context in Word Embeddings
It is demonstrated that a technique for directly computing new vectors that represent multiple words, naturally combining them into a new, more consistent space where distance better correlates with similarity, works well for natural language.