# Symbolic Representation and Learning With Hyperdimensional Computing

```bibtex
@article{Mitrokhin2020SymbolicRA,
  title   = {Symbolic Representation and Learning With Hyperdimensional Computing},
  author  = {Anton Mitrokhin and Peter Sutor and Douglas Summers-Stay and Cornelia Ferm{\"u}ller and Yiannis Aloimonos},
  journal = {Frontiers in Robotics and AI},
  year    = {2020},
  volume  = {7}
}
```

It has been proposed that machine learning techniques can benefit from symbolic representations and reasoning systems. We describe a method in which the two can be combined in a natural and direct way through hyperdimensional vectors and hyperdimensional computing. Using hashing neural networks to produce binary vector representations of images, we show how hyperdimensional vectors can be constructed such that vector-symbolic inference arises naturally out of the networks' output. We design the…
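The abstract's core machinery, binary hypervectors combined with binding and bundling so that symbolic queries can be answered by vector arithmetic, can be illustrated concretely. The sketch below is a minimal, generic binary-HDC example, not the paper's exact construction: the role/filler names (`color`, `red`, etc.) and the choice of XOR for binding and bitwise majority for bundling are standard HDC conventions assumed here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hyperdimensional: thousands of bits, so random vectors are ~orthogonal

def random_hv():
    """A random binary hypervector; two random ones differ in ~D/2 bits."""
    return rng.integers(0, 2, D, dtype=np.uint8)

def bind(a, b):
    """Binding via XOR: associates two vectors and is its own inverse."""
    return a ^ b

def bundle(*vs):
    """Bundling via bitwise majority vote: the result stays similar to each input."""
    return (np.sum(vs, axis=0) > len(vs) / 2).astype(np.uint8)

def hamming(a, b):
    """Normalized Hamming distance in [0, 1]."""
    return np.count_nonzero(a != b) / D

# Encode the record {color: red, shape: square} as one vector
# (a random pad keeps the bundled count odd so the majority vote has no ties).
color, shape, red, square, pad = (random_hv() for _ in range(5))
record = bundle(bind(color, red), bind(shape, square), pad)

# Symbolic query "what is the color?": unbind the role, then the result is a
# noisy copy of the filler, recoverable by nearest-neighbor search.
noisy = bind(record, color)
assert hamming(noisy, red) < 0.4                  # close to 'red'
assert abs(hamming(noisy, square) - 0.5) < 0.05   # unrelated vectors sit near 0.5
```

In the paper's setting, the atomic hypervectors would come from a hashing network's binary codes for images rather than from a random generator, but the inference step is the same unbind-then-match pattern.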

## 5 Citations

A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations

- Computer Science, ArXiv
- 2021

This two-part comprehensive survey is devoted to a computing framework most commonly known under the names Hyperdimensional Computing and Vector Symbolic Architectures (HDC/VSA). Both names refer to…

A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges

- Computer Science, ArXiv
- 2021

Existing applications, the role of HDC/VSA in cognitive computing and architectures, and directions for future work are surveyed. Most of the applications lie within the machine learning/artificial intelligence domain, but other applications are also covered to provide a thorough picture.

Spiking Hyperdimensional Network: Neuromorphic Models Integrated with Memory-Inspired Framework

- Computer Science, ArXiv
- 2021

SpikeHD is proposed as the first framework that fundamentally combines spiking neural networks and hyperdimensional computing, yielding a scalable and robust cognitive learning system that better mimics brain functionality.

Robust high-dimensional memory-augmented neural networks

- Computer Science, Medicine, Nature Communications
- 2021

This work proposes a robust architecture that employs a computational memory unit as the explicit memory, performing analog in-memory computation on high-dimensional (HD) vectors while closely matching 32-bit software-equivalent accuracy.

Scalable edge-based hyperdimensional learning system with brain-like neural adaptation

- Computer Science, SC
- 2021

Inspired by neuroscience studies of human neural regeneration, NeuralHD identifies insignificant dimensions and regenerates them to enhance learning capability and robustness; it also presents a scalable framework for distributing NeuralHD computation over edge devices in IoT systems.
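One way to read the dimension-regeneration idea is sketched below. Everything here is an assumption made for illustration, not NeuralHD's actual algorithm: the variance-across-classes significance score, the 10% regeneration rate, and the zero-reset (in the real system, the corresponding encoder dimensions would be re-randomized and refilled by retraining).

```python
import numpy as np

rng = np.random.default_rng(2)
D, C = 4096, 10  # hypervector dimensionality, number of classes (illustrative)

# Stand-in for trained class hypervectors (one row per class).
class_hvs = rng.standard_normal((C, D))

# A dimension that barely separates the classes carries little information:
# score each dimension by its variance across the class vectors.
significance = class_hvs.var(axis=0)
k = D // 10                              # regenerate the weakest 10% (assumed rate)
weakest = np.argsort(significance)[:k]

# "Regenerate": reset those entries so retraining can refill them.
class_hvs[:, weakest] = 0.0
```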

## References

Showing 10 of 29 references.

Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors

- Computer Science, Cognitive Computation
- 2009

The thesis of the paper is that hyperdimensional representation has much to offer to students of cognitive science, theoretical neuroscience, computer science and engineering, and mathematics.

GloVe: Global Vectors for Word Representation

- Computer Science, EMNLP
- 2014

A new global log-bilinear regression model that combines the advantages of the two major model families in the literature (global matrix factorization and local context window methods) and produces a vector space with meaningful substructure.

High-Dimensional Computing as a Nanoscalable Paradigm

- Computer Science, IEEE Transactions on Circuits and Systems I: Regular Papers
- 2017

We outline a model of computing with high-dimensional (HD) vectors—where the dimensionality is in the thousands. It is built on ideas from traditional (symbolic) computing and artificial neural…

Representing Sets as Summed Semantic Vectors

- Computer Science, Biologically Inspired Cognitive Architectures
- 2018

It is shown how a technique built to aid sparse vector decomposition allows, in many cases, exact recovery of the inputs and weights of such a sum, allowing a single vector to represent an entire set of vectors from a dictionary.
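The set-as-a-sum idea admits a simple demonstration: in high dimensions, random dictionary vectors are nearly orthogonal, so each member of a set can be recovered from the plain sum by correlating against the dictionary. The sketch below is a minimal illustration of this principle with assumed sizes (`D = 2048`, `N = 50`), not the paper's sparse-decomposition technique.

```python
import numpy as np

rng = np.random.default_rng(1)
D, N = 2048, 50  # vector dimension, dictionary size (illustrative values)

# Dictionary of random unit vectors; random high-dimensional vectors
# have near-zero pairwise dot products.
dictionary = rng.standard_normal((N, D))
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

members = {3, 17, 42}                              # the hidden set
summed = dictionary[sorted(members)].sum(axis=0)   # one vector for the whole set

# Recovery: members correlate with the sum at ~1, non-members at ~0,
# so a mid-range threshold separates them cleanly.
scores = dictionary @ summed
recovered = set(np.flatnonzero(scores > 0.5).tolist())
assert recovered == members
```

This exact-by-thresholding recovery degrades as the set grows relative to the dimension, which is why the cited work needs a more careful decomposition method for larger sets.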

Classification and Recall With Binary Hyperdimensional Computing: Tradeoffs in Choice of Density and Mapping Characteristics

- Computer Science, Medicine, IEEE Transactions on Neural Networks and Learning Systems
- 2018

Tradeoffs in selecting parameters of binary HD representations for pattern recognition tasks are discussed, along with the capacity of representations of various densities.

Symbolic Computation Using Cellular Automata-Based Hyperdimensional Computing

- Computer Science, Medicine, Neural Computation
- 2015

This letter introduces a novel framework of reservoir computing that is capable of both connectionist machine intelligence and symbolic computation, and suggests that binary reservoir feature vectors can be combined using Boolean operations as in hyperdimensional computing.
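The combination of cellular-automaton reservoirs with Boolean HDC operations can be sketched in a few lines. The rule choice (rule 90), grid size, and step count below are illustrative assumptions, not the cited paper's configuration: a 1-D binary CA is evolved from a seed state, its state history is flattened into a long binary feature vector, and two such vectors are bound with XOR exactly as in binary hyperdimensional computing.

```python
import numpy as np

rng = np.random.default_rng(3)
W, T = 256, 16  # CA width and number of evolution steps (illustrative)

def ca_features(seed_row, rule=90, steps=T):
    """Evolve a 1-D binary cellular automaton (periodic boundary) and flatten
    its state history into one long binary feature vector (a simple reservoir)."""
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    rows, row = [seed_row], seed_row
    for _ in range(steps - 1):
        # Neighborhood index 4*left + 2*center + right selects the rule's output bit.
        idx = 4 * np.roll(row, 1) + 2 * row + np.roll(row, -1)
        row = table[idx]
        rows.append(row)
    return np.concatenate(rows)

a = ca_features(rng.integers(0, 2, W, dtype=np.uint8))
b = ca_features(rng.integers(0, 2, W, dtype=np.uint8))
bound = a ^ b                   # Boolean binding, as in binary HDC
assert (bound ^ b == a).all()   # XOR binding is self-inverse
```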

Efficient Estimation of Word Representations in Vector Space

- Computer Science, ICLR
- 2013

Two novel model architectures for computing continuous vector representations of words from very large data sets are proposed, and it is shown that these vectors provide state-of-the-art performance on the authors' test set for measuring syntactic and semantic word similarities.

Hierarchical Hyperdimensional Computing for Energy Efficient Classification

- Computer Science, 2018 55th ACM/ESDA/IEEE Design Automation Conference (DAC)
- 2018

This paper proposes MHD, a multi-encoder hierarchical classifier, which enables HD computing to take full advantage of multiple encoders without increasing the cost of classification, and tests the accuracy and efficiency of MHD on a speech recognition application.

Learning Multiple Layers of Features from Tiny Images

- Computer Science
- 2009

It is shown how to train a multi-layer generative model that learns to extract meaningful features which resemble those found in the human visual cortex, using a novel parallelization algorithm to distribute the work among multiple machines connected on a network.

Metaconcepts: Isolating Context in Word Embeddings

- Computer Science, 2019 IEEE Conference on Multimedia Information Processing and Retrieval (MIPR)
- 2019

A technique is demonstrated for directly computing new vectors that represent multiple words, naturally combining them into a new, more consistent space where distance better correlates with similarity; it is shown to work well for natural language.