Corpus ID: 303894

Hierarchical Reasoning with Distributed Vector Representations

@inproceedings{Kommers2015HierarchicalRW,
  title={Hierarchical Reasoning with Distributed Vector Representations},
  author={Cody Kommers and Volkan Ustun and Abram Demski and Paul S. Rosenbloom},
  booktitle={CogSci},
  year={2015}
}
We demonstrate that distributed vector representations are capable of hierarchical reasoning by summing sets of vectors representing hyponyms (subordinate concepts) to yield a vector that resembles the associated hypernym (superordinate concept). These distributed vector representations constitute a potentially neurally plausible model while achieving a high level of performance on many different cognitive tasks. Experiments were run using DVRS, a word embedding system designed for the… 
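The core claim is easy to illustrate with off-the-shelf embeddings. Below is a minimal sketch in Python/NumPy, not the authors' DVRS system; the `embeddings` dict, its vocabulary, and the word choices are placeholder assumptions:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def hypernym_estimate(embeddings, hyponyms):
    """Sum the hyponym vectors; the paper's claim is that this sum
    resembles the vector of the shared hypernym."""
    total = np.sum([embeddings[w] for w in hyponyms], axis=0)
    return total / np.linalg.norm(total)

# Hypothetical usage with any pre-trained embeddings loaded into a dict:
# est = hypernym_estimate(embeddings, ["dog", "cat", "horse"])
# Expect cosine(est, embeddings["animal"]) > cosine(est, embeddings["chair"])
```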

Citations

Representing Sets as Summed Semantic Vectors
A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges
TLDR: Existing applications, the role of HDC/VSA in cognitive computing and architectures, and directions for future work are surveyed; most applications lie within the machine learning/artificial intelligence domain, but other applications are also covered to provide a thorough picture.
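As background for this entry, the two core HDC/VSA primitives, binding and bundling, can be sketched in a few lines. This is a generic illustration with bipolar hypervectors, not code from the survey:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # high dimensionality makes random hypervectors quasi-orthogonal

def rand_hv():
    """Random bipolar (+1/-1) hypervector."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding (elementwise product): result is dissimilar to both inputs."""
    return a * b

def bundle(*hvs):
    """Bundling (elementwise majority): result is similar to each input."""
    return np.sign(np.sum(hvs, axis=0))

dog, cat, other = rand_hv(), rand_hv(), rand_hv()
pets = bundle(dog, cat)
print(pets @ dog / D, pets @ other / D)  # roughly 0.5 vs roughly 0.0
```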
Automatic taxonomy construction from textual documents
TLDR: This thesis proposes an effective framework for automatic domain-specific taxonomy construction from textual documents, consisting of three steps: domain term extraction, taxonomic relation identification, and taxonomy induction. It takes a big-data approach that combines linguistic, statistical, and deep learning methods to address the challenges.

References

SHOWING 1-10 OF 38 REFERENCES
Distributed Vector Representations of Words in the Sigma Cognitive Architecture
TLDR: A new algorithm for learning distributed-vector word representations from large, shallow information resources is described, along with how this algorithm can be implemented via small modifications to Sigma.
Vector Symbolic Architectures: A New Building Material for Artificial General Intelligence
TLDR: Example applications from opposite ends of the AI spectrum (visual map-seeking circuits and structured analogy processing) attest to the generality and power of the VSA approach in building new solutions for AI.
Composition in Distributional Models of Semantics
TLDR: This article proposes a framework for representing the meaning of word combinations in vector space in terms of additive and multiplicative functions, and introduces a wide range of composition models that are evaluated empirically on a phrase similarity task.
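The additive and multiplicative composition functions at the center of that framework are one-liners; a sketch with placeholder 300-dimensional vectors (the paper evaluates many more variants):

```python
import numpy as np

def compose_additive(u, v):
    """p = u + v: the vector-mixture view of composition."""
    return u + v

def compose_multiplicative(u, v):
    """p = u * v (elementwise): keeps dimensions on which both words agree."""
    return u * v

# e.g. composing two word vectors from any distributional model:
u, v = np.random.rand(300), np.random.rand(300)
phrase_add = compose_additive(u, v)
phrase_mult = compose_multiplicative(u, v)
```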
Distributed Representations of Words and Phrases and their Compositionality
TLDR: This paper presents a simple method for finding phrases in text, shows that learning good vector representations for millions of phrases is possible, and describes a simple alternative to the hierarchical softmax called negative sampling.
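Negative sampling reduces each training pair to a few binary classifications. A sketch of the per-pair loss (NumPy, with assumed word/context vectors; not the word2vec source):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_sampling_loss(w, c_pos, c_negs):
    """Skip-gram negative-sampling loss for one (word, context) pair:
    pull w toward the observed context, push it away from k sampled negatives."""
    loss = -np.log(sigmoid(w @ c_pos))
    for c_neg in c_negs:
        loss -= np.log(sigmoid(-(w @ c_neg)))
    return float(loss)
```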
Supporting inferences in semantic space: representing words as regions
TLDR: This paper presents a model for learning a region representation for word meaning in semantic space, based on the fact that points at close distance tend to represent similar meanings, and shows that this model can be used to predict, with high precision, when a hyponymy-based inference rule is applicable.
Representing words as regions in vector space
TLDR: This paper discusses two models that represent word meaning as regions in vector space and finds that both models perform at over 95% F-score on a token classification task.
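A simple way to make the region idea concrete (a generic sketch, not either paper's exact model) is a centroid-plus-radius ball fitted to a word's token vectors; token classification becomes a distance test, and a hyponymy-style inference can check ball containment:

```python
import numpy as np

class Region:
    """A word meaning as a ball in vector space: centroid plus radius."""
    def __init__(self, token_vectors, slack=1.0):
        self.centroid = np.mean(token_vectors, axis=0)
        dists = np.linalg.norm(token_vectors - self.centroid, axis=1)
        self.radius = slack * dists.max()

    def contains(self, v):
        """Token classification: does vector v fall inside the region?"""
        return np.linalg.norm(v - self.centroid) <= self.radius

def plausibly_hyponym(sub, sup):
    """Hyponymy-style test: sub's ball lies entirely inside sup's ball."""
    gap = np.linalg.norm(sub.centroid - sup.centroid)
    return gap + sub.radius <= sup.radius
```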
From Vectors to Symbols to Cognition: The Symbolic and Sub-Symbolic Aspects of Vector-Symbolic Cognitive Models
TLDR: It is argued that cognitive models implemented in vector-symbolic architectures (VSAs) intrinsically operate at both levels and thus provide a needed bridge to a full theoretical understanding of a cognitive process.
Semantic feature production norms for a large set of living and nonliving things
TLDR: A set of feature norms collected from approximately 725 participants for 541 living (e.g., dog) and nonliving (e.g., chair) basic-level concepts, the largest such set of norms developed to date, is described; the norms are made available to facilitate other research, obviating the need to repeat the labor-intensive methods involved in collecting and analyzing them.
A Neurally Plausible Encoding of Word Order Information into a Semantic Vector Space
TLDR: A simplified version of convolution encoding that can replicate many of the important functional properties of Jones and Mewhort’s (2007) method is developed, and it is concluded that this new encoding method is a more neurally plausible alternative than its predecessors.
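Circular convolution, the binding operation behind holographic encodings like the one discussed here, is cheap to compute through the FFT. A minimal sketch (generic HRR-style binding, not the paper's simplified variant):

```python
import numpy as np

def cconv(a, b):
    """Circular convolution: bind two vectors into one of the same dimension."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def ccorr(a, b):
    """Circular correlation: approximately invert the binding."""
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

d = 1024
rng = np.random.default_rng(1)
x = rng.normal(0, 1 / np.sqrt(d), d)
y = rng.normal(0, 1 / np.sqrt(d), d)
y_hat = ccorr(x, cconv(x, y))  # noisy copy of y; clean up against a lexicon
print(np.dot(y_hat, y) / (np.linalg.norm(y_hat) * np.linalg.norm(y)))
```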
Reasoning With Neural Tensor Networks for Knowledge Base Completion
TLDR: An expressive neural tensor network suitable for reasoning over relationships between two entities, given a subset of the knowledge base, is introduced, and it is shown that performance can be improved when entities are represented as an average of their constituent word vectors.
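The bilinear scoring function at the heart of that model is compact enough to sketch with random placeholder parameters (the real model learns them; dimensions here are illustrative):

```python
import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Neural tensor network score for an (entity1, relation, entity2) triple:
    u^T tanh(e1^T W[k] e2 + V [e1; e2] + b), one bilinear slice per k."""
    bilinear = np.array([e1 @ W_k @ e2 for W_k in W])
    linear = V @ np.concatenate([e1, e2])
    return float(u @ np.tanh(bilinear + linear + b))

d, k = 100, 4
rng = np.random.default_rng(2)
e1, e2 = rng.normal(size=d), rng.normal(size=d)  # e.g. averaged word vectors
W = rng.normal(size=(k, d, d))
V = rng.normal(size=(k, 2 * d))
b, u = rng.normal(size=k), rng.normal(size=k)
print(ntn_score(e1, e2, W, V, b, u))
```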