• Corpus ID: 9279515

Towards efficiently supporting large symbolic declarative memories

@inproceedings{Derbinsky2010TowardsES,
  title={Towards efficiently supporting large symbolic declarative memories},
  author={Nate Derbinsky and John E. Laird and Bryan Smith},
  year={2010}
}
Efficient access to large declarative memories is one challenge in the development of large-scale cognitive models. Prior work has provided an initial demonstration of declarative retrievals using ACT-R and a relational database. In this paper, we provide extended analysis of the computational challenges involved. We detail data structures and algorithms for an efficient mechanism over a large set of retrievals, as well as for a class of activation bias. We have implemented this work in Soar… 
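The abstract's mention of data structures and algorithms for efficient retrieval with an activation bias can be pictured with a minimal sketch. The code below is an illustration only, not the paper's implementation; the class, feature encoding, and parameter names are assumptions. It pairs an inverted index over memory-element features with an ACT-R-style base-level activation score that biases which matching element is returned.

```python
import math
from collections import defaultdict

class DeclarativeMemory:
    """Illustrative sketch only: cue-based retrieval over feature-indexed
    elements, biased by ACT-R-style base-level activation. Class and method
    names are assumptions, not the paper's actual data structures."""

    def __init__(self, decay=0.5):
        self.decay = decay                 # base-level decay parameter d
        self.index = defaultdict(set)      # feature -> ids of elements with that feature
        self.elements = {}                 # id -> frozenset of (attribute, value) features
        self.accesses = defaultdict(list)  # id -> times the element was stored/retrieved

    def add(self, elem_id, features, time):
        self.elements[elem_id] = frozenset(features)
        for f in features:
            self.index[f].add(elem_id)
        self.accesses[elem_id].append(time)

    def activation(self, elem_id, now):
        # Base-level activation: B_i = ln( sum_j (now - t_j)^(-d) )
        return math.log(sum((now - t) ** -self.decay
                            for t in self.accesses[elem_id] if t < now))

    def retrieve(self, cue, now):
        """Return the id of the element that matches every cue feature and has
        the highest base-level activation, or None if nothing matches."""
        postings = sorted((self.index.get(f, set()) for f in cue), key=len)
        candidates = set(postings[0]) if postings else set()
        for p in postings[1:]:
            candidates &= p
        if not candidates:
            return None
        best = max(candidates, key=lambda i: self.activation(i, now))
        self.accesses[best].append(now)    # a retrieval is itself an access
        return best

# Hypothetical usage
dm = DeclarativeMemory()
dm.add("e1", [("color", "red"), ("shape", "cube")], time=1.0)
dm.add("e2", [("color", "red"), ("shape", "ball")], time=2.0)
print(dm.retrieve([("color", "red")], now=5.0))    # "e2": more recent access, higher activation
```

Intersecting posting lists starting from the rarest cue feature keeps the candidate set small, which is the usual reason inverted indexes scale to large stores; the set-valued indexing work cited in the references explores more sophisticated variants of the same idea.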

Figures and Tables from this paper

Figure 1: Synthetic cue sweep results
Figure 2: Synthetic failure sweep results

Citations

Performance evaluation of declarative memory systems in Soar
TLDR
This paper evaluates the declarative memories of Soar: working memory, semantic memory, and episodic memory, using a detailed simulation of a mobile robot running for one hour of real time, indicating that the implementation is sufficient for tasks of this length.
Functional Interactions Between Memory and Recognition Judgments
TLDR
It is demonstrated that the recognition judgment — a heuristic for whether memory structures have been previously perceived — can serve as a low-cost indicator of the existence of potentially relevant knowledge.
A Functional Analysis of Historical Memory Retrieval Bias in the Word Sense Disambiguation Task
TLDR
This paper evaluates the functional benefit of a set of memory retrieval heuristics that incorporate human memory retrieval biases, in the context of the word sense disambiguation task, in which an agent must identify the most appropriate word meaning in response to an ambiguous linguistic cue.
A preliminary functional analysis of memory in the word sense disambiguation task
TLDR
The functional benefit of two forms of memory retrieval bias, recency and frequency of memory access, is demonstrated, and a preliminary evaluation of heuristics to efficiently support these biases in memory systems is presented.
Characterizing an Analogical Concept Memory for Newellian Cognitive Architectures
TLDR
A new long-term declarative memory for Soar that leverages computational models of analogical reasoning and generalization is presented, and it is demonstrated that the learning methods implemented in the proposed memory can quickly learn diverse types of novel concepts that are useful in task execution.
Efficient Computation of Spreading Activation Using Lazy Evaluation
TLDR
The final model uses lazy evaluation to avoid much of the computation normally associated with spreading activation and is significantly faster than a naive model, achieving an average time of 0.43ms per query for a spread to 300 nodes.
Effective and Efficient Management of Soar's Working Memory via Base-Level Activation
TLDR
This paper documents a functionality-driven exploration of automatic working-memory management in Soar and presents empirical results, which demonstrate both that the mechanism performs with little computational overhead and that it helps maintain the reactivity of a Soar agent contending with long-term, autonomous simulated robotic exploration as it reasons using large amounts of acquired information.
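Base-level activation itself is simple to state: an element's activation is B = ln(Σ_j (t_now − t_j)^(−d)) over its past access times, and automatic working-memory management removes elements whose activation falls below a threshold. The sketch below is only a guess at how overhead can stay low, not the paper's documented mechanism: for an element with a single access, the time at which its activation crosses the threshold has a closed form, so removal can be scheduled in advance rather than rechecked on every decision cycle.

```python
import math

DECAY = 0.5        # base-level decay rate d (ACT-R's conventional default)
THRESHOLD = -2.0   # hypothetical activation threshold below which an element is removed

def base_level(access_times, now, d=DECAY):
    """B = ln( sum_j (now - t_j)^(-d) ) over past access times t_j < now."""
    return math.log(sum((now - t) ** -d for t in access_times if t < now))

def expiration_time(last_access, d=DECAY, theta=THRESHOLD):
    """For a single access, ln((t - last_access)^(-d)) < theta exactly when
    t - last_access > exp(-theta / d), so the crossing time is closed-form."""
    return last_access + math.exp(-theta / d)

# An element touched once at t=10 with d=0.5 and theta=-2 can be scheduled
# for removal at t ~= 10 + e^4 ~= 64.6 without polling it every cycle.
print(expiration_time(10.0))
```

Elements with several accesses need a numeric search for the crossing time, but the same schedule-ahead idea applies.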
Spontaneous Retrieval from Long-Term Memory for a Cognitive Architecture
TLDR
Empirical evidence is provided in the Missing Link word-puzzle domain, where agents using spontaneous retrieval outperform agents without that capability, leading to the conclusion that spontaneous retrieval can be a useful mechanism and is worth further exploration.
Extending Semantic and Episodic Memory to Support Robust Decision Making
  • J. Laird
  • Computer Science, Psychology
  • 2013
TLDR
For episodic memory, research has led to significant improvements in the efficiency of storage and retrieval through the exploitation of temporal contiguity, structural regularity, high cue structural selectivity, high temporal selectivity and low cue feature co-occurrence.
...
...

References

Large Declarative Memories in ACT-R
TLDR
This paper motivates and describes the interfacing of the ACT-R cognitive architecture with a relational database to support large declarative memories within ACT-R models.
Extending the Soar Cognitive Architecture
TLDR
This paper presents the cognitive architecture approach to general intelligence and the traditional, symbolic Soar architecture, followed by major additions to Soar: non-symbolic representations, new learning mechanisms, and long-term memories.
An integrated theory of the mind.
TLDR
The perceptual-motor modules, the goal module, and the declarative memory module are presented as examples of specialized systems in ACT-R, which consists of multiple modules that are integrated to produce coherent cognition.
Production Matching for Large Learning Systems
TLDR
An improved match algorithm, Rete/UL, is presented, which is a general extension of the existing state-of-the-art Rete match algorithm and scales well on a significantly broader class of systems than existing match algorithms.
WordNet: A Lexical Database for English
TLDR
WordNet provides a more effective combination of traditional lexicographic information and modern computing, and is an online lexical database designed for use under program control.
CYC: a large-scale investment in knowledge infrastructure
TLDR
The fundamental assumptions of doing such a large-scale project are examined, the technical lessons learned by the developers are reviewed, and the range of applications that are or soon will be enabled by the technology is surveyed.
A combination of trie-trees and inverted files for the indexing of set-valued attributes
TLDR
The proposed index superimposes a trie-tree on top of an inverted file that indexes a relation with set-valued data and shows that it can efficiently answer superset, subset and equality queries by indexing only a subset of the most frequent items that occur in the indexed relation.
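As a rough, simplified sketch of the kind of indexing this reference describes (the trie layer and the restriction to frequent items are omitted; all names here are assumptions), a plain inverted file over set-valued attributes already answers superset and subset queries:

```python
from collections import defaultdict

class SetIndex:
    """Simplified inverted file over set-valued attributes (no trie layer,
    no frequency-based pruning); for illustration only."""

    def __init__(self):
        self.postings = defaultdict(set)   # item -> ids of stored sets containing it
        self.sets = {}                     # id -> stored set

    def add(self, set_id, items):
        self.sets[set_id] = frozenset(items)
        for item in items:
            self.postings[item].add(set_id)

    def superset_query(self, query):
        """Stored sets containing every item of `query`: intersect posting lists."""
        lists = sorted((self.postings.get(i, set()) for i in query), key=len)
        if not lists:
            return set(self.sets)          # empty query: every stored set qualifies
        result = set(lists[0])
        for p in lists[1:]:
            result &= p
        return result

    def subset_query(self, query):
        """Stored sets contained in `query`: union posting lists, then verify."""
        query = frozenset(query)
        candidates = set().union(*(self.postings.get(i, set()) for i in query))
        return {s for s in candidates if self.sets[s] <= query}
```

An equality query is the intersection of the two results; the cited trie-plus-frequent-items structure exists precisely to avoid scanning the long posting lists that this naive version walks in full.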
Towards a standard upper ontology
TLDR
The strategy used to create the current version of the SUMO is outlined, some of the challenges that were faced in constructing the ontology are discussed, and its most general concepts and the relations between them are described.
...
...