Chunking mechanisms in human learning
Fernand R. Gobet, Peter Lane, Steve Croker, Peter C.-H. Cheng, Julian M. Pine
Trends in Cognitive Sciences

The Evolution of Chunks in Sequence Learning
Two chunk-reorganization mechanisms are identified: the recombination of preexisting chunks and the concatenation of two distinct chunks into a single one. Both mechanisms evolve during the course of the experiment, with chunks becoming progressively fewer and longer.
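The two reorganization mechanisms can be sketched in a few lines (a hypothetical illustration, not the authors' implementation; the chunk contents and split points below are invented for the example):

```python
# Hypothetical sketch of the two mechanisms: concatenation joins two
# distinct chunks into one longer chunk, while recombination builds a
# new chunk from parts of preexisting ones. Chunks are tuples of symbols.

def concatenate(chunk_a, chunk_b):
    """Merge two distinct chunks into a single, longer chunk."""
    return chunk_a + chunk_b

def recombine(chunk_a, chunk_b, split_a, split_b):
    """Form a new chunk from the head of one chunk and the tail of another."""
    return chunk_a[:split_a] + chunk_b[split_b:]

# Over the course of practice, the repertoire drifts toward fewer,
# longer chunks, e.g.:
chunks = [("A", "B"), ("C", "D")]
chunks = [concatenate(chunks[0], chunks[1])]  # one chunk of length 4
```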
Towards a model of expectation-driven perception
Simulations with the CHREST model using semantic associations describe the creation of links between visual and verbal information and the role of heuristics in guiding the simulated eye.
This study tested predictions of chunk-based theories of expertise using fMRI while chess players performed a recognition memory task, predicting that the task would activate working memory areas in the frontal and parietal lobes.
In search of templates
Attention Mechanisms in the CHREST Cognitive Architecture
The attention mechanisms in CHREST, a computational architecture of human visual expertise, are described, along with experimental evidence demonstrating the correspondence between CHREST's perceptual mechanisms and those of human subjects.
Forming Concepts of Mozart and Homer Using Short-Term and Long-Term Memory: A Computational Model Based on Chunking
The proposed model inherits CHREST's architecture, with its integrated STM/LTM stores, while adding a moving attention window and an "LTM chunk activation" mechanism. These additions address the overly destructive primacy effect in discrimination-network-based architectures and extend chunking theory to account for the learning, retrieval and categorisation of complex sequential symbolic patterns.
Learning in the CHREST Cognitive Architecture: Theoretical Background
CHREST is a cognitive architecture that closely simulates learning and the acquisition of expertise in humans, with an emphasis on bounded rationality, a close link between perception, learning, memory, and decision making, and the use of naturalistic data as input for learning.
MDLChunker: A MDL-Based Cognitive Model of Inductive Learning
Results show that the model makes precise quantitative predictions both about the kinds of chunks participants create and about the moments at which these creations occur; the simplicity principle used to design MDLChunker is suggested to be particularly effective for modelling chunking mechanisms.
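The simplicity principle can be illustrated with a minimal, hypothetical sketch (not MDLChunker's published algorithm; a uniform code over all tokens is assumed): a candidate chunk is adopted only when defining it once in a codebook and replacing its occurrences shortens the total description length.

```python
# Minimal MDL-style chunking sketch: adopt a chunk only if it reduces
# the total description length (sequence bits + codebook bits).
from math import log2

def dl(seq, codebook, alphabet_size):
    """Bits to encode `seq` plus its codebook, with a uniform code over
    `alphabet_size` distinct tokens (base symbols + chunk names)."""
    bits = log2(alphabet_size)
    code_cost = sum(len(chunk) * bits for chunk in codebook)  # define each chunk once
    return len(seq) * bits + code_cost

def rewrite(seq, chunk):
    """Greedily replace occurrences of `chunk` (a tuple) with one token."""
    out, i = [], 0
    while i < len(seq):
        if tuple(seq[i:i + len(chunk)]) == chunk:
            out.append(chunk)  # the chunk itself serves as the new token
            i += len(chunk)
        else:
            out.append(seq[i])
            i += 1
    return out

def maybe_chunk(seq, chunk, codebook, n_base_symbols):
    """Adopt `chunk` only if it shortens the total description length."""
    old = dl(seq, codebook, n_base_symbols + len(codebook))
    new_seq = rewrite(seq, chunk)
    new = dl(new_seq, codebook + [chunk], n_base_symbols + len(codebook) + 1)
    return (new_seq, codebook + [chunk]) if new < old else (seq, codebook)
```

On a repetitive sequence such as "abcabcabcabc" the chunk ('a', 'b', 'c') pays for its codebook entry and is adopted; a chunk that occurs only once does not, and is rejected.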
Statistically Induced Chunking Recall: A Memory-Based Approach to Statistical Learning
This work develops a novel paradigm for testing statistical learning by leveraging a robust phenomenon from serial recall tasks: that short-term memory is fundamentally shaped by long-term distributional learning. It demonstrates that statistically induced chunking recall (SICR) effectively captures learning in both the auditory and visual modalities.
Sequence learning recodes cortical representations instead of strengthening initial ones
It is shown that associative learning can theoretically store only a very limited number of overlapping sequences, such as those common in ecological working-memory tasks, and hence that an efficient learner should recode initial sequence representations.


Expert chess memory: revisiting the chunking hypothesis.
It is concluded that the two-second inter-chunk interval used to define chunk boundaries is robust, and that chunks have psychological reality.
How Big Is a Chunk?
It is shown that, by viewing experimentation in a parameter-estimating paradigm instead of a hypothesis-testing paradigm, one can obtain much more information from experiments—information that, combined with contemporary theoretical models of the cognitive processes, has implications for human performance on tasks quite different from those of the original experiments.
Learning novel sound patterns
An EPAM-based computational model that implements the phonological loop represents a parsimonious approach to learning novel sound patterns and provides a more precise definition of how vocabulary acquisition may occur.
The simulation of verbal learning behavior
  • E. Feigenbaum
  • Psychology, Computer Science
    IRE-AIEE-ACM '61 (Western)
  • 1961
The EPAM program is a precise statement of an information-processing theory of verbal learning that provides an alternative to other proposed verbal-learning theories; it results from an attempt to state, quite precisely, a parsimonious and plausible mechanism sufficient to account for the rote learning of nonsense syllables.
Information-processing analysis of perceptual processes in problem solving.
The theory is particularized in a computer program to simulate the eye movements of subjects choosing a move in chess, and its consistency is shown with data on memory of chess positions and with existing knowledge of short-term memory parameters.
Some shortcomings of long-term working memory.
  • F. Gobet
  • Psychology
    British journal of psychology
  • 2000
It is argued that Ericsson and Kintsch's concept of a retrieval structure conflates three types of memory structure with quite different properties, and that one of these types, generic general-purpose retrieval structures, has a narrower use than proposed, applying only in domains where individuals consciously and deliberately intend to improve their memory.
A computer model of chess memory
This paper presents a model that shares several concepts with an earlier attempt but features several new attributes: dynamic short-term memory, recursive chunking, more sophisticated perceptual mechanisms and the use of a retrieval structure.
Simulation of expert memory using EPAM IV.
EPAM IV reproduces all of the phenomena explained previously by EPAM III and in addition gives an accurate detailed account of the performance of an expert recalling long sequences of digits.