A neurobehavioral model of flexible spatial language behaviors.

@article{Lipinski2012ANM,
  title={A neurobehavioral model of flexible spatial language behaviors},
  author={John Lipinski and Sebastian Schneegans and Yulia Sandamirskaya and John P. Spencer and Gregor Sch{\"o}ner},
  journal={Journal of Experimental Psychology: Learning, Memory, and Cognition},
  year={2012},
  volume={38},
  number={6},
  pages={1490--1511}
}
We propose a neural dynamic model that specifies how low-level visual processes can be integrated with higher level cognition to achieve flexible spatial language behaviors. This model uses real-world visual input that is linked to relational spatial descriptions through a neural mechanism for reference frame transformations. We demonstrate that the system can extract spatial relations from visual scenes, select items based on relational spatial descriptions, and perform reference object… 
A neural dynamic model for the perceptual grounding of spatial and movement relations
TLDR
This model explains how sequences of decisions emerge from the time- and state-continuous neural dynamics, how relational hypotheses are generated and either accepted or rejected, followed by the selection of new objects or the generation of new relational hypotheses.
A Neural Dynamic Architecture Resolves Phrases about Spatial Relations in Visual Scenes
TLDR
An autonomous neural dynamics is presented that achieves this mapping of spatial language flexibly, generating a response based on visual input about a scene.
A Neurodynamic Architecture for the Autonomous Control of a Spatial Language System
TLDR
A general account is presented of how complex cognitive tasks that flexibly combine elementary operations can be controlled in a neurally grounded architecture, together with the integration of the two systems.
Grounding Spatial Language in Perception by Combining Concepts in a Neural Dynamic Architecture
TLDR
A neural dynamic architecture is presented that grounds in perception sentences combining multiple concepts through nested spatial relations, autonomously generating sequences of processing steps in continuous time based solely on highly recurrent connectivity.
A Neural Dynamic Model Generates Descriptions of Object-Oriented Actions
TLDR
This work demonstrates how the dynamic neural processes autonomously generate the processing steps required to describe or ground object-oriented actions in a neural dynamic model that comprises dynamic neural fields holding representations close to sensorimotor surfaces as well as dynamic neural nodes holding discrete, language-like representations.
A neural-dynamic architecture for flexible spatial language: Intrinsic frames, the term “between”, and autonomy
TLDR
This work introduces autonomous selection between viewer-centered and intrinsic reference frames, and enhances the autonomy of the system so that the required sequence of attentional shifts, coordinate transforms, and selection decisions emerges from the time-continuous neural dynamics.
How do neural processes give rise to cognition? Simultaneously predicting brain and behavior with a dynamic model of visual working memory.
TLDR
In an exemplary study of visual working memory, multilevel Bayesian statistics are used to demonstrate that a neural dynamic model simultaneously explains behavioral data and predicts localized patterns of brain activity, outperforming standard analytic approaches to fMRI.
Dynamic interactions between visual working memory and saccade target selection.
TLDR
A dynamic neural field model is presented that captures the neural processes underlying visual perception, working memory, and saccade planning relevant to the psychophysical experiment, and its predictions were empirically confirmed in a new experiment: Memory for a sample color was biased toward the color of a task-irrelevant saccade target object, demonstrating the bidirectional coupling between visual working memory and perceptual processing.

References

Showing 1-10 of 101 references
A Dynamic Neural Field Model of Visual Working Memory and Change Detection
TLDR
A layered neural architecture is described that implements encoding and maintenance, and links these processes to a plausible comparison process that makes the novel prediction that change detection will be enhanced when metrically similar features are remembered.
Representing spatial relationships in posterior parietal cortex: single neurons code object-referenced position.
TLDR
Recorded neural activity in parietal area 7a of monkeys performing an object construction task found that neurons were activated as a function of the spatial relationship between a task-critical coordinate and a reference object.
Toward a formal theory of flexible spatial behavior: geometric category biases generalize across pointing and verbal response types.
TLDR
Data suggest that the same spatial memory process gives rise to both response types in this task, and challenge accounts that posit separate spatial systems for motor and verbal responses.
A computational perspective on the neural basis of multisensory spatial representations
TLDR
It is argued that current theories of multisensory representations are inconsistent with the existence of a large proportion of multimodal neurons with gain fields and partially shifting receptive fields, and an alternative theory, developed and reviewed here, has important implications for the idea of 'frame of reference' in neural spatial representations.
Corresponding delay-dependent biases in spatial language and spatial memory
TLDR
It is shown that additional perceptual structure along the vertical axis reduces delay-dependent effects in both tasks, indicating that linguistic and non-linguistic spatial systems depend on shared underlying representational processes.
Attention Unites Form and Function in Spatial Language
TLDR
It is argued that preferential attention to the functional parts of objects may explain effects of object function on the interpretation of spatial terms, and is shown empirically and computationally using an attentional model of spatial language.
Separate visual pathways for perception and action