Visual Search Is Modulated by Action Intentions

@article{Bekkering2002VisualSI,
  title={Visual Search Is Modulated by Action Intentions},
  author={Harold Bekkering and Sebastiaan F. W. Neggers},
  journal={Psychological Science},
  year={2002},
  volume={13},
  pages={370--374}
}
The influence of action intentions on visual selection processes was investigated in a visual search paradigm. A predefined target object with a certain orientation and color was presented among distractors, and subjects had to either look and point at the target or look at and grasp the target. Target selection processes prior to the first saccadic eye movement were modulated by the different action intentions. Specifically, fewer saccades to objects with the wrong orientation were made in the… 


Neural and temporal dynamics underlying visual selection for action.
TLDR
It was found that grasping, compared with pointing, resulted in a stronger N1 component and a subsequent selection negativity, both localized to the lateral occipital complex, suggesting that the intention to grasp influences the processing of action-relevant features in ventral stream areas already at an early stage.
Action Properties of Object Images Facilitate Visual Search
TLDR
Results suggest that action properties in images, and constraints for action imposed by preferences for manual interaction with objects, can influence attentional selection in the context of visual search.
Action intention modulates the representation of object features in early visual cortex
TLDR
Results demonstrate that task-specific preparatory signals modulate activity not only in areas typically involved in perception for action but also in early visual cortex (EVC), and suggest that object features relevant for successful action performance are represented in the part of visual cortex best suited to process visual features in great detail, such as the foveal cortex, even if the objects are viewed in the periphery.
Faster recognition of graspable targets defined by orientation in a visual search task
TLDR
It was hypothesized that visual stimuli that afford action, which are known to potentiate activity in the dorsal visual stream, would be associated with greater alterations in visual processing when presented near the hand; the findings suggest that object affordances may also potentiate early visual processes necessary for object recognition.
Selective weighting of action-related feature dimensions in visual working memory
TLDR
Findings reveal that a weighting of information in visual working memory according to action relevance can even be implemented at the representational level during maintenance, demonstrating that one's own actions continue to influence visual processing beyond the perceptual stage.
Selection-for-action in visual search.
How you move is what you see: action planning biases selection in visual search.
TLDR
An integrative model of visual search that incorporates input from action-planning processes is proposed, which implies that action-related weighting is not independent of task-relevance weighting.
Action Intentions Modulate Allocation of Visual Attention: Electrophysiological Evidence
TLDR
The results showed that the behavioral congruency effects were reflected in a modulation of the P1 component as well as the N2pc (an ERP marker of spatial attention), supporting the argument that action planning modulates early perceptual processing and attention mechanisms.
Object manipulation and motion perception: evidence of an influence of action planning on visual processing.
TLDR
A motor-visual priming effect of prepared object manipulations on visual motion perception is demonstrated, indicating a bidirectional functional link between action and perception beyond object-related visuomotor associations.
Simple action planning can affect attentional allocation in subsequent visual search
TLDR
It was found that the same features of the object were prioritized in the subsequent search task when participants had planned an action response on the object, compared with when they had not, even when the feature was irrelevant to the task or to the required action.
...
...

References

Showing 1–10 of 31 references
Integration of visual and somatosensory target information in goal-directed eye and arm movements
TLDR
A schematic model of sensorimotor transformations for saccadic eye and goal-directed hand movements is proposed and possible shared mechanisms of the two motor systems are discussed.
Interference between saccadic eye and goal-directed hand movements
TLDR
The aim of the present study was to investigate the nature of the interference effect when the eye movement is accompanied by a goal-directed hand movement rather than when the eye moves alone; the absence of an interference effect adds weight to the argument that visual spatial attentional mechanisms involved in target localization constitute the locus of the interference.
Saccade Target Selection During Visual Search
Responses of neurons in inferior temporal cortex during memory-guided visual search.
TLDR
The results support a "biased competition" model of attention, according to which objects in the visual field compete for representation in the cortex, and this competition is biased in favor of the behaviorally relevant object by virtue of "top-down" feedback from structures involved in working memory.
Reaction time latencies of eye and hand movements in single- and dual-task conditions
TLDR
The finding that saccadic eye movements and button-press responses in the dual-task condition could be initiated without delay relative to the single-task conditions supports the specific interference interpretation.
Guided Search 2.0: A revised model of visual search
  • J. Wolfe
  • Psychonomic Bulletin & Review
  • 1994
TLDR
This paper reviews the visual search literature and presents Guided Search 2.0, a revised model of human search behavior in which virtually all aspects of the original Guided Search model have been made more explicit and/or revised in light of new data.
Detection by action: neuropsychological evidence for action-defined templates in search
TLDR
The data suggest that affordances can be effective even when a brain lesion limits the use of other properties in search tasks, and give evidence for a direct pragmatic route from vision to action in the brain.
Neuronal activity in the ventral part of premotor cortex during target-reach movement is modulated by direction of gaze.
TLDR
The data suggest that part of the coordinate transformation of motor command signals concerning the direction of reaching, from a retinotopic to a body-centered frame of reference, may occur at the level of the premotor cortex but not in the primary motor cortex (MI).
VAM: A neuro-cognitive model for visual attention control of segmentation, object recognition, and space-based motor action
TLDR
VAM is a new neuro-cognitive model of visual attention control of segmentation, object recognition, and space-based motor action that solves the “inter- and intra-object-binding problem”.
...
...