Task and context determine where you look.

@article{Rothkopf2007TaskAC,
  title={Task and context determine where you look},
  author={Constantin A. Rothkopf and Dana H. Ballard and Mary M. Hayhoe},
  journal={Journal of Vision},
  year={2007},
  volume={7},
  number={14},
  pages={16.1-20}
}
The deployment of human gaze has been studied almost exclusively independently of any specific ongoing task and has been limited to two-dimensional picture viewing. This contrasts with its use in everyday life, which mostly consists of purposeful tasks in which gaze is crucially involved. To better understand the deployment of gaze under such circumstances, we devised a series of experiments in which subjects navigated along a walkway in a virtual environment and executed combinations of approach and avoidance… 
Modelling Task-Dependent Eye Guidance to Objects in Pictures
We introduce a model of attentional eye guidance based on the rationale that the deployment of gaze is to be considered in the context of a general action-perception loop relying on two strictly
Control of gaze while walking: Task structure, reward, and uncertainty
While it is universally acknowledged that both bottom-up and top-down factors contribute to the allocation of gaze, we currently have limited understanding of how top-down factors determine gaze choices
Eye movements reveal spatiotemporal dynamics of visually-informed planning in navigation
The findings suggest that the spatiotemporal characteristics of eye movements during navigation are significantly shaped by the unique cognitive computations underlying real-world, sequential decision making.
Adaptive Gaze Strategies to Reduce Environmental Uncertainty During a Sequential Visuomotor Behaviour
It is shown that gaze is allocated to reduce uncertainty about target locations, that this allocation depends on the value of the information gain for successful task performance, and that the spatiotemporal pattern of gaze used to resolve uncertainty changes as the motor behaviour evolves, indicating a flexible strategy for planning and controlling movement.
Models of gaze control for manipulation tasks
This work presents new computational models of gaze shifting, where the agent imagines ahead in time the informational effects of possible gaze fixations, and compares the hand-eye coordination timings of the models in a robot simulation to those obtained from human data.
Gaze control for visually guided manipulation
Three computational models of gaze shifting are formulated and characterised, each using lookahead to imagine the informational effects of possible gaze fixations; evidence is provided that only the models incorporating both uncertainty and reward match human data.
Gaze Behavior in a Natural Environment with a Task-Relevant Distractor: How the Presence of a Goalkeeper Distracts the Penalty Taker
An investigation of where participants direct their gaze in a natural environment with multiple potential fixation targets that differ in task relevance and salience showed that the early phase of the run-up appears to be driven both by the salience of the stimulus setting and by the need to perform a spatial calibration of the environment.
Gaze control and memory for objects while walking in a real world environment
Assessments of gaze behaviour and object memory are typically done in the context of experimental paradigms briefly presenting transient static images of synthetic or real scenes. Less is known about
Eye guidance in natural vision: reinterpreting salience.
It is argued that there is a need to move away from this class of model and find the principles that govern gaze allocation in a broader range of settings, because the stimulus context is limited, and the dynamic, task-driven nature of vision is not represented.

References

Showing 1-10 of 95 references
The Roles of Vision and Eye Movements in the Control of Activities of Daily Living
Although the actions of tea-making are 'automated' and proceed with little conscious involvement, the eyes closely monitor every step of the process, suggesting that this type of unconscious attention must be a common phenomenon in everyday life.
Oculomotor behavior in natural and man-made environments
Object Recognition and Goal-Directed Eye or Hand Movements are Coupled by Visual Attention
It is concluded that it is not possible to maintain attention on a stimulus for the purpose of discrimination while preparing a movement to a spatially separate object.
What controls attention in natural environments?
In what ways do eye movements contribute to everyday activities?
What you see is what you need.
The experiments suggest a highly purposive and task-specific nature of human vision, in which information extracted from the fixation point is used for certain computations only "just in time", when needed to solve the current goal.
Visual memory and motor planning in a natural task.
This paper investigates the temporal dependencies of natural vision by measuring eye and hand movements while subjects made a sandwich, suggesting that much natural vision can be accomplished with "just-in-time" representations.
Visual motion and attentional capture
It is argued that when motion segregates a perceptual element from a perceptual group, a new perceptual object is created and this event captures attention, suggesting that motion as such does not capture attention but that the appearance of a new perceptual object does.
Eye–Hand Coordination in Object Manipulation
Analysis of the coordination between gaze behavior, fingertip movements, and movements of the manipulated object, as subjects reached for and grasped a bar and moved it to press a target switch, leads to the conclusion that gaze supports hand-movement planning by marking key positions to which the fingertips or grasped object are subsequently directed.