Corpus ID: 231846371

Predicting Eye Fixations Under Distortion Using Bayesian Observers

@article{Tu2021PredictingEF,
  title={Predicting Eye Fixations Under Distortion Using Bayesian Observers},
  author={Zhengzhong Tu},
  journal={ArXiv},
  year={2021},
  volume={abs/2102.03675}
}
Visual attention is an essential factor in how humans perceive visual signals. This report investigates how distortions in an image can distract human visual attention, using Bayesian visual search models, specifically the maximum-a-posteriori (MAP) [1] [2] and entropy-limit-minimization (ELM) [3] observers, which predict eye fixation movements within a Bayesian probabilistic framework. Experiments with modified MAP and ELM models on JPEG-compressed images containing blocking or ringing… 
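
To make the two fixation-selection rules concrete, below is a minimal NumPy sketch of a Bayesian searcher with interchangeable MAP and ELM rules, in the spirit of the Najemnik-Geisler framework these models build on. This is not the report's implementation: the eccentricity fall-off in `dprime`, the normalized template-response model (mean ±0.5, standard deviation 1/d'), the parameter values, and the stopping threshold are all illustrative assumptions.

```python
import numpy as np

def dprime(candidates, fixation, d0=3.0, falloff=0.2):
    """Target detectability d' at each candidate location for a given fixation
    point, modeled as a simple isotropic fall-off with eccentricity (an
    assumed functional form, not a calibrated visibility map)."""
    ecc = np.linalg.norm(candidates - fixation, axis=1)
    return d0 / (1.0 + falloff * ecc)

def simulate_search(candidates, true_target, rule="ELM", max_fixations=20,
                    threshold=0.95, seed=0):
    """Run one Bayesian fixation search using either the MAP or the ELM rule."""
    rng = np.random.default_rng(seed)
    n = len(candidates)
    log_post = np.full(n, -np.log(n))      # uniform prior over target locations
    fixation = candidates.mean(axis=0)     # start at the display center
    fixations = [fixation.copy()]

    for _ in range(max_fixations):
        d = dprime(candidates, fixation)
        # Normalized template responses: mean +0.5 at the true target location,
        # -0.5 elsewhere, standard deviation 1/d' (Najemnik-Geisler-style model).
        mean = np.where(np.arange(n) == true_target, 0.5, -0.5)
        w = rng.normal(mean, 1.0 / d)
        log_post += d ** 2 * w                       # accumulate log evidence
        log_post -= np.logaddexp.reduce(log_post)    # renormalize the posterior
        post = np.exp(log_post)

        if post.max() >= threshold:                  # confident enough: stop
            break

        if rule == "MAP":
            # MAP rule: fixate the currently most probable target location.
            nxt = candidates[int(np.argmax(post))]
        else:
            # ELM rule: fixate the point that maximizes the posterior-weighted
            # sum of squared detectabilities (a proxy for expected entropy reduction).
            gain = [np.sum(post * dprime(candidates, c) ** 2) for c in candidates]
            nxt = candidates[int(np.argmax(gain))]

        fixation = nxt
        fixations.append(fixation.copy())

    return np.array(fixations), np.exp(log_post)

# Example: a 9x9 grid of candidate locations with the target at index 37.
grid = np.stack(np.meshgrid(np.arange(9.0), np.arange(9.0)), axis=-1).reshape(-1, 2)
fixs, posterior = simulate_search(grid, true_target=37, rule="MAP")
print(len(fixs), "fixations; posterior peak =", posterior.max())
```

Swapping `rule="MAP"` for `rule="ELM"` changes only the fixation-selection step, which is what lets the two observers be compared on identical stimuli.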

References

Showing 1-10 of 25 references

Optimal eye movement strategies in visual search

This work derives the ideal Bayesian observer for search tasks in which a target is embedded at an unknown location within a random background that has the spectral characteristics of natural scenes, and finds that humans achieve nearly optimal search performance even though they integrate information poorly across fixations.

Visual search in natural scenes: a double-dissociation paradigm for comparing observer models.

The nELM observer is a useful normative model of fixation search under naturalistic conditions and appears to be a good model of human search in natural scenes; the study also develops a strong double-dissociation test for comparing observer models.

Eye movement statistics in humans are consistent with an optimal search strategy.

This work finds that both human and ideal searchers preferentially fixate locations in a donut-shaped region around the center of the circular search area, while MAP searchers distribute their fixations more uniformly, with low density at the top and bottom.

Simple summation rule for optimal fixation selection in visual search
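
For context, the fixation-selection rule this reference is known for can be sketched as follows, with assumed notation: $p_i(t)$ is the posterior probability that the target is at location $i$ after $t$ fixations, and $d'_{ik}$ is the target's detectability at location $i$ when fixating location $k$:

$$ k_{\mathrm{ELM}}(t+1) \approx \arg\max_{k} \sum_{i} p_i(t)\, d'^{2}_{ik}. $$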

Shifts in selective visual attention: towards the underlying neural circuitry.

This study addresses the question of how simple networks of neuron-like elements can account for a variety of phenomena associated with shifts of selective visual attention, and suggests a possible role for the extensive back-projection from the visual cortex to the LGN.

Quantifying the Performance Limits of Human Saccadic Targeting during Visual Search

The results demonstrate that ideal-observer analysis can be extended to measure the visual information mediating saccadic target-selection decisions during visual search, enabling direct comparison of saccadic and perceptual efficiencies.

A Model of Saliency-Based Visual Attention for Rapid Scene Analysis

A visual attention system, inspired by the behavior and the neuronal architecture of the early primate visual system, is presented, which breaks down the complex problem of scene understanding by rapidly selecting conspicuous locations to be analyzed in detail.

What Can 1 Million Trials Tell Us About Visual Search?

In a typical visual search experiment, observers look through a set of items for a designated target that may or may not be present. Reaction time (RT) is measured as a function of the number of items in the display (the set size).

Making a “Completely Blind” Image Quality Analyzer

This work has recently derived a blind IQA model that only makes use of measurable deviations from statistical regularities observed in natural images, without training on human-rated distorted images, and, indeed, without any exposure to distorted images.