
Depth of Field Rendering Algorithms for Virtual Reality

@inproceedings{Konrad2015DepthOF,
  title={Depth of Field Rendering Algorithms for Virtual Reality},
  author={Robert Konrad},
  year={2015},
  url={https://api.semanticscholar.org/CorpusID:1488600}
}
This project investigates depth of field rendering algorithms that accurately reproduce depth of field (retinal blur) in scenes displayed in virtual reality.

Algorithms for rendering depth of field effects in computer graphics

This paper surveys depth of field approaches in computer graphics, from the technique's introduction to the current state of the art.

Accurate Depth of Field Simulation in Real Time

This work presents a new post-processing method for simulating depth of field based on accurate calculation of circles of confusion; it derives scene depth directly from the existing depth buffer, requires no specialized rendering passes, and integrates easily into existing rendering applications.
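
As a rough illustration of the kind of calculation involved (not the paper's implementation), the sketch below derives a per-pixel circle-of-confusion diameter from a linear depth buffer using the standard thin-lens relation; the function name, parameter names, and units are assumptions.

```python
import numpy as np

def circle_of_confusion(depth, focus_dist, focal_len, aperture_diam):
    """Per-pixel circle-of-confusion diameter (same units as focal_len)
    from a linear depth buffer, via the standard thin-lens relation.

    depth         : 2D array of linear scene depths (e.g. metres)
    focus_dist    : distance to the in-focus plane
    focal_len     : lens focal length
    aperture_diam : lens aperture diameter
    """
    depth = np.maximum(depth, 1e-6)  # guard against division by zero
    return (aperture_diam * np.abs(depth - focus_dist) / depth
            * focal_len / (focus_dist - focal_len))

# Example: a synthetic 4x4 depth buffer, focused at 2 m with a 50 mm, f/2 lens.
depth = np.linspace(0.5, 10.0, 16).reshape(4, 4)
print(circle_of_confusion(depth, focus_dist=2.0,
                          focal_len=0.05, aperture_diam=0.025))
```

The resulting CoC map would then drive a spatially varying blur in a post-processing pass.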

Depth perception with gaze-contingent depth of field

Gaze-contingent depth of field (GC DOF) is found to increase subjective realism and perceived depth, and can contribute to the perception of ordinal depth and of the distance between objects, but it is limited in its accuracy.
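
The mechanism behind gaze-contingent depth of field is only implied by the abstract; a minimal, hypothetical sketch, assuming an eye tracker that reports a fixated pixel and a linear depth buffer (names and the smoothing scheme are illustrative), is to read the depth under the gaze and low-pass it before using it as the focus distance:

```python
def gaze_contingent_focus(depth_buffer, gaze_px, prev_focus, smoothing=0.1):
    """Pick the depth-of-field focus distance from the viewer's gaze.

    depth_buffer : 2D numpy array of linear scene depths
    gaze_px      : (row, col) pixel reported by the eye tracker
    prev_focus   : focus distance used on the previous frame
    smoothing    : exponential smoothing factor to suppress tracker noise
    """
    looked_at_depth = float(depth_buffer[gaze_px])
    # Low-pass the focus distance so micro-saccades don't make the blur flicker.
    return (1.0 - smoothing) * prev_focus + smoothing * looked_at_depth
```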

Depth of field rendering via adaptive recursive filtering

A new post-processing method based on a recursive filtering process adaptively smooths the image frame using local depth and circle-of-confusion information, producing spatially varying blur.
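
The abstract does not give the filter itself; a minimal one-dimensional sketch of an adaptive recursive (exponential) filter whose strength follows the local circle of confusion, assuming a CoC map in pixels and ignoring the intensity-leakage handling a full method would need, could look like this:

```python
import numpy as np

def adaptive_recursive_blur_1d(row, coc, sigma_scale=1.0):
    """One scanline of depth-of-field blur via a first-order recursive
    (exponential) filter whose coefficient follows the local CoC.

    row : 1D array of pixel intensities
    coc : 1D array of circle-of-confusion radii in pixels (0 = in focus)
    """
    # Map CoC to a recursion coefficient in [0, 0.95]: larger CoC -> more smoothing.
    a = np.clip(1.0 - 1.0 / (1.0 + sigma_scale * coc), 0.0, 0.95)

    out = row.astype(np.float64)
    for i in range(1, len(out)):                 # left-to-right pass
        out[i] = (1.0 - a[i]) * out[i] + a[i] * out[i - 1]
    for i in range(len(out) - 2, -1, -1):        # right-to-left pass
        out[i] = (1.0 - a[i]) * out[i] + a[i] * out[i + 1]
    return out

row = np.array([0, 0, 0, 255, 255, 255, 0, 0, 0], dtype=float)
coc = np.array([0, 0, 0, 0, 0, 0, 3, 3, 3], dtype=float)  # blur only the right side
print(adaptive_recursive_blur_1d(row, coc))
```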

A lens and aperture camera model for synthetic image generation

This paper extends the traditional pin-hole camera projection geometry to a more realistic camera model that approximates the effects of a lens and an aperture in an actual camera, allowing the generation of synthetic images that exhibit depth of field, can be focused on an arbitrary plane, and permit selective modeling of certain optical characteristics of a lens.
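
Such lens-and-aperture models are typically built on the thin-lens equation; assuming that standard model (the paper's exact formulation may differ), the imaging relation and the resulting circle of confusion are:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Thin-lens imaging: a point at object distance $s$ focuses at image distance
% $s'$ for a lens of focal length $f$:
\[
  \frac{1}{s} + \frac{1}{s'} = \frac{1}{f}.
\]
% With aperture diameter $A$ and focus distance $s_f$, a point at depth $s$
% is blurred into a circle of confusion of diameter
\[
  c = A \,\frac{\lvert s - s_f \rvert}{s} \cdot \frac{f}{s_f - f},
\]
% matching the per-pixel computation sketched above.
\end{document}
```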

Vergence-accommodation conflicts hinder visual performance and cause visual fatigue.

The display described in the paper is used to evaluate the influence of focus cues on perceptual distortions, fusion failures, and fatigue, showing that when focus cues are correct or nearly correct, the time required to identify a stereoscopic stimulus is reduced, stereoacuity in a time-limited task is increased, and distortions in perceived depth are reduced.

Simulated disparity and peripheral blur interact during binocular fusion.

The results demonstrate that a naturalistic distribution of depth-dependent blur may improve 3-D virtual reality, and that interruptions of this pattern which flatten the distribution of retinal blur may adversely affect binocular fusion.

Bilateral filtering for gray and color images

In contrast with filters that operate on the three bands of a color image separately, a bilateral filter can enforce the perceptual metric underlying the CIE-Lab color space, and smooth colors and preserve edges in a way that is tuned to human perception.
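For reference, a brute-force bilateral filter combines a spatial Gaussian with an intensity-similarity (range) Gaussian; the sketch below is a minimal single-channel version with illustrative parameter names (the paper's color variant applies the range kernel in CIE-Lab):

```python
import numpy as np

def bilateral_filter_gray(img, radius=3, sigma_s=2.0, sigma_r=0.1):
    """Brute-force bilateral filter for a grayscale image in [0, 1].

    Each output pixel is a weighted mean of its neighbours, where the weight
    combines spatial closeness (sigma_s, in pixels) and intensity similarity
    (sigma_r, in intensity units), so smoothing stops at strong edges.
    """
    h, w = img.shape
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)

    # Precompute the spatial Gaussian over the (2r+1) x (2r+1) window.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))

    for y in range(h):
        for x in range(w):
            window = padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            range_w = np.exp(-((window - img[y, x])**2) / (2.0 * sigma_r**2))
            weights = spatial * range_w
            out[y, x] = np.sum(weights * window) / np.sum(weights)
    return out

# Noisy step edge: flat regions are smoothed while the edge at column 8 is kept.
img = np.concatenate([np.zeros((16, 8)), np.ones((16, 8))], axis=1)
img += 0.05 * np.random.default_rng(0).standard_normal(img.shape)
print(bilateral_filter_gray(img)[8, 6:10])
```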

Physiological measures of presence in stressful virtual environments

Physiological reaction satisfied the requirements for a measure of presence, change in skin conductance did so to a lesser extent, and change in skin temperature did not; inclusion of a passive haptic element in the VE significantly increased presence.