Decoupled Space and Time Sampling of Motion and Defocus Blur for Unified Rendering of Transparent and Opaque Objects

@article{Widmer2016DecoupledSA,
  title={Decoupled Space and Time Sampling of Motion and Defocus Blur for Unified Rendering of Transparent and Opaque Objects},
  author={Sven Widmer and Dominik Wodniok and Daniel Thul and Stefan Guthe and Michael Goesele},
  journal={Comput. Graph. Forum},
  year={2016},
  volume={35},
  pages={441-450}
}
  • Sven Widmer, Dominik Wodniok, Daniel Thul, Stefan Guthe, Michael Goesele
  • Published in Comput. Graph. Forum 2016
  • Computer Science
  • We propose a unified rendering approach that jointly handles motion and defocus blur for transparent and opaque objects at interactive frame rates. Our key idea is to create a sampled representation of all parts of the scene geometry that are potentially visible at any point in time for the duration of a frame in an initial rasterization step. We store the resulting temporally-varying fragments (t-fragments) in a bounding volume hierarchy which is rebuilt every frame using a fast spatial median…
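
The abstract mentions a per-frame BVH rebuild over t-fragments using a fast spatial median split. The sketch below is only an illustration of that general technique, not the authors' implementation: the structure names (AABB, TFragment, BVHNode), the leaf size, and the fragment layout are all assumptions introduced here for clarity.

// Hypothetical sketch: per-frame spatial-median BVH build over t-fragments.
// All names and layouts are assumptions for illustration only.
#include <algorithm>
#include <cstdint>
#include <vector>

struct AABB {
    float lo[3], hi[3];
    void expand(const AABB& b) {
        for (int i = 0; i < 3; ++i) {
            lo[i] = std::min(lo[i], b.lo[i]);
            hi[i] = std::max(hi[i], b.hi[i]);
        }
    }
    float center(int axis) const { return 0.5f * (lo[axis] + hi[axis]); }
    int longestAxis() const {
        float e[3] = { hi[0] - lo[0], hi[1] - lo[1], hi[2] - lo[2] };
        return (e[0] > e[1]) ? (e[0] > e[2] ? 0 : 2) : (e[1] > e[2] ? 1 : 2);
    }
};

// A t-fragment: a fragment swept over the frame's exposure interval, so its
// bound covers every position it may occupy during the shutter time.
struct TFragment {
    AABB sweptBound;   // spatial bound over the whole exposure
    uint32_t id;       // index into the fragment buffer
};

struct BVHNode {
    AABB bound;
    int left = -1, right = -1;   // child node indices (-1 for leaves)
    int first = 0, count = 0;    // fragment range for leaves
};

// Recursive spatial-median build: split at the midpoint of the node's bound
// along its longest axis. No sorting is needed, which keeps a full rebuild
// every frame cheap, at the cost of less optimal trees than SAH builders.
static int build(std::vector<TFragment>& frags, std::vector<BVHNode>& nodes,
                 int first, int count) {
    BVHNode node;
    node.bound = frags[first].sweptBound;
    for (int i = 1; i < count; ++i)
        node.bound.expand(frags[first + i].sweptBound);

    const int leafSize = 4;  // assumed leaf capacity
    int nodeIndex = static_cast<int>(nodes.size());
    nodes.push_back(node);

    if (count <= leafSize) {
        nodes[nodeIndex].first = first;
        nodes[nodeIndex].count = count;
        return nodeIndex;
    }

    int axis = node.bound.longestAxis();
    float mid = node.bound.center(axis);
    TFragment* begin = frags.data() + first;
    TFragment* split = std::partition(begin, begin + count,
        [&](const TFragment& f) { return f.sweptBound.center(axis) < mid; });
    int leftCount = static_cast<int>(split - begin);
    if (leftCount == 0 || leftCount == count)
        leftCount = count / 2;  // degenerate split: fall back to an even cut

    nodes[nodeIndex].left  = build(frags, nodes, first, leftCount);
    nodes[nodeIndex].right = build(frags, nodes, first + leftCount, count - leftCount);
    return nodeIndex;
}

Because each node's bound is recomputed from the fragments it actually contains, even the degenerate-split fallback yields a correct (if less balanced) hierarchy, which matches the priority of build speed over tree quality when the BVH is discarded and rebuilt every frame.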


    Citations

    Publications citing this paper.

    Multi-Layer Depth of Field Rendering with Tiled Splatting


    Real-time motion blur using extruded triangles
