Corpus ID: 237502952

An Inverse Procedural Modeling Pipeline for SVBRDF Maps

Authors: Yiwei Hu, Chengan He, Valentin Deschaintre, Julie Dorsey, Holly E. Rushmeier
Fig. 1. On the left, a virtual scene textured only with procedural materials generated from pixel-map exemplars with our method. On the right, examples of the editing made possible by our procedural representation on the wall and floor: we change the brick color and make it deeper and smoother, and we modify the pavement's color variation and regularity and introduce broken tiles. These edits are made solely through the parameters made available by our method. All of the input…


AppWand: editing measured materials using appearance-driven optimization
A new approach to editing spatially- and temporally-varying measured materials that adopts a stroke-based workflow independent of the underlying reflectance model; edits to both analytic and non-parametric representations are shown on examples from several material databases.
Generative Modelling of BRDF Textures from Flash Images
Produces an infinite and diverse spatial field of BRDF model parameters that allows rendering in complex scenes and illuminations, matching the appearance of the input picture under matching light.
Learning a Neural 3D Texture Space From 2D Exemplars
Suggests a generative model of 2D and 3D natural textures with diversity, visual fidelity, and high computational efficiency, enabled by a family of methods that extend ideas from classic stochastic procedural texturing to learned, deep non-linearities.
Image quilting for texture synthesis and transfer
This work uses quilting as a fast and very simple texture synthesis algorithm that produces surprisingly good results for a wide range of textures, and extends the algorithm to perform texture transfer: rendering an object with a texture taken from a different object.
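The quilting idea can be sketched in a few lines: patches are placed in raster order, each chosen to minimize SSD error in the region overlapping already-synthesized pixels. This is a simplified sketch, not the paper's implementation; in particular, the minimum-error boundary cut along the overlap seam is omitted, and all names are illustrative.

```python
import numpy as np

def quilt(texture, out_h, out_w, patch=12, overlap=4, n_candidates=50, seed=0):
    """Simplified image quilting: fill the output in raster order with
    patches sampled from `texture`, keeping the candidate whose overlap
    with already-placed pixels has the lowest SSD. The min-cut seam of
    the full algorithm is omitted for brevity."""
    rng = np.random.default_rng(seed)
    th, tw = texture.shape[:2]
    step = patch - overlap
    out = np.zeros((out_h, out_w) + texture.shape[2:], dtype=texture.dtype)
    for y in range(0, out_h - patch + 1, step):
        for x in range(0, out_w - patch + 1, step):
            best, best_err = None, np.inf
            for _ in range(n_candidates):
                py = rng.integers(0, th - patch + 1)
                px = rng.integers(0, tw - patch + 1)
                cand = texture[py:py + patch, px:px + patch]
                err = 0.0
                if y > 0:  # SSD against the patch above
                    d = out[y:y + overlap, x:x + patch].astype(float) \
                        - cand[:overlap].astype(float)
                    err += np.sum(d * d)
                if x > 0:  # SSD against the patch to the left
                    d = out[y:y + patch, x:x + overlap].astype(float) \
                        - cand[:, :overlap].astype(float)
                    err += np.sum(d * d)
                if err < best_err:
                    best, best_err = cand, err
            out[y:y + patch, x:x + patch] = best
    return out
```

Each new patch overwrites the overlap region of its neighbors; the full algorithm instead stitches along a minimum-error seam through that overlap, which hides the transition.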
A novel framework for inverse procedural texture modeling
Presents an example-based framework to automatically select procedural models and estimate parameters for high-level textures, and shows that the inverse modeling system can produce high-quality procedural textures for both structural and non-structural textures.
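As a toy illustration of example-based parameter estimation (the model and function names here are hypothetical, not the paper's pipeline): given a one-parameter procedural model, the parameter can be recovered by rendering candidates and keeping the one that best matches the exemplar.

```python
import numpy as np

def stripes(freq, size=64):
    # Toy procedural model: horizontal sinusoidal stripes (illustrative only).
    x = np.arange(size)
    row = 0.5 + 0.5 * np.sin(2 * np.pi * freq * x / size)
    return row[None, :].repeat(size, 0)

def fit_freq(exemplar, candidates):
    # Inverse modeling by exhaustive search: render each candidate
    # parameter and keep the best match to the exemplar. L2 distance
    # stands in for the richer statistical/perceptual losses used
    # by real inverse procedural modeling systems.
    errs = [np.mean((stripes(f, exemplar.shape[0]) - exemplar) ** 2)
            for f in candidates]
    return candidates[int(np.argmin(errs))]
```

Real systems replace the exhaustive search with gradient-based or sampling-based optimization over many parameters, and the pixel-wise loss with texture statistics.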
Inverse shade trees for non-parametric material representation and editing
Introduces an Inverse Shade Tree framework that provides a general approach to estimating the "leaves" of a user-specified shade tree from high-dimensional measured datasets of appearance, and demonstrates the ability to reduce multi-gigabyte measured datasets of the Spatially-Varying Bidirectional Reflectance Distribution Function (SVBRDF) into a compact representation that may be edited in real time.
Material matting
Presents a solution to the material matting problem, inspired by natural image matting and texture synthesis, which allows separating a measured spatially-varying material into simpler foreground and background component materials and a corresponding opacity map.
Poisson image editing
Using generic interpolation machinery based on solving Poisson equations, a variety of novel tools are introduced for seamless editing of image regions. The first set of tools permits the seamless…
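The core of such gradient-domain editing can be sketched as follows: inside the edited region, solve the discrete Poisson equation whose right-hand side is the source's Laplacian, with Dirichlet boundary values taken from the destination. This sketch uses Jacobi iteration for brevity (real implementations use sparse direct or multigrid solvers) and assumes the mask does not touch the image border, since `np.roll` wraps around.

```python
import numpy as np

def poisson_blend(src, dst, mask, iters=2000):
    """Seamless-cloning sketch: inside `mask`, iterate toward the
    solution of the discrete Poisson equation so the result keeps
    `src`'s gradients while matching `dst` on the region boundary."""
    out = dst.astype(float).copy()
    s = src.astype(float)
    # Discrete Laplacian of the source (guidance field).
    # np.roll wraps at the border, so the mask must stay interior.
    lap = (4 * s
           - np.roll(s, 1, 0) - np.roll(s, -1, 0)
           - np.roll(s, 1, 1) - np.roll(s, -1, 1))
    inside = mask.astype(bool)
    for _ in range(iters):
        # Jacobi update: average of the four neighbors plus the
        # source Laplacian, applied only inside the edited region.
        nb = (np.roll(out, 1, 0) + np.roll(out, -1, 0)
              + np.roll(out, 1, 1) + np.roll(out, -1, 1))
        out[inside] = (nb[inside] + lap[inside]) / 4.0
    return out
```

Because only gradients of the source are used, a constant intensity offset between source and destination disappears, which is what makes the composite seamless.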
AppProp: all-pairs appearance-space edit propagation
Presents an intuitive and efficient method for editing the appearance of complex spatially-varying datasets, such as images and measured materials, that generalizes prior methods while providing significant improvements in generality, robustness, and efficiency.
Single-image SVBRDF capture with a rendering-aware deep network
Tackles lightweight appearance capture by training a deep neural network to automatically extract and make sense of visual cues from a single image, using a network that combines an encoder-decoder convolutional track for local feature extraction with a fully-connected track for global feature extraction and propagation.