Neural BRDF Representation and Importance Sampling

Alejandro Sztrajman, Gilles Rainer, Tobias Ritschel, Tim Weyrich. Computer Graphics Forum.

Controlled capture of real-world material appearance yields tabulated sets of highly realistic reflectance data. In practice, however, their high memory footprint requires compression into a representation that can be used efficiently in rendering while remaining faithful to the original. Previous works in appearance encoding often prioritized one of these requirements at the expense of the other, either by applying high-fidelity array compression strategies not suited for efficient queries…

Metappearance: Meta-Learning for Visual Appearance Reproduction

This work combines both techniques end-to-end using meta-learning: it over-fits to a single problem instance in an inner loop, while learning how to do so efficiently in an outer loop that builds intuition over many optimization runs.
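The inner/outer-loop structure described above can be illustrated with a Reptile-style sketch (a simpler relative of the meta-learning used in the paper, not its actual method; the scalar model, the two toy "instances" and all learning rates below are invented for illustration):

```python
import random

# Toy "problem instance": over-fit a scalar parameter w to a target c,
# with loss L(w) = (w - c)^2 and gradient dL/dw = 2 * (w - c).

def inner_loop(w, c, lr=0.25, steps=5):
    """Over-fit to one instance, starting from the meta-initialization."""
    for _ in range(steps):
        w = w - lr * 2.0 * (w - c)
    return w

# Outer loop (Reptile update): nudge the meta-initialization toward the
# over-fitted solution of each instance, so future inner loops start
# closer to good solutions.
random.seed(0)
w_meta = 5.0
targets = [0.0, 1.0]            # two recurring problem instances
for step in range(500):
    c = random.choice(targets)
    w_task = inner_loop(w_meta, c)
    w_meta += 0.1 * (w_task - w_meta)

print(w_meta)  # settles near the centroid of the task targets
```

The outer loop never sees a gradient of its own objective; it only observes where inner-loop over-fitting ends up, which is what makes the scheme cheap to wrap around an existing per-instance optimizer.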

BSDF Importance Baking: A Lightweight Neural Solution to Importance Sampling Parametric BSDFs

This paper moves parametric BSDF importance sampling entirely to the precomputation stage, avoiding heavy runtime computation and reducing noise in rendered results across a rich set of appearances, including conductors and dielectrics with anisotropic roughness.

NeuLighting: Neural Lighting for Free Viewpoint Outdoor Scene Relighting with Unconstrained Photo Collections

The high-fidelity renderings under novel views and illumination demonstrate the superiority of NeuLighting over state-of-the-art relighting solutions.

Physically Based Rendering of Functionally Defined Objects

This paper offers functionally defined objects for realistic scenes, describing physically based visualization of three-dimensional objects based on perturbation functions, i.e., the rendering of…

Lightweight Neural Basis Functions for All-Frequency Shading

This paper introduces a representation neural network that takes any general 2D spherical function as input and projects it onto a latent space as coefficients of neural basis functions, and designs several lightweight neural networks that perform different types of computation, each with different computational properties.

Learning to Learn and Sample BRDFs

This work proposes a method to accelerate the joint process of physically acquiring and learning neural Bi-directional Reflectance Distribution Function (BRDF) models and shows that meta-learning can be extended to optimize the physical sampling pattern, too.

HyperTime: Implicit Neural Representation for Time Series

This paper analyzes the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed, and proposes a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.

Differentiable Point-Based Radiance Fields for Efficient View Synthesis

This work proposes a differentiable rendering algorithm for efficient novel view synthesis that trains two orders of magnitude faster than STNeRF and renders at a near interactive rate, while maintaining high image quality and temporal coherence even without imposing any temporal-coherency regularizers.

Neural Layered BRDFs

This paper proposes to perform layering in the neural space, by compressing BRDFs into latent codes via a proposed representation neural network, and performing a learned layering operation on these latent vectors via a layering network.

A Sparse Non-parametric BRDF Model

This paper presents a novel sparse non-parametric Bidirectional Reflectance Distribution Function (BRDF) model derived using a machine learning approach to represent the space of possible BRDFs using…

AutoInt: Automatic Integration for Fast Neural Volume Rendering

This work proposes automatic integration, a new framework for learning efficient, closed-form solutions to integrals using coordinate-based neural networks, improving render times by more than 10× at the cost of some reduction in image quality.
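AutoInt's core trick is to fit the *derivative* of a model to the integrand, then evaluate the model itself at the integration bounds. That principle can be sketched without a neural network by using a small polynomial antiderivative, which keeps the fit a closed-form least-squares problem (the basis and the integrand below are invented for illustration, not taken from the paper):

```python
import numpy as np

# Antiderivative model Phi(x) = a*x + b*x^2 + c*x^3, standing in for the
# coordinate-based network. Its derivative is linear in the coefficients,
# so "training the derivative to match the integrand" is a least-squares fit.
xs = np.linspace(0.0, 1.0, 50)
f = 3.0 * xs**2 + 2.0 * xs                        # integrand samples to match
dbasis = np.stack([np.ones_like(xs), 2.0 * xs, 3.0 * xs**2], axis=1)
coeffs, *_ = np.linalg.lstsq(dbasis, f, rcond=None)

def Phi(x):
    return coeffs[0] * x + coeffs[1] * x**2 + coeffs[2] * x**3

# The integral over [a, b] is then two evaluations of the antiderivative
# at the bounds -- no quadrature along the ray at render time.
integral = Phi(1.0) - Phi(0.0)
print(integral)  # ∫ (3x^2 + 2x) dx over [0, 1] = 2
```

Replacing the polynomial with a network whose derivative is obtained by autodiff gives the paper's setting; the speedup comes from trading per-sample quadrature for bound evaluations.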

Invertible Neural BRDF for Object Inverse Rendering

A novel neural network-based BRDF model and a Bayesian framework for object inverse rendering, i.e., joint estimation of reflectance and natural illumination from a single image of an object of known geometry; the estimation can be computed efficiently with stochastic gradient descent.

An Adaptive BRDF Fitting Metric

It is shown that the image‐driven isotropic BRDF fits generalize well to other light conditions, and that depending on the measured material, a different weighting of errors with respect to the measured BRDF is necessary.

DeepBRDF: A Deep Representation for Manipulating Measured BRDF

DeepBRDF is presented, a deep-learning-based representation that significantly reduces the dimensionality of measured BRDFs while achieving high-quality recovery; it clearly outperforms PCA-based strategies in BRDF data compression and is more robust.

Unified Neural Encoding of BTFs

A unified network architecture, inspired by autoencoders, that is trained on a variety of materials and projects reflectance measurements to a shared latent parameter space is proposed; the latent space is shown to be well-behaved and can be sampled from.

Offline Deep Importance Sampling for Monte Carlo Path Tracing

This paper proposes an offline, scene‐independent deep‐learning approach that can importance sample first‐bounce light paths for general scenes without the need of the costly online training, and can start guiding path sampling with as little as 1 sample per pixel.

Deep appearance modeling: A survey

Yue Dong. Vis. Informatics, 2019.

Neural BTF Compression and Interpolation

A neural network‐based BTF representation inspired by autoencoders is proposed: the encoder compresses each texel to a small set of latent coefficients, while the decoder additionally takes in a light and view direction and outputs a single RGB vector at a time, eliminating the need for linear interpolation between discrete samples.
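A shape-level sketch of that encoder/decoder split (all dimensions, the two-layer decoder, and the random untrained weights below are invented for illustration; the paper's actual architecture differs):

```python
import numpy as np

rng = np.random.default_rng(0)
n_meas, n_latent = 64, 8          # per-texel measurements -> latent coefficients

# Encoder: compress one texel's reflectance measurements to a small set of
# latent coefficients (here a single random linear map as a stand-in).
W_enc = rng.normal(size=(n_latent, n_meas)) * 0.1
texel = rng.uniform(size=n_meas)
latent = W_enc @ texel            # this is what gets stored per texel

# Decoder: latent coefficients plus light and view directions -> one RGB
# value, so arbitrary direction pairs can be queried directly instead of
# linearly interpolating between discrete samples.
W1 = rng.normal(size=(32, n_latent + 6)) * 0.1
W2 = rng.normal(size=(3, 32)) * 0.1

def decode(latent, light_dir, view_dir):
    h = np.concatenate([latent, light_dir, view_dir])
    h = np.maximum(W1 @ h, 0.0)   # ReLU hidden layer
    return W2 @ h                 # RGB

rgb = decode(latent, np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, 1.0]))
print(rgb.shape)  # (3,)
```

The storage win comes from keeping only the per-texel latent vectors plus one shared decoder, rather than the full measured direction grid per texel.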

An adaptive parameterization for efficient material acquisition and rendering

This work proposes a new parameterization that automatically adapts to the behavior of a material, warping the underlying 4D domain so that most of the volume maps to regions where the BRDF takes on non-negligible values, while irrelevant regions are strongly compressed.

Learning to Importance Sample in Primary Sample Space

A novel importance sampling technique is proposed that uses a neural network to learn how to sample from a desired density represented by a set of samples; the technique is agnostic of underlying light transport effects and can be combined with an existing rendering technique by treating it as a black box.
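The paper learns the warp from primary sample space with a neural network; the black-box principle itself can be shown with a closed-form warp (the density and integrand below are invented for illustration): uniform primary samples u are warped into samples x distributed as p(x) = 2x on [0, 1], and any integrand is then estimated by weighting with 1/p.

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)    # primary sample space: uniform in [0, 1)

# Warp primary samples to the target density p(x) = 2x via the inverse CDF.
# (In the paper this warp is a learned neural network; sqrt is its
# closed-form stand-in for this particular density.)
x = np.sqrt(u)

# Importance-sampled Monte Carlo estimate of the integral of a black-box
# integrand f over [0, 1]: average f(x) / p(x) over the warped samples.
f = x**2
estimate = np.mean(f / (2.0 * x))
print(estimate)  # close to 1/3, the exact integral of x^2 on [0, 1]
```

Because the renderer only ever sees warped primary samples, the warp can wrap any existing estimator unchanged, which is the black-box property the summary refers to.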