# Refinement Criteria Based on f-Divergences

```bibtex
@inproceedings{Rigau2003RefinementCB,
  title     = {Refinement Criteria Based on f-Divergences},
  author    = {Jaume Rigau and M. Feixas and M. Sbert},
  booktitle = {Rendering Techniques},
  year      = {2003}
}
```

In several domains a refinement criterion is often needed to decide whether to continue or stop sampling a signal. When the sampled values are homogeneous enough, we assume that they represent the signal fairly well and no further refinement is needed; otherwise more samples are required, possibly with adaptive subdivision of the domain. For this purpose, a criterion that is very sensitive to variability is necessary. In this paper we present a family of discrimination measures, the f-divergences.
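The refinement idea in the abstract can be sketched in a few lines of code. The snippet below is an illustrative sketch, not the paper's actual criterion: `f_divergence`, `needs_refinement`, and the uniform reference distribution are assumptions chosen for demonstration. It compares the normalized sample distribution against a perfectly homogeneous (uniform) one using three classic f-divergences (Kullback-Leibler, Hellinger, Chi-Square), and flags the region for further sampling when the divergence exceeds a threshold.

```python
import math

def f_divergence(p, q, f):
    """Generic f-divergence: sum_i q_i * f(p_i / q_i),
    where f is convex with f(1) = 0."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

# Common generator functions f (convex, f(1) = 0).
kl = lambda t: t * math.log(t) if t > 0 else 0.0   # Kullback-Leibler
hellinger = lambda t: (math.sqrt(t) - 1.0) ** 2     # (squared) Hellinger
chi_square = lambda t: (t - 1.0) ** 2               # Chi-Square

def needs_refinement(samples, threshold, f=kl):
    """Hypothetical criterion: refine when the normalized sample
    distribution diverges too much from the uniform (homogeneous) one."""
    total = sum(samples)
    if total == 0:
        return False
    p = [s / total for s in samples]
    q = [1.0 / len(samples)] * len(samples)  # uniform reference
    return f_divergence(p, q, f) > threshold

print(needs_refinement([0.5, 0.5, 0.5, 0.5], 0.05))  # False: homogeneous
print(needs_refinement([0.9, 0.1, 0.8, 0.2], 0.05))  # True: high variability
```

Because every f-divergence is zero exactly when the two distributions coincide and grows with their discrepancy, any of the three generator functions above can serve as the variability-sensitive measure the abstract calls for.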

#### 40 Citations

Refinement criteria for global illumination using convex functions

- Computer Science
- 2003

This paper presents a family of discrimination measures, the f-divergences, which meet the requirement for adaptive refinement in ray-tracing, and obtains significantly better results than with classic criteria, showing that f-divergences are worth further investigation in computer graphics.

A new refinement criterion for adaptive sampling in path tracing

- Mathematics
- 2010 IEEE International Symposium on Industrial Electronics
- 2010

Monte Carlo based methods such as path tracing are the only ones that perform physically correct simulations of global illumination. Due to their high capability to exploit acceleration structures and SIMD…

Viewpoint-based simplification using f-divergences

- Mathematics, Computer Science
- Inf. Sci.
- 2008

The best half-edge collapse is applied as a decimation criterion to determine the error introduced by an edge collapse in a viewpoint-based simplification method for polygonal meshes driven by several f-divergences, such as Kullback-Leibler, Hellinger and Chi-Square.

Adaptive sampling for environment mapping

- Mathematics, Computer Science
- SCCG
- 2010

This paper develops an adaptation scheme that is based on ray differentials and does not require neighbor finding and complex data structures in higher dimensions, and provides error estimates, which are essential in predictive rendering applications.

Adaptive Sampling with Error Control

- Mathematics
- 2010

This paper proposes a multi-dimensional adaptive sampling algorithm for rendering applications. Unlike importance sampling, adaptive sampling does not try to mimic the integrand with analytically…

AM-GM Difference Based Adaptive Sampling for Monte Carlo Global Illumination

- Mathematics, Computer Science
- ICCSA
- 2007

A new homogeneity measure, the arithmetic mean - geometric mean difference (abbreviated to AM-GM difference), is developed to execute adaptive sampling efficiently; it can perform significantly better than classic measures.
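The AM-GM difference mentioned above follows directly from the AM-GM inequality: for positive samples the arithmetic mean is always at least the geometric mean, with equality exactly when all samples are equal. A minimal sketch, assuming positive sample values (the function name and interface are illustrative, not the paper's):

```python
import math

def am_gm_difference(samples):
    """Arithmetic mean minus geometric mean of positive samples.
    Zero iff all samples are equal; grows with variability."""
    n = len(samples)
    am = sum(samples) / n
    # Geometric mean computed via logarithms for numerical stability.
    gm = math.exp(sum(math.log(s) for s in samples) / n)
    return am - gm

print(am_gm_difference([2.0, 2.0, 2.0]))  # ~0: homogeneous, no refinement
print(am_gm_difference([1.0, 4.0]))       # 0.5: variable, refine further
```

This makes it usable as a refinement criterion in the same role as the f-divergences: sample more wherever the difference exceeds a chosen threshold.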

Multidimensional adaptive sampling and reconstruction for ray tracing

- Mathematics
- SIGGRAPH 2008
- 2008

We present a new adaptive sampling strategy for ray tracing. Our technique is specifically designed to handle multidimensional sample domains, and it is well suited for efficiently generating images…

A New Adaptive Sampling Technique for Monte Carlo Global Illumination

- Mathematics, Computer Science
- 2007 10th IEEE International Conference on Computer-Aided Design and Computer Graphics
- 2007

An investigation of the use of entropy from information theory to measure pixel quality and to drive adaptive sampling finds that nonextensive Tsallis entropy driven adaptive sampling significantly outperforms existing methods.

Fuzziness Driven Adaptive Sampling for Monte Carlo Global Illuminated Rendering

- Computer Science
- Computer Graphics International
- 2006

This paper makes use of the fuzzy uncertainty existing in image synthesis and exploits the formal concept of fuzziness in fuzzy set theory to evaluate pixel quality and run adaptive sampling efficiently.

KD-tree based parallel adaptive rendering

- Mathematics, Computer Science
- The Visual Computer
- 2012

A two-level framework for adaptive sampling in parallel is introduced to reduce computation time and control memory cost, and novel kd-tree based strategies are introduced to measure a space's error value and generate anisotropic Poisson disk samples.

#### References

Showing 1-10 of 51 references

Antialiased ray tracing by adaptive progressive refinement

- Computer Science
- SIGGRAPH '89
- 1989

The goals of the system are to produce high quality antialiased images at a modest average sample rate, and to refine the image progressively so that the image is available in a usable form early and is refined gradually toward the final result.

Adaptive Sampling and Bias Estimation in Path Tracing

- Computer Science
- Rendering Techniques
- 1997

A new refinement criterion is introduced which takes human perception and the limitations of display devices into account by incorporating the tone operator; it can lead to a significant reduction in the overall RMS error and, even more importantly, to the elimination of noisy spots.

Wavelet radiosity

- Computer Science
- SIGGRAPH '93
- 1993

This paper shows that the hierarchical radiosity formulation is an instance of a more general set of methods based on wavelet theory that offers a unified view of both higher order element approaches to radiosity and the hierarchical radiosity methods.

A rapid hierarchical radiosity algorithm

- Computer Science
- SIGGRAPH
- 1991

Standard techniques for shooting and gathering can be used with the hierarchical representation to solve for equilibrium radiosities, but the paper also discusses using a brightness-weighted error criterion, in conjunction with multigridding, to progressively refine the image even more rapidly.

Information-Theoretic Oracle Based on Kernel Smoothness for Hierarchical Radiosity

- Computer Science
- Eurographics
- 2002

This paper presents a robust information-theoretic refinement criterion (oracle) based on kernel smoothness for hierarchical radiosity that improves on previous ones in that at equal cost it gives a better discretisation, approaching the optimal one from an information theory point of view.

Generating antialiased images at low sampling densities

- Mathematics, Computer Science
- SIGGRAPH '87
- 1987

This paper describes a program that focuses on constructing an antialiased digital picture from point samples without resorting to extremely high sampling densities, and an algorithm is presented for fast generation of nonuniform sampling patterns that are optimal in some sense.

Statistically optimized sampling for distributed ray tracing

- Mathematics, Computer Science
- SIGGRAPH '85
- 1985

In this work, a relationship between the number of sample rays and the quality of the estimate of this integral is derived and the algorithm has been optimized through the use of statistical testing and stratified sampling.

Antialiasing through stochastic sampling

- Mathematics, Computer Science
- SIGGRAPH '85
- 1985

Stochastic sampling techniques allow the construction of alias-free approximations to continuous functions using discrete calculations and can be applied spatiotemporally as well as to other aspects of scene simulation.

A perceptually based adaptive sampling algorithm

- Computer Science
- SIGGRAPH
- 1998

A perceptually based approach for selecting image samples has been developed and the resulting new image quality model was inserted into an image synthesis program by first modifying the rendering algorithm so that it computed a wavelet representation.

A generalized divergence measure for robust image registration

- Computer Science
- IEEE Trans. Signal Process.
- 2003

This paper defines a new generalized divergence measure, the Jensen-Rényi divergence, and proposes a new approach to the problem of image registration based on it, measuring the statistical dependence between consecutive ISAR image frames.