In this paper, we present a space-efficient algorithm for speeding up isosurface extraction. Even though there exist algorithms that can achieve optimal search performance to identify isosurface cells, they prove impractical for large datasets due to a high storage overhead. With the dual goals of achieving fast isosurface extraction and simultaneously …
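The core query behind any such acceleration is simple to state: a cell contributes to the isosurface exactly when the isovalue lies within the cell's scalar range. The sketch below shows that test with a brute-force scan; the `Cell` record and the scan are illustrative assumptions, not the paper's data structure, which replaces the scan with a search index to avoid touching every cell.

```python
# Hypothetical sketch (not the paper's algorithm): the query every
# isosurface-extraction accelerator answers is "which cells span the isovalue?"
# A cell is active when scalar_min <= isovalue <= scalar_max.

from dataclasses import dataclass

@dataclass
class Cell:
    cell_id: int
    scalar_min: float   # minimum scalar value over the cell's vertices
    scalar_max: float   # maximum scalar value over the cell's vertices

def active_cells(cells, isovalue):
    """Brute-force O(n) scan; search structures answer the same query without
    visiting every cell, at the cost of the storage overhead the abstract
    discusses."""
    return [c.cell_id for c in cells if c.scalar_min <= isovalue <= c.scalar_max]

if __name__ == "__main__":
    cells = [Cell(0, 0.1, 0.4), Cell(1, 0.3, 0.9), Cell(2, 0.6, 0.8)]
    print(active_cells(cells, 0.5))   # -> [1]
```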
In a visualization of a three-dimensional dataset, the insights gained depend on what is occluded and what is not. Suggesting interesting viewpoints can improve both the speed and the efficiency of data understanding. This paper presents a view selection method designed for volume rendering. It can be used to find informative views for a given scene …
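As a hedged illustration of what "informative" can mean, one common ingredient in viewpoint selection (not necessarily the measure used in this paper) is an entropy over how evenly the data contributes to the image from a candidate view; the candidate names and weights below are made up.

```python
# Hedged illustration: score candidate viewpoints by an entropy over their
# per-feature visibility contributions. Higher entropy means contributions are
# spread more evenly, i.e. fewer features are occluded from that view.

import math

def view_entropy(contributions):
    """contributions: visibility weights of the data features for one
    candidate viewpoint."""
    total = sum(contributions)
    if total == 0:
        return 0.0
    probs = [c / total for c in contributions if c > 0]
    return -sum(p * math.log2(p) for p in probs)

# Pick the most "informative" view among hypothetical candidates.
candidates = {
    "front": [0.5, 0.4, 0.05, 0.05],    # a few features dominate the image
    "oblique": [0.3, 0.25, 0.25, 0.2],  # contributions are balanced
}
best = max(candidates, key=lambda v: view_entropy(candidates[v]))
print(best)  # -> "oblique"
```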
[Figure 1: Tornado dataset rendered with different appearance textures. (a) With LIC texture pre-generated from straight flow. (b) With a color tube texture; lighting is used to enhance the depth perception. (c) With a 2D paintbrush texture.]
In this paper we present an interactive texture-based technique for visualizing three-dimensional …
We present an interactive texture-based algorithm for visualizing three-dimensional steady and unsteady vector fields. The goal of the algorithm is to provide a general volume rendering framework allowing the user to compute three-dimensional flow textures interactively and to modify the appearance of the visualization on the fly. To achieve our goal, we …
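One building block that texture-based flow visualization relies on is advecting values along the vector field. The sketch below traces a point through a toy steady 2D field with plain forward Euler; it is an assumption for illustration, not the paper's GPU-based formulation.

```python
# Hedged sketch of one building block behind texture-based flow visualization:
# tracing a short streamline from a sample point. Dense flow textures get
# their streaky look by accumulating texture values along such paths.

def advect(point, vector_field, step=0.1, steps=10):
    """Trace a point through a steady 2D vector field with forward Euler.
    vector_field(x, y) -> (vx, vy)."""
    x, y = point
    for _ in range(steps):
        vx, vy = vector_field(x, y)
        x, y = x + step * vx, y + step * vy
    return (x, y)

# Simple circular flow as a stand-in for real data.
print(advect((1.0, 0.0), lambda x, y: (-y, x)))
```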
This paper presents an interactive global visualization technique for dense vector fields using levels of detail. We introduce a novel scheme that combines an error-controlled hierarchical approach and hardware acceleration to produce high-resolution visualizations at interactive rates. Users can control the trade-off between computation time and image …
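A hedged sketch of the error-controlled trade-off follows: given per-level error estimates, pick the coarsest level of the hierarchy that meets a user-chosen budget. The function name and error model are assumptions; only the coarse-versus-fine trade-off comes from the abstract.

```python
# Hypothetical sketch of an error-controlled level-of-detail choice: coarser
# levels render faster at the cost of approximation accuracy.

def choose_level(level_errors, error_budget):
    """level_errors: approximation error per level, ordered coarse -> fine.
    Return the coarsest level whose error stays within the budget, falling
    back to the finest level if none qualifies."""
    for level, err in enumerate(level_errors):
        if err <= error_budget:
            return level
    return len(level_errors) - 1

# A larger budget selects a coarser level: faster, lower-fidelity images.
errors = [0.30, 0.12, 0.05, 0.01]      # hierarchy from coarse to fine
print(choose_level(errors, 0.10))      # -> 2
print(choose_level(errors, 0.40))      # -> 0
```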
Novel visualization methods are presented for spatial probability density function data. These are spatial datasets in which each pixel is a random variable with multiple samples, the results of experiments on that random variable. We use clustering as a means to reduce the information contained in these datasets, and present two different ways …
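To make the reduction idea concrete, the sketch below summarizes each pixel's samples by their mean and groups similar pixels with a plain 1D k-means. Both the summary statistic and k-means itself are assumptions standing in for whichever clustering the paper employs.

```python
# Minimal sketch of the reduction idea: each pixel carries many samples of its
# random variable; summarizing and clustering pixels means only a few
# representative distributions need to be shown.

import random
import statistics

def kmeans_1d(values, k, iters=50, seed=0):
    """Plain 1D k-means on per-pixel summary values."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            groups[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        centers = [statistics.mean(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

# Each pixel's random variable is summarized by its sample mean here.
pixel_samples = [[0.1, 0.2, 0.15], [0.9, 1.0, 0.95],
                 [0.12, 0.18, 0.2], [1.1, 0.85, 0.9]]
summaries = [statistics.mean(s) for s in pixel_samples]
print(kmeans_1d(summaries, k=2))   # two representative values, one per cluster
```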
Radiation-hardened processors are designed to be resilient against soft errors, but such processors are slower than Commercial Off-The-Shelf (COTS) processors as well as significantly costlier. In order to mitigate the high costs, software techniques such as task re-executions must be deployed together with adequately hardened processors to provide reliability.
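A back-of-the-envelope illustration of why re-executions complement hardening: if each attempt of a task succeeds independently with probability p, allowing r re-executions raises the chance of at least one clean run to 1 - (1 - p)^(r+1). The numbers and the independence assumption below are illustrative only.

```python
# Hedged illustration: re-executions let a less-hardened (cheaper) processor
# approach the reliability of a heavily hardened one executing the task once.

def task_reliability(p_success, re_executions):
    """Probability that at least one of (1 + re_executions) independent
    attempts of the task completes without a soft error."""
    attempts = 1 + re_executions
    return 1.0 - (1.0 - p_success) ** attempts

print(task_reliability(0.999, 0))   # hardened, no re-execution  -> 0.999
print(task_reliability(0.99, 2))    # COTS-like, 2 re-executions -> 0.999999
```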
We present an interactive visualization technique for spatial probability density function data. These are datasets that represent a spatial collection of random variables and contain a number of possible outcomes for each random variable. It is impractical to visualize all the information at each spatial location, as it will quickly lead to a cluttered …