In addition to their role as simulation engines, modern supercomputers can be harnessed for scientific visualization. Their extensive concurrency, parallel storage systems, and high-performance interconnects can mitigate the expanding size and complexity of scientific datasets and prepare for in situ visualization of these data. In ongoing research into …
Effective 3D streamline placement and visualization play an essential role in many science and engineering disciplines. The main challenge for effective streamline visualization lies in seed placement, i.e., where to drop seeds and how many seeds should be placed. Seeding too many or too few streamlines may not reveal flow features and patterns either …
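To make the seeding problem concrete, here is a minimal sketch (not the paper's method) of the naive baseline it improves upon: uniform-grid seeding followed by streamline integration. The analytic vortex field `velocity` and all parameter values are illustrative assumptions.

```python
import numpy as np

def velocity(p):
    # Hypothetical analytic 2D vector field (a simple vortex),
    # standing in for real simulation data.
    x, y = p
    return np.array([-y, x])

def seed_grid(xmin, xmax, ymin, ymax, nx, ny):
    # Uniform seeding: simple, but it can oversample uninteresting
    # regions and miss small-scale features -- the core seeding dilemma.
    xs = np.linspace(xmin, xmax, nx)
    ys = np.linspace(ymin, ymax, ny)
    return [np.array([x, y]) for x in xs for y in ys]

def trace_streamline(seed, h=0.05, steps=200):
    # Fourth-order Runge-Kutta integration of dp/dt = v(p).
    p = seed.astype(float)
    pts = [p.copy()]
    for _ in range(steps):
        k1 = velocity(p)
        k2 = velocity(p + 0.5 * h * k1)
        k3 = velocity(p + 0.5 * h * k2)
        k4 = velocity(p + h * k3)
        p = p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        pts.append(p.copy())
    return np.array(pts)

seeds = seed_grid(-1, 1, -1, 1, 4, 4)
lines = [trace_streamline(s) for s in seeds]
```

Smarter placement strategies score candidate seeds by the flow features their streamlines would reveal, rather than laying them out on a fixed grid.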
The ability to identify and present the most essential aspects of time-varying data is critically important in many areas of science and engineering. This paper introduces an importance-driven approach to time-varying volume data visualization for enhancing that ability. By conducting a block-wise analysis of the data in the joint feature-temporal space, we …
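A block-wise importance analysis can be sketched in miniature as follows; this is a simplified illustration, not the paper's algorithm, using per-block histogram entropy as one possible importance proxy (the block size, bin count, and random test volume are assumptions).

```python
import numpy as np

def block_entropy(volume, block=8, bins=32):
    # Partition the volume into bricks and score each by the Shannon
    # entropy of its value histogram -- a simple importance proxy.
    # The paper's analysis works in a joint feature-temporal space;
    # this sketch scores a single time step only.
    nz, ny, nx = volume.shape
    lo, hi = float(volume.min()), float(volume.max())
    scores = {}
    for z in range(0, nz, block):
        for y in range(0, ny, block):
            for x in range(0, nx, block):
                b = volume[z:z+block, y:y+block, x:x+block]
                hist, _ = np.histogram(b, bins=bins, range=(lo, hi))
                p = hist / hist.sum()
                p = p[p > 0]
                scores[(z, y, x)] = float(-(p * np.log2(p)).sum())
    return scores

rng = np.random.default_rng(0)
vol = rng.random((16, 16, 16))   # stand-in for one simulation time step
scores = block_entropy(vol)
```

High-entropy blocks (complex value distributions) would be rendered at higher fidelity or prioritized in the temporal summary; low-entropy blocks can be downweighted.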
The acquisition of expression of hTERT, the catalytic subunit of the telomerase enzyme, seems to be an essential step in the development of a majority of human tumors. However, little is known about the mechanisms preventing telomerase gene expression in normal and transformed cells that do not express hTERT. Using a methylation-specific PCR-based assay, we …
Leveraging the power of high-performance supercomputers and advanced numerical algorithms, scientists can perform 3D direct numerical simulations of many complex phenomena in unprecedented detail, leading to new scientific discoveries. Nowadays, a typical scientific simulation might produce data containing several hundred million voxels, hundreds of time …
As scientific supercomputing moves toward petascale and exascale levels, in situ visualization stands out as a scalable way for scientists to view the data their simulations generate. This full picture is particularly crucial for capturing and understanding highly intermittent transient phenomena, such as ignition and extinction events in turbulent …
The ever-increasing amounts of simulation data produced by scientists demand high-end parallel visualization capability. However, image compositing, which requires inter-processor communication, is often the bottleneck stage for parallel rendering of large volume data sets. Existing image compositing solutions either incur a large number of messages …
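The compositing step being discussed can be sketched with the Porter-Duff "over" operator; this is a minimal sequential illustration, not the paper's parallel scheme, and it assumes premultiplied-alpha RGBA images.

```python
import numpy as np

def over(front, back):
    # Porter-Duff "over": composite a front RGBA image onto a back one.
    # Alpha is assumed premultiplied into the color channels.
    a = front[..., 3:4]
    return front + (1.0 - a) * back

def composite_depth_order(images):
    # Sequential front-to-back compositing of per-processor partial
    # images. Parallel schemes such as direct send or binary swap
    # compute the same result while distributing the work -- at the
    # cost of the inter-processor messages the abstract refers to.
    result = images[0]
    for img in images[1:]:
        result = over(result, img)
    return result
```

Because "over" is associative, the reduction can be reorganized into a tree or swap pattern across processors without changing the final image, which is what makes parallel compositing algorithms possible.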
With the onset of extreme-scale computing, I/O constraints make it increasingly difficult for scientists to save a sufficient amount of raw simulation data to persistent storage. One potential solution is to change the data analysis pipeline from a post-process centric to a concurrent approach based on either in-situ or in-transit processing. In this …
We present the design of a scalable parallel pathline construction method for visualizing large time-varying 3D vector fields. A 4D (i.e., time and the 3D spatial domain) representation of the vector field is introduced to make a time-accurate depiction of the flow field. This representation also allows us to obtain pathlines through streamline tracing in …
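The 4D idea above can be sketched simply: append dt/dt = 1 to the spatial velocity so that a streamline of the 4D field is a pathline of the unsteady 3D flow. The rotating analytic field `v4` below is a hypothetical stand-in for simulation data, not the paper's test case.

```python
import numpy as np

def v4(p):
    # 4D "velocity": the time-dependent 3D field evaluated at
    # (x, y, z, t), with dt/dt = 1 appended as the 4th component.
    x, y, z, t = p
    return np.array([-y * np.cos(t), x * np.cos(t), 0.1, 1.0])

def pathline(seed, t0=0.0, h=0.01, steps=100):
    # RK4 streamline integration in 4D; the 4th component advances
    # time itself, which is what makes the result time-accurate.
    p = np.array([*seed, t0], dtype=float)
    pts = [p.copy()]
    for _ in range(steps):
        k1 = v4(p)
        k2 = v4(p + 0.5 * h * k1)
        k3 = v4(p + 0.5 * h * k2)
        k4 = v4(p + h * k3)
        p = p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        pts.append(p.copy())
    return np.array(pts)

line = pathline((1.0, 0.0, 0.0))
```

Treating time as just another axis also lets an existing parallel streamline tracer be reused for pathlines, with the 4D domain partitioned across processors.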
This paper presents a parallel visualization pipeline implemented at the Pittsburgh Supercomputing Center (PSC) for studying the largest earthquake simulation ever performed. The simulation employs 100 million hexahedral cells to model 3D seismic wave propagation of the 1994 Northridge earthquake. The time-varying dataset produced by the simulation requires …