David H. Rogers

Huge salt formations, trapping large untapped oil and gas reservoirs, lie in the deepwater region of the Gulf of Mexico. Drilling in this region is high-risk, and drilling failures have led to well abandonments, each costing tens of millions of dollars. Salt tectonics plays a central role in these failures. To explore the geomechanical interactions …
Extreme-scale scientific simulations are leading the charge to exascale computation, and data analytics runs the risk of becoming a bottleneck to scientific discovery. Due to power and I/O constraints, we expect that in situ visualization and analysis will be a critical component of these workflows. Options for extreme-scale data analysis are often presented as a …
Scientific workflow systems support different workflow representations, operational modes, and configurations. However, independent of the system used, end users need to track the status of their workflows in real time, be notified of execution anomalies and failures automatically, perform troubleshooting, and automate the analysis of the workflow to help …
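
As a rough illustration of the kind of monitoring such systems need, the sketch below polls a stand-in workflow engine for task states and raises an alert when a task reports failure. The poll_status and monitor functions and the state names are hypothetical, not tied to any particular workflow engine or to this paper.

    import time

    # Hypothetical task states reported by a workflow engine; names are illustrative.
    def poll_status(engine):
        """Ask the (stand-in) engine for a mapping of task name -> state."""
        return engine()  # e.g. {"mesh": "done", "solve": "failed", "viz": "queued"}

    def monitor(engine, interval=5.0, rounds=3):
        """Poll the workflow and report anomalies (failed tasks)."""
        for _ in range(rounds):
            states = poll_status(engine)
            failures = [t for t, s in states.items() if s == "failed"]
            if failures:
                print("ALERT: failed tasks:", ", ".join(failures))
            else:
                print("all tasks healthy:", states)
            time.sleep(interval)

    # Toy "engine" that always reports one failed task.
    monitor(lambda: {"mesh": "done", "solve": "failed", "viz": "queued"},
            interval=0.0, rounds=1)
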
Exascale supercomputing will embody many revolutionary changes in the hardware and software of high-performance computing. For example, projected limitations in power and I/O-system performance will fundamentally change visualization and analysis workflows. A traditional post-processing workflow involves storing simulation results to disk and later …
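
A minimal sketch of the contrast, assuming a toy one-dimensional simulation: the post hoc path writes every time step to disk for later analysis, while the in situ path reduces each step to a small summary while the data is still in memory. The function names (run_timestep, post_hoc_workflow, in_situ_workflow) are illustrative, not from the paper.

    import numpy as np

    def run_timestep(step, n=64):
        """Stand-in for one step of a simulation; returns a small scalar field."""
        x = np.linspace(0.0, 1.0, n)
        return np.sin(2 * np.pi * (x + 0.01 * step))

    # Traditional post-processing: write every step to disk, analyze later.
    def post_hoc_workflow(steps):
        for step in range(steps):
            field = run_timestep(step)
            np.save(f"field_{step:04d}.npy", field)  # I/O cost grows with data size
        # ... later, a separate tool loads the files and visualizes them.

    # In situ: analyze while the data is still in memory, store only small results.
    def in_situ_workflow(steps):
        summaries = []
        for step in range(steps):
            field = run_timestep(step)
            summaries.append((step, float(field.min()), float(field.max())))
        return summaries  # a few numbers per step instead of the full field

    if __name__ == "__main__":
        print(in_situ_workflow(5))
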
We present an innovative application developed at Sandia National Laboratories for visual debugging of unstructured finite element physics codes. Our tool automatically locates anomalous regions, such as inverted elements or nodes whose variable values lie outside a prescribed range, then extracts mesh subsets around these features for detailed examination.
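
A minimal sketch of the general idea, not the Sandia tool's actual API: flag nodes whose values fall outside a prescribed range, then keep every element that touches a flagged node as the subset to examine. The function and variable names are illustrative.

    import numpy as np

    def extract_anomalous_subset(node_values, elements, vmin, vmax):
        """Flag nodes outside [vmin, vmax] and return the elements touching them.

        node_values: (n_nodes,) array of a nodal field
        elements:    (n_elems, nodes_per_elem) connectivity array
        """
        bad_nodes = np.flatnonzero((node_values < vmin) | (node_values > vmax))
        # Keep any element that references at least one flagged node.
        mask = np.isin(elements, bad_nodes).any(axis=1)
        return bad_nodes, elements[mask]

    # Toy mesh: 4 nodes, 2 triangles; one node value is out of range.
    values = np.array([0.2, 0.5, 9.0, 0.4])
    tris = np.array([[0, 1, 2], [1, 2, 3]])
    nodes, subset = extract_anomalous_subset(values, tris, vmin=0.0, vmax=1.0)
    print(nodes)   # [2]
    print(subset)  # both triangles touch node 2
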
An important challenge encountered during post-processing of finite element analyses is visualizing three-dimensional fields of real-valued second-order tensors. In particular, as finite element meshes become more complex and detailed, evaluation and presentation of the principal stresses become correspondingly problematic. In this paper, we describe …
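
For a symmetric second-order stress tensor, the principal stresses are its eigenvalues and the principal directions its eigenvectors; the sketch below computes them with NumPy's symmetric eigensolver. This is background for the visualization problem, not the technique described in the paper.

    import numpy as np

    def principal_stresses(stress):
        """Principal stresses and directions of a symmetric 3x3 stress tensor.

        Returns eigenvalues sorted from most tensile to most compressive,
        with the matching eigenvectors as columns.
        """
        vals, vecs = np.linalg.eigh(stress)   # eigh: symmetric input, real output
        order = np.argsort(vals)[::-1]        # sigma_1 >= sigma_2 >= sigma_3
        return vals[order], vecs[:, order]

    # Example: uniaxial tension of 100 (in the units of the input tensor) plus shear.
    sigma = np.array([[100.0, 20.0, 0.0],
                      [ 20.0,  0.0, 0.0],
                      [  0.0,  0.0, 0.0]])
    values, directions = principal_stresses(sigma)
    print(values)  # approximately [103.85, 0.0, -3.85]
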
Post-processing visualization pipelines are traditionally used to gain insight from simulation data. However, changes to the system architecture for high-performance computing (HPC), dictated by the exascale goal, have limited the applicability of post-processing visualization. As an alternative, in-situ pipelines are proposed in order to enhance the …
The workflow paradigm can provide the means to describe the complete functional pipeline for a scientific experiment and therefore expose the underlying scientific processes, enabling the reproducibility of results. However, current means for exposing such information are tied closely to the individual workflow engines, and there is no existing method …
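
One way to expose such information independently of the engine, sketched under the assumption of a simple step/inputs/outputs/parameters schema (hypothetical, not the paper's format), is to serialize the pipeline description to JSON so its provenance can be inspected without the original workflow system.

    import json

    # Hypothetical engine-neutral record of a small pipeline: each step lists its
    # command, inputs, outputs, and parameters so results can be traced and rerun.
    workflow = {
        "name": "tensile-test-analysis",
        "steps": [
            {"id": "mesh",  "command": "generate_mesh", "inputs": ["geometry.step"],
             "outputs": ["part.exo"], "params": {"element_size": 0.5}},
            {"id": "solve", "command": "run_fem",       "inputs": ["part.exo"],
             "outputs": ["results.exo"], "params": {"load": 100.0}},
            {"id": "viz",   "command": "render",        "inputs": ["results.exo"],
             "outputs": ["stress.png"], "params": {}},
        ],
    }

    # Persist the description; any tool can now inspect the pipeline's provenance.
    with open("workflow.json", "w") as fh:
        json.dump(workflow, fh, indent=2)

    print([step["id"] for step in workflow["steps"]])  # ['mesh', 'solve', 'viz']
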