  • Thorsten W. Becker, Jules T. Browaeys, Thomas H. Jordan, Evans, Kendall, J.-M. Willemann, +3 others
  • 2007
The coherence of azimuthal seismic anisotropy, as inferred from shear-wave splitting measurements, decreases with the relative distance between stations. Stochastic models of a two-dimensional vector field defined by a von Kármán [T. von Kármán, Progress in the statistical theory of turbulence, J. Mar. Res., 7 (1948) 252–264.] autocorrelation function …
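For intuition, the sketch below evaluates the standard von Kármán autocorrelation C(r) = σ² · 2^(1−κ)/Γ(κ) · (r/a)^κ · K_κ(r/a), whose decay with lag r models the loss of coherence between stations; the correlation length a, Hurst exponent κ, and function names are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.special import gamma, kv  # kv: modified Bessel function of the second kind

def von_karman(r, a=100.0, kappa=0.3, sigma2=1.0):
    """Von Karman autocorrelation at lag r (km); a = correlation length,
    kappa = Hurst exponent controlling small-scale roughness (assumed values)."""
    r = np.asarray(r, dtype=float)
    x = np.where(r > 0, r / a, 1e-12)            # avoid the r = 0 singularity in kv
    c = sigma2 * 2.0 ** (1 - kappa) / gamma(kappa) * x ** kappa * kv(kappa, x)
    return np.where(r > 0, c, sigma2)            # C(0) = sigma^2 by definition

lags = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
print(von_karman(lags))  # coherence decays toward zero with station separation
```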
This paper discusses the process of building an environment in which large-scale, complex scientific analyses can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves …
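As a toy illustration of placing jobs on a heterogeneous resource pool (the paper's actual planner is far richer; every name and capacity below is invented):

```python
# Greedy matchmaking: each job is placed on the first resource that
# satisfies its core count and accelerator requirements.
resources = {"hpc_cluster":     {"cores": 4096, "gpu": False},
             "gpu_node":        {"cores": 64,   "gpu": True},
             "throughput_pool": {"cores": 512,  "gpu": False}}

jobs = [{"name": "wave_propagation", "cores": 2048, "gpu": False},
        {"name": "site_response",    "cores": 8,    "gpu": True}]

def place(job):
    for name, r in resources.items():
        if r["cores"] >= job["cores"] and (r["gpu"] or not job["gpu"]):
            return name
    return None  # no feasible resource

for j in jobs:
    print(j["name"], "->", place(j))
```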
We describe the services, architecture and application of the GriPhyN Virtual Data System, a suite of components and services that allow users to describe virtual data products in declarative terms, discover definitions and assemble workflows based on those definitions, and execute the resulting workflows on Grid resources. We show how these …
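A minimal sketch of the declarative pattern described here, using an invented toy catalog rather than the VDS's actual language or API: derivations of virtual products are registered as data, and a workflow is assembled by resolving dependencies into a valid execution order.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each virtual product maps to (transformation, list of input products).
derivations = {
    "seismograms.dat": ("run_simulation", ["velocity_model.dat", "source.txt"]),
    "hazard_curve.png": ("plot_hazard", ["seismograms.dat"]),
}

def assemble(target):
    """Walk the derivation catalog backwards from a requested product and
    return the transformations in a dependency-respecting order."""
    graph, plan = {}, {}
    def visit(product):
        if product in derivations and product not in plan:
            xform, inputs = derivations[product]
            graph[product] = set(inputs)
            plan[product] = xform
            for p in inputs:
                visit(p)
    visit(target)
    order = [p for p in TopologicalSorter(graph).static_order() if p in plan]
    return [(plan[p], p) for p in order]

print(assemble("hazard_curve.png"))
# [('run_simulation', 'seismograms.dat'), ('plot_hazard', 'hazard_curve.png')]
```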
Petascale simulations are needed to understand the rupture and wave dynamics of the largest earthquakes at the shaking frequencies required to engineer safe structures (> 1 Hz). Toward this goal, we have developed a highly scalable, parallel application (AWP-ODC) that has achieved “M8”: a full dynamical simulation of a magnitude-8 earthquake on the southern San Andreas fault …
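For intuition, here is a minimal 1-D staggered-grid velocity-stress update of the kind such finite-difference codes scale up to 3-D; the material constants, grid, and source are illustrative, and this is not AWP-ODC's actual scheme.

```python
import numpy as np

nx, nt = 400, 1000
dx, dt = 50.0, 0.004          # grid spacing (m), time step (s); satisfies CFL here
rho, mu = 2700.0, 3.2e10      # density (kg/m^3), shear modulus (Pa) - assumed values
v = np.zeros(nx)              # particle velocity at integer grid points
s = np.zeros(nx - 1)          # stress at half grid points (staggered)

for it in range(nt):
    # leapfrog: stress from the velocity gradient, then velocity from the
    # updated stress gradient
    s += dt * mu * np.diff(v) / dx
    v[1:-1] += dt / rho * np.diff(s) / dx
    v[nx // 2] += np.exp(-((it * dt - 0.5) / 0.05) ** 2)  # Gaussian source pulse

print("peak velocity:", np.abs(v).max())
```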
The GriPhyN Virtual Data System provides a suite of components and services for data-intensive sciences that enables scientists to systematically and efficiently describe, discover, and share large-scale data and computation resources. We describe the design and implementation of these middleware services in terms of a virtual data system interface …
Researchers at the Southern California Earthquake Center (SCEC) use large-scale grid-based scientific workflows to perform seismic hazard research as part of SCEC's program of earthquake system science research. The scientific goal of the SCEC CyberShake project is to calculate probabilistic seismic hazard curves for sites in Southern California. For each site …
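A schematic of how a probabilistic hazard curve combines per-rupture rates with ground-motion exceedance probabilities; the rupture list, rates, and lognormal ground-motion model below are invented for illustration and are not CyberShake's actual pipeline.

```python
import numpy as np
from scipy.special import ndtr  # standard normal CDF

# (annual rate, median spectral acceleration in g, lognormal sigma) per rupture
ruptures = [(0.01, 0.35, 0.6), (0.002, 0.80, 0.6), (0.0005, 1.50, 0.6)]
levels = np.logspace(-2, 0.5, 50)   # ground-motion levels of interest (g)

def p_exceed(x, median, sigma):
    """P(IM > x) under a lognormal ground-motion distribution."""
    z = (np.log(x) - np.log(median)) / sigma
    return 1.0 - ndtr(z)

# total annual exceedance rate, summed over ruptures (Poissonian sources)
rate = sum(r * p_exceed(levels, m, s) for r, m, s in ruptures)
prob_50yr = 1 - np.exp(-rate * 50)  # probability of exceedance in 50 years

print(prob_50yr[::10])  # one point per decade of the hazard curve
```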
The Southern California Earthquake Center initiated a major large-scale earthquake simulation effort called TeraShake. The simulations propagated seismic waves across a domain of 600 × 300 × 80 km at 200 m resolution, making them some of the largest and most detailed simulations of earthquakes on the southern San Andreas fault. The output from a single simulation may be as large as …
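The quoted domain and resolution imply the grid size computed below; this is a back-of-the-envelope check, and the byte estimate assumes 4-byte floats, which the abstract does not state.

```python
# 600 x 300 x 80 km domain at 200 m spacing
nx, ny, nz = 600_000 // 200, 300_000 // 200, 80_000 // 200
points = nx * ny * nz
print(nx, ny, nz, f"{points:.2e} grid points")          # 3000 1500 400 ~1.80e+09
# one float32 volume (e.g., one velocity component at one time step):
print(f"{points * 4 / 1e9:.1f} GB per field per step")  # ~7.2 GB
```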
Scientific workflows are a common computational model for performing scientific simulations. They may include many jobs, many scientific codes, and many file dependencies. Since scientific workflow applications may include both high-performance computing (HPC) and high-throughput computing (HTC) jobs, meaningful performance metrics are difficult to define, …
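One plausible pair of metrics for such mixed workloads, sketched over invented job records: makespan from the overall span of job intervals, and aggregate core-hours, which weight one large parallel job against many serial ones.

```python
from dataclasses import dataclass

@dataclass
class Job:
    start: float   # hours since workflow start
    end: float
    cores: int

jobs = [Job(0.0, 2.0, 4096),   # one large HPC wave-propagation job
        # one thousand short single-core HTC post-processing jobs
        *(Job(2.0 + 0.01 * i, 2.5 + 0.01 * i, 1) for i in range(1000))]

makespan = max(j.end for j in jobs) - min(j.start for j in jobs)
core_hours = sum((j.end - j.start) * j.cores for j in jobs)
print(f"makespan: {makespan:.2f} h, core-hours: {core_hours:.0f}, "
      f"jobs/h: {len(jobs) / makespan:.0f}")
```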
We have developed a highly scalable and efficient GPU-based finite-difference code (AWP) for earthquake simulation that implements high throughput, memory locality, communication reduction, and communication/computation overlap, and achieves linear scalability on the Cray XK7 Titan at ORNL and NCSA's Blue Waters system. We simulate realistic 0–10 Hz earthquake …
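A hedged sketch of the communication/computation overlap idiom the abstract credits for linear scaling, written with mpi4py rather than the paper's CUDA/MPI code; the stencil, sizes, and periodic layout are invented: post the halo exchanges, update the interior while messages are in flight, then finish the edge cells once the halos arrive.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
left, right = (rank - 1) % size, (rank + 1) % size   # periodic 1-D decomposition

n = 1024
u = np.random.rand(n + 2)          # local slab: cells 1..n, halos at 0 and n+1
unew = u.copy()

send = [u[1:2].copy(), u[n:n + 1].copy()]            # edge cells neighbors need
recv = [np.empty(1), np.empty(1)]
reqs = [comm.Isend(send[0], dest=left, tag=0),
        comm.Isend(send[1], dest=right, tag=1),
        comm.Irecv(recv[0], source=left, tag=1),     # left neighbor's right edge
        comm.Irecv(recv[1], source=right, tag=0)]    # right neighbor's left edge

# interior stencil needs no halo data, so it overlaps the exchange above
unew[2:n] = u[2:n] + 0.1 * (u[1:n-1] - 2 * u[2:n] + u[3:n+1])

MPI.Request.Waitall(reqs)                            # halos have arrived
u[0], u[n + 1] = recv[0][0], recv[1][0]
unew[1] = u[1] + 0.1 * (u[0] - 2 * u[1] + u[2])      # finish the edges
unew[n] = u[n] + 0.1 * (u[n - 1] - 2 * u[n] + u[n + 1])
u = unew
print(rank, "step done; interior updated while halos were in flight")
```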