The goal of the Network Weather Service is to provide accurate forecasts of dynamically changing performance characteristics from a distributed set of metacomputing resources. Providing a ubiquitous service that can both track dynamic performance changes and remain stable in spite of them requires adaptive programming techniques and an architectural design…
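The NWS selects among simple predictors based on which has performed best on the measurement history so far. A minimal sketch of that selection idea, assuming just two toy predictors (last value and running mean) and absolute error as the accuracy criterion, neither of which is the NWS's actual forecaster set:

```python
def adaptive_forecast(history):
    """Forecast the next measurement by choosing between two simple
    predictors -- last observed value and running mean -- according to
    which accumulated the lower absolute error over the history.
    A toy version of the NWS forecaster-selection idea."""
    if len(history) < 2:
        return history[-1]
    err_last = err_mean = 0.0
    for i in range(1, len(history)):
        past = history[:i]
        # Error each predictor would have made at step i.
        err_last += abs(history[i] - past[-1])
        err_mean += abs(history[i] - sum(past) / len(past))
    if err_last <= err_mean:
        return history[-1]
    return sum(history) / len(history)

# Forecast CPU availability from a short measurement trace.
print(adaptive_forecast([0.9, 0.85, 0.88, 0.9, 0.87]))
```

The real system maintains a larger ensemble of forecasters and re-evaluates them continuously as new measurements arrive, which is what lets it adapt when resource behavior shifts.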
Ensembles of distributed, heterogeneous resources, also known as Computational Grids, have emerged as critical platforms for high-performance and resource-intensive applications. Such platforms provide the potential for applications to aggregate enormous bandwidth, computational power, memory, secondary storage, and other resources during a single…
The ongoing global effort of genome sequencing is making large scale comparative proteomic analysis an intriguing task. The Encyclopedia of Life (EOL; http://eol.sdsc.edu) project aims to provide current functional and structural annotations for all available proteomes, a computational challenge never seen before in biology. Using an integrative genome…
The primary goal in the creation of Grids is to provide unified and coherent access to distributed computing, data storage and analysis, instruments, and other resources to advance scientific exploration. Grids combine multiple complex and interdependent systems that span several administrative domains. This complexity poses challenges for both the…
Porting large applications to distributed computing platforms is a challenging task from a software engineering perspective. The Computational Grid has gained tremendous popularity as it aggregates unprecedented amounts of compute and storage resources by means of increasingly high performance network technology. The primary aim of this paper is to…
Runtime irreproducibility complicates application performance evaluation on today's high performance computers. Performance can vary significantly between seemingly identical runs; this presents a challenge to benchmarking, as well as to a user who is trying to determine whether the change they made to their code is an actual improvement. In order to…
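The practical consequence is that a single timing is not evidence of improvement; one needs repeated runs and a dispersion statistic. A minimal sketch of that practice, assuming the coefficient of variation (CV) as the dispersion measure (the specific statistic is an illustrative choice, not necessarily the paper's methodology):

```python
import statistics
import time

def time_repeated(fn, runs=5):
    """Time fn over several runs; return the mean wall-clock time and
    the coefficient of variation (stdev / mean). A large CV means a
    single-run before/after comparison is unreliable."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    mean = statistics.mean(samples)
    cv = statistics.stdev(samples) / mean
    return mean, cv

mean, cv = time_repeated(lambda: sum(range(100_000)))
print(f"mean={mean:.6f}s  cv={cv:.2%}")
```

If the measured speedup from a code change is smaller than the run-to-run CV, the change cannot be distinguished from noise without more samples.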
The goal of the Encyclopedia of Life (EOL) Project is to predict structural information for all proteins, in all organisms. This calculation presents challenges both in terms of the scale of the computational resources required (approximately 1.8 million CPU hours), as well as in data and workflow management. While tools are available that solve some…
In this paper we focus on the problem of making short and medium term forecasts of CPU availability on time‐shared Unix systems. We evaluate the accuracy with which availability can be measured using Unix load average, the Unix utility vmstat, and the Network Weather Service CPU sensor that uses both. We also examine the autocorrelation between successive…
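The load-average approach can be illustrated with a short sketch. This is a rough heuristic, not the actual NWS CPU sensor: it assumes that on a time-shared system with `load` runnable processes, a newly arriving process would receive roughly `ncpus / (load + 1)` of one CPU.

```python
import os

def cpu_availability_from_load(ncpus: int) -> float:
    """Estimate the fraction of one CPU available to a new process
    from the 1-minute Unix load average (Unix-only; a crude proxy,
    not the Network Weather Service sensor itself)."""
    load1, _, _ = os.getloadavg()  # 1-, 5-, 15-minute averages
    # A new process would share the CPUs with `load1` runnable
    # processes already present, hence the (load1 + 1) divisor.
    return min(1.0, ncpus / (load1 + 1.0))

print(round(cpu_availability_from_load(os.cpu_count() or 1), 3))
```

The paper's point is that such load-based estimates, vmstat-based estimates, and a sensor combining both can differ measurably in accuracy, which is why the comparison matters.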
Interactive program analysis tools are often tailored to one particular representation of programs, making adaptation to a new language costly. One way to ease adaptability is to introduce an intermediate abstraction—an adaptation layer—between an existing language representation and the program analysis tool. This adaptation layer translates…
Shopping bots are automated software applications that allow consumers to easily search for and compare product prices from online retailers. In a previous project, researchers investigated the functionality and performance of e-commerce shopping bots. The purpose of this project is to test the temporal stability of those findings two years later. Both…