Modern and emerging data centers are presenting unprecedented demands in terms of cost and energy consumption, far outpacing architectural advances related to economies of scale. Consequently, blade designs exhibit significant cost and power inefficiencies, particularly in the memory system. For example, we observe that modern blades are often…
IMPORTANCE: Transformation of US health care from volume to value requires meaningful quantification of costs and outcomes at the level of individual patients. OBJECTIVE: To measure the association of a value-driven outcomes tool that allocates costs of care and quality measures to individual patient encounters with cost reduction and health outcome…
The Open Archives Initiative [www.openarchives.org] has developed a metadata harvesting protocol to further its aim of efficient dissemination of content through interoperability standards. In early 2001, at meetings in the U.S. and Europe, the version of the protocol to be used for beta testing was announced. The HTTP-based protocol uses URLs for queries…
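To make the query mechanism concrete, here is a minimal sketch of how an OAI-PMH harvesting request can be expressed as a plain HTTP GET URL. The repository endpoint below is hypothetical; the verb and parameter names follow the OAI-PMH specification.

```python
# Minimal OAI-PMH request sketch: a query is just a base URL plus a
# "verb" parameter and verb-specific arguments, all in a GET URL.
from urllib.parse import urlencode
from urllib.request import urlopen

BASE_URL = "https://repository.example.org/oai"  # hypothetical endpoint


def oai_request(verb, **params):
    """Build an OAI-PMH query URL, issue it, and return the XML response."""
    query = urlencode({"verb": verb, **params})
    with urlopen(f"{BASE_URL}?{query}") as response:
        return response.read().decode("utf-8")


# Harvest records in the widely supported Dublin Core format.
xml = oai_request("ListRecords", metadataPrefix="oai_dc")
print(xml[:200])
```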
A novel ring-resonator-based integrated photonic chip with ultrafine frequency resolution, providing programmable, stable, and accurate optical-phase control is demonstrated. The ability to manipulate the optical phase of the individual frequency components of a signal is a powerful tool for optical communications, signal processing, and RF photonics…
The Networked Digital Library of Theses and Dissertations (NDLTD) is a collaborative effort of universities around the world to promote creating, archiving, distributing, and accessing Electronic Theses and Dissertations (ETDs). Since its inception in 1996, over a hundred universities have joined the initiative, underscoring the importance institutions place…
SRW/U (the Search/Retrieve Web Service) and OAI (the Open Archives Initiative) are both modern information retrieval protocols developed by distinct groups from different backgrounds at around the same time. This article sets out to briefly contrast the two protocols' aims and approaches, and then to look at some novel ways in which they have been or may be…
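As a rough illustration of the contrast in approach, the sketch below builds one query URL in each style: SRU (the URL binding of SRW) expresses an ad hoc search as a CQL query, while OAI-PMH expresses a bulk harvest as a verb plus parameters. Both endpoints are hypothetical; the parameter names follow the respective specifications.

```python
# Side-by-side sketch of SRU searching vs. OAI-PMH harvesting.
from urllib.parse import urlencode

# SRU: ad hoc searching with a CQL query in a GET request.
sru_url = "https://catalog.example.org/sru?" + urlencode({
    "operation": "searchRetrieve",
    "version": "1.1",
    "query": "dc.title = dinosaurs",  # CQL: index, relation, term
    "maximumRecords": "10",
})

# OAI-PMH: bulk harvesting of metadata records, no search semantics.
oai_url = "https://repository.example.org/oai?" + urlencode({
    "verb": "ListRecords",
    "metadataPrefix": "oai_dc",
})

print(sru_url)
print(oai_url)
```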
The Scalable HeterOgeneous Computing (SHOC) benchmark suite was released in 2010 as a tool to evaluate the stability and performance of emerging heterogeneous architectures and to compare different programming models for compute devices used in those architectures. Since then, high-performance computing (HPC) system architectures have increasingly…
Traditionally, data warehousing workloads have been processed using CPU-focused clusters, such as those that make up the bulk of available machines in Amazon's EC2, and the focus on improving analytics performance has been to utilize a homogeneous, multi-threaded CPU environment with optimized algorithms for this infrastructure. The increasing…