Remarkable observational advances have established a compelling cross-validated model of the Universe. Yet, two key pillars of this model -- dark matter and dark energy -- remain mysterious. Next-generation sky surveys will map billions of galaxies to explore the physics of the 'Dark Universe'. Science requirements for these surveys demand simulations at …
Supercomputing is evolving towards hybrid and accelerator-based architectures with millions of cores. The HACC (Hardware/Hybrid Accelerated Cosmology Code) framework exploits this diverse landscape at the largest scales of problem size, obtaining high scalability and sustained performance. Developed to satisfy the science requirements of cosmological …
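The abstract stops short of saying how one framework targets such varied hardware. A common pattern, sketched below as a hedged illustration and not HACC's actual code, is to dispatch the same compute kernel to whichever array backend is available; the optional cupy import, the fallback logic, and the toy kernel are all assumptions.

    # Hypothetical sketch of backend dispatch for hybrid CPU/accelerator codes.
    # Not HACC's implementation; cupy is assumed optional, numpy is the fallback.
    try:
        import cupy as xp          # accelerator backend, if a GPU is present
        BACKEND = "gpu"
    except ImportError:
        import numpy as xp         # CPU fallback
        BACKEND = "cpu"

    def pairwise_potential(pos, eps=1e-3):
        """Toy O(N^2) softened-gravity kernel; the array module `xp`
        decides whether it runs on the CPU or the accelerator."""
        n = pos.shape[0]
        d = pos[:, None, :] - pos[None, :, :]            # pairwise separations
        r = xp.sqrt((d * d).sum(axis=-1) + eps * eps)    # softened distances
        r = xp.where(xp.eye(n, dtype=bool), xp.inf, r)   # drop self-interaction
        return -(1.0 / r).sum(axis=1)                    # potential per particle

    pos = xp.asarray([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    print(BACKEND, pairwise_potential(pos))

The same source runs unchanged on either backend, which is one way a single code base can follow hardware as it diversifies.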
Mesh tessellations are indispensable tools for analyzing point data because they transform sparse discrete samples into dense continuous functions. Meshing the output of petascale simulations, however, can be as data-intensive as the simulations themselves and often must be executed in parallel on the same supercomputers in order to fit in memory. To date, …
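As a concrete illustration of turning sparse samples into a continuous function, the hedged sketch below builds a Delaunay tessellation over random points and linearly interpolates a field known only at the samples. scipy is a small serial stand-in chosen for this example; the abstract concerns parallel meshing at petascale.

    # Minimal sketch: a Delaunay mesh turns scattered point samples into a
    # piecewise-linear continuous function (serial, illustrative only).
    import numpy as np
    from scipy.spatial import Delaunay
    from scipy.interpolate import LinearNDInterpolator

    rng = np.random.default_rng(0)
    points = rng.random((500, 2))                      # sparse discrete samples
    values = np.sin(4 * points[:, 0]) * points[:, 1]   # field known only at samples

    mesh = Delaunay(points)                      # tessellate the point set
    interp = LinearNDInterpolator(mesh, values)  # continuous over the convex hull

    print(mesh.simplices.shape)                  # (n_triangles, 3)
    print(interp(0.5, 0.5))                      # evaluate anywhere inside the hull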
Datasets with tens of millions of galaxies present new challenges for the analysis of spatial clustering. We have built a framework that integrates a database of object catalogs, tools for creating masks of bad regions, and a fast O(N log N) correlation code. This system has enabled unprecedented efficiency in carrying out the analysis of galaxy clustering in …
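To make the O(N log N) correlation step concrete, here is a hedged sketch of tree-based pair counting for the two-point correlation function, using scipy's cKDTree rather than the authors' own code; the simple DD/RR - 1 estimator and the toy catalogs are illustrative choices.

    # Sketch of tree-based pair counting for the two-point correlation
    # function xi(r); scipy's cKDTree stands in for the fast code above.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(1)
    data = rng.random((20000, 3))      # galaxy positions (toy catalog)
    rand = rng.random((20000, 3))      # random catalog over the same volume

    bins = np.linspace(0.01, 0.1, 10)  # separation bin edges
    dtree, rtree = cKDTree(data), cKDTree(rand)

    # Cumulative pair counts within each radius, differenced into bins;
    # self-pairs at zero separation cancel in the difference.
    dd = np.diff(dtree.count_neighbors(dtree, bins))
    rr = np.diff(rtree.count_neighbors(rtree, bins))

    xi = dd / rr - 1.0                 # DD/RR - 1 estimator per bin
    print(np.round(xi, 3))             # ~0 for these uniform toy catalogs

Each query against the tree costs O(log N), so counting pairs for the whole catalog scales as O(N log N) instead of the O(N^2) of brute-force pair enumeration.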
Large-scale simulations can produce hundreds of terabytes to petabytes of data, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of …
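The truncated sentence points toward in situ analysis, where data are reduced while the simulation is still running rather than after a full dump. A minimal hedged sketch of that pattern follows; the timestep loop, the summary statistics, and every name in it are invented for illustration.

    # Hedged sketch of in situ analysis: reduce each timestep to a small
    # summary while the simulation runs, instead of writing full snapshots
    # to the file system for post-processing. All names here are invented.
    import numpy as np

    def advance(state, rng):
        """Stand-in for one simulation timestep (random drift)."""
        return state + 0.01 * rng.standard_normal(state.shape)

    def in_situ_summary(state):
        """Reduce a full snapshot to a few numbers on the fly."""
        return {"mean": float(state.mean()), "rms": float(state.std())}

    rng = np.random.default_rng(2)
    state = rng.random((100000, 3))    # full data: too large to keep every step

    summaries = []                     # tiny: a few numbers per timestep
    for step in range(10):
        state = advance(state, rng)
        summaries.append(in_situ_summary(state))  # analyze in place

    print(summaries[-1])               # only the reduced products persist

The trade-off is that the analysis must be chosen before the run, since the full snapshots are never written out.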