We measure the large-scale real-space power spectrum P(k) using a sample of 205,443 galaxies from the Sloan Digital Sky Survey, covering 2417 effective square degrees with mean redshift z ≈ 0.1. We employ a matrix-based method using pseudo-Karhunen-Loève eigenmodes, producing uncorrelated minimum-variance measurements in 22 k-bands of both the clustering …
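As a rough illustration of band-power measurement (not the paper's pseudo-Karhunen-Loève matrix method, which handles survey geometry and noise exactly), here is a minimal sketch of binning an FFT power spectrum into k-bands from a periodic overdensity grid; `delta`, `box_size`, and the logarithmic binning are all illustrative assumptions:

```python
import numpy as np

def binned_power_spectrum(delta, box_size, n_bins=22):
    """Band-power estimate of P(k) from a periodic overdensity grid.

    A simplified FFT estimator on a cubic grid, NOT the paper's
    pseudo-Karhunen-Loeve matrix method; assumes `delta` is the
    gridded galaxy overdensity and `box_size` its side length.
    """
    n = delta.shape[0]
    # Continuum-normalized Fourier modes; raw power P(k) = |delta_k|^2 / V.
    delta_k = np.fft.rfftn(delta) * (box_size / n) ** 3
    power = (np.abs(delta_k) ** 2 / box_size ** 3).ravel()

    # |k| for every retained mode of the real FFT.
    k1d = 2 * np.pi * np.fft.fftfreq(n, d=box_size / n)
    kz1d = 2 * np.pi * np.fft.rfftfreq(n, d=box_size / n)
    kx, ky, kz = np.meshgrid(k1d, k1d, kz1d, indexing="ij")
    k_mag = np.sqrt(kx ** 2 + ky ** 2 + kz ** 2).ravel()

    # Average the power in logarithmic k-bands (plain bins here, not
    # the decorrelated minimum-variance windows of the paper).
    edges = np.logspace(np.log10(2 * np.pi / box_size),
                        np.log10(k_mag.max()), n_bins + 1)
    band = np.digitize(k_mag, edges)
    pk = np.array([power[band == i].mean() if np.any(band == i) else np.nan
                   for i in range(1, n_bins + 1)])
    return np.sqrt(edges[:-1] * edges[1:]), pk
```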
Remarkable observational advances have established a compelling cross-validated model of the Universe. Yet, two key pillars of this model -- dark matter and dark energy -- remain mysterious. Next-generation sky surveys will map billions of galaxies to explore the physics of the 'Dark Universe'. Science requirements for these surveys demand simulations at …
  • Max Tegmark, Michael A Strauss, Michael R Blanton, Kevork Abazajian, Scott Dodelson, Havard Sandvik +56 others
  • 2003
We measure cosmological parameters using the three-dimensional power spectrum P(k) from over 200,000 galaxies in the Sloan Digital Sky Survey (SDSS) in combination with WMAP and other data. Our results are consistent with a "vanilla" flat adiabatic ΛCDM model without tilt (n_s = 1), running tilt, tensor modes or massive neutrinos. Adding SDSS information …
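Parameter fits like this compare measured band powers to a model prediction. Below is a toy sketch of a Gaussian chi-square fit; the `power_law_model` stand-in and the starting point are hypothetical, whereas the real analysis uses full CDM transfer functions and many more parameters:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for a real CDM transfer-function
# prediction (e.g. one produced with CAMB); purely illustrative.
def power_law_model(k, amplitude, n_s):
    return amplitude * k ** n_s

def chi2(params, k, pk_obs, pk_err, model=power_law_model):
    """Gaussian chi-square over uncorrelated band powers.

    The paper's k-bands are decorrelated by construction, so a
    diagonal covariance is a reasonable toy assumption here.
    """
    return np.sum(((pk_obs - model(k, *params)) / pk_err) ** 2)

# Example: fit amplitude and spectral tilt to measured band powers.
# result = minimize(chi2, x0=[1e4, 1.0], args=(k, pk_obs, pk_err))
```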
Supercomputing is evolving towards hybrid and accelerator-based architectures with millions of cores. The HACC (Hardware/Hybrid Accelerated Cosmology Code) framework exploits this diverse landscape at the largest scales of problem size, obtaining high scalability and sustained performance. Developed to satisfy the science requirements of cosmological …
Datasets with tens of millions of galaxies present new challenges for the analysis of spatial clustering. We have built a framework that integrates a database of object catalogs, tools for creating masks of bad regions, and a fast O(N log N) correlation code. This system has enabled unprecedented efficiency in carrying out the analysis of galaxy clustering in …
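The core of any O(N log N) correlation code is tree-based pair counting. A minimal sketch using SciPy's k-d tree (the counting kernel only, not the database-integrated framework the abstract describes):

```python
import numpy as np
from scipy.spatial import cKDTree

def pair_counts(positions, r_edges):
    """Histogram of pair separations via a k-d tree.

    count_neighbors returns cumulative counts of (ordered) pairs
    within each radius, so differencing gives per-bin totals;
    keeping r_edges[0] > 0 excludes zero-separation self-pairs.
    Tree build and dual-tree queries avoid the naive O(N^2) cost.
    """
    tree = cKDTree(positions)
    cumulative = tree.count_neighbors(tree, r_edges)
    return np.diff(cumulative)
```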
Mesh tessellations are indispensable tools for analyzing point data because they transform sparse discrete samples into dense continuous functions. Meshing the output of petascale simulations, however, can be as data-intensive as the simulations themselves and often must be executed in parallel on the same supercomputers in order to fit in memory. To date, …
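To make the "sparse samples to continuous functions" step concrete, here is a serial sketch of a Delaunay Tessellation Field Estimator (DTFE) style density with SciPy; the paper's contribution is performing such meshing in parallel at petascale, which this toy does not attempt:

```python
import math
import numpy as np
from scipy.spatial import Delaunay

def dtfe_density(points):
    """DTFE-style density: rho_i proportional to (dim + 1) divided by
    the total volume of the simplices sharing vertex i.
    Hull points with incomplete stars are left approximate.
    """
    tri = Delaunay(points)
    dim = points.shape[1]
    star_volume = np.zeros(len(points))
    for simplex in tri.simplices:
        verts = points[simplex]
        # Simplex volume = |det(edge vectors)| / dim!
        vol = abs(np.linalg.det(verts[1:] - verts[0])) / math.factorial(dim)
        star_volume[simplex] += vol
    return (dim + 1) / np.where(star_volume > 0, star_volume, np.nan)
```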
Large-scale simulations can produce hundreds of terabytes to petabytes of data, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of …
  • Daniel J Eisenstein, David W Hogg, Roman Scoccimarro, Michael R Blanton, Robert C Nichol, Ryan Scranton +33 others
  • 2005
We present the large-scale correlation function measured from a spectroscopic sample of 46,748 luminous red galaxies from the Sloan Digital Sky Survey. The survey region covers 0.72 h⁻³ Gpc³ over 3816 deg² and 0.16 < z < 0.47, making it the best sample yet for the study of large-scale structure. We find a well-detected peak in the correlation function at …
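Correlation functions for samples like this are commonly computed with the Landy-Szalay estimator, which compares pair counts in the data against an unclustered random catalog tracing the same survey geometry. A self-contained sketch under that assumption (toy normalization; real analyses also apply weights and mask corrections):

```python
import numpy as np
from scipy.spatial import cKDTree

def landy_szalay(data, randoms, r_edges):
    """Landy-Szalay estimator: xi = (DD - 2*DR + RR) / RR.

    `randoms` is an unclustered catalog tracing the survey geometry;
    keeping r_edges[0] > 0 excludes zero-separation self-pairs.
    """
    def counts(a, b):
        # Ordered pair counts within each radius, differenced into bins.
        cum = cKDTree(a).count_neighbors(cKDTree(b), r_edges)
        return np.diff(cum).astype(float)

    nd, nr = len(data), len(randoms)
    dd = counts(data, data) / (nd * (nd - 1))    # normalized ordered pairs
    dr = counts(data, randoms) / (nd * nr)
    rr = counts(randoms, randoms) / (nr * (nr - 1))
    return (dd - 2 * dr + rr) / rr
```

A peak in the resulting xi(r) near 100 h⁻¹ Mpc is the baryon acoustic feature the abstract refers to.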