In this paper we introduce the Farpoint simulation, the latest member of the Hardware/Hybrid Accelerated Cosmology Code (HACC) gravity-only simulation family. The domain covers a volume of (1000$h^{-1}$Mpc)$^3$ and evolves close to two trillion particles, corresponding to a mass resolution of $m_p\sim 4.6\cdot 10^7\, h^{-1}$M$_\odot$. These specifications enable comprehensive investigations of the galaxy-halo connection, capturing halos down to small masses. Further, the large volume resolves scales typical of modern surveys with good statistical coverage of high mass halos. The simulation was carried out on the GPU-accelerated system Summit, one of the fastest supercomputers currently available. We provide specifics about the Farpoint run and present an initial set of results. The high mass resolution facilitates precise measurements of important global statistics, such as the halo concentration-mass relation and the correlation function down to small scales. Selected subsets of the simulation data products are publicly available via the HACC Simulation Data Portal.
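As a consistency check, the quoted particle mass follows directly from the box volume, the particle count, and the mean matter density. The sketch below assumes a particle count of 12288^3 and a Planck-like Omega_m of about 0.31; neither value is stated in the abstract, so they are illustrative assumptions only:

```python
# Sketch: recover a gravity-only simulation's particle mass from its box
# size and particle count. N = 12288**3 and OMEGA_M = 0.31 are assumed
# values (not quoted in the abstract), chosen to match a Planck-like setup.

RHO_CRIT = 2.775e11   # critical density today, in h^2 M_sun / Mpc^3
OMEGA_M = 0.31        # assumed matter density parameter

def particle_mass(box_hmpc, n_particles, omega_m=OMEGA_M):
    """Particle mass in h^-1 M_sun for a periodic box of side box_hmpc (h^-1 Mpc)."""
    volume = box_hmpc**3                      # box volume in (h^-1 Mpc)^3
    return omega_m * RHO_CRIT * volume / n_particles

m_p = particle_mass(1000.0, 12288**3)
print(f"m_p ~ {m_p:.2e} h^-1 M_sun")   # ~4.6e7, consistent with the quoted resolution
```

Under these assumptions the result reproduces the quoted $m_p\sim 4.6\cdot 10^7\, h^{-1}$M$_\odot$ to within a few percent.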
The Last Journey is a large-volume, gravity-only, cosmological N-body simulation evolving more than 1.24 trillion particles in a periodic box with a side length of 5.025 Gpc. It was implemented using the HACC simulation and analysis framework on the BG/Q system, Mira. The cosmological parameters are chosen to be consistent with the results from the Planck satellite. A range of analysis tools have been run in situ to enable a diverse set of science projects, and at the same time, to keep the resulting data amount manageable. Analysis outputs have been generated starting at redshift z~10 to allow for construction of synthetic galaxy catalogs using a semi-analytic modeling approach in post-processing. As part of our in situ analysis pipeline we employ a new method for tracking halo sub-structures, introducing the concept of subhalo cores. The production of multi-wavelength synthetic sky maps is facilitated by generating particle lightcones in situ, also beginning at z~10. We provide an overview of the simulation set-up and the generated data products; a first set of analysis results is presented. A subset of the data is publicly available.
We describe the first major public data release from cosmological simulations carried out with Argonne's HACC code. This initial release covers a range of datasets from large gravity-only simulations. The data products include halo information for multiple redshifts, down-sampled particles, and lightcone outputs. We provide data from two very large LCDM simulations as well as beyond-LCDM simulations spanning eleven w0-wa cosmologies. Our release platform uses Petrel, a research data service located at the Argonne Leadership Computing Facility. Petrel offers fast data transfer mechanisms and authentication via Globus, enabling simple and efficient access to stored datasets. Easy browsing of the available data products is provided via a web portal that allows the user to navigate simulation products efficiently. The data hub will be extended by adding more types of data products and by enabling computational capabilities to allow direct interactions with simulation results.
Cosmological cluster-scale strong gravitational lensing probes the mass distribution of the dense cores of massive dark matter halos and the structures along the line of sight from background sources to the observer. It is frequently assumed that the primary lens mass dominates the lensing, with the contribution of secondary masses along the line of sight being neglected. Secondary mass structures may, however, affect both the detectability of strong lensing in a given survey and modify the properties of the lensing that is detected. In this paper, we utilize a large cosmological N-body simulation and a multiple lens plane (and many source planes) ray-tracing technique to quantify the influence of line of sight halos on the detectability of cluster-scale strong lensing in a cluster sample with a mass limit that encompasses current cluster catalogs from the South Pole Telescope. We extract both primary and secondary halos from the Outer Rim simulation and consider two strong lensing realizations: one with only the primary halos included, and the other containing all secondary halos down to a mass limit. In both cases, we use the same source information extracted from the Hubble Ultra Deep Field, and create realistic lensed images consistent with moderately deep ground-based imaging. The results demonstrate that, down to the mass limit considered, the total number of lenses is boosted by about 13-21% when considering the complete multi-halo lightcone. The increment in strong lens counts peaks at lens redshifts of approximately 0.6, with no significant effect at z<0.3. The strongest trends are observed relative to the primary halo mass, with no significant impact in the most massive quintile of the halo sample, but increasingly boosting the observed lens counts toward small primary halo masses, with an enhancement greater than 50% in the least massive quintile of the halo masses considered.
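The multiple lens plane formalism referenced here can be summarized by the standard recursion (textbook notation, not taken from this paper). A light ray observed at angular position $\boldsymbol{\theta}^{(1)}$ intersects lens plane $j$ at

```latex
\boldsymbol{\theta}^{(j)} \;=\; \boldsymbol{\theta}^{(1)}
  \;-\; \sum_{i=1}^{j-1} \frac{D_{ij}}{D_j}\,
  \hat{\boldsymbol{\alpha}}^{(i)}\!\bigl(\boldsymbol{\theta}^{(i)}\bigr),
```

where $D_j$ is the angular diameter distance to plane $j$, $D_{ij}$ the distance between planes $i$ and $j$, and $\hat{\boldsymbol{\alpha}}^{(i)}$ the deflection angle generated by the projected mass on plane $i$. Retaining only the primary lens plane in the sum recovers the single-halo approximation against which the full multi-halo lightcone is compared.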
The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and survey strategy optimization. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
A strong instrumentation and detector R&D program has enabled the current generation of cosmic frontier surveys. A small investment in R&D will continue to pay dividends and enable new probes to investigate the accelerated expansion of the universe. Instrumentation and detector R&D provide critical training opportunities for future generations of experimentalists, skills that are important across the entire Department of Energy High Energy Physics program.
Cosmic surveys provide crucial information about high energy physics including strong evidence for dark energy, dark matter, and inflation. Ongoing and upcoming surveys will start to identify the underlying physics of these new phenomena, including tight constraints on the equation of state of dark energy, the viability of modified gravity, the existence of extra light species, the masses of the neutrinos, and the potential of the field that drove inflation. Even after the Stage IV experiments, DESI and LSST, complete their surveys, there will still be much information left in the sky. This additional information will enable us to understand the physics underlying the dark universe at an even deeper level and, in case Stage IV surveys find hints for physics beyond the current Standard Model of Cosmology, to revolutionize our current view of the universe. There are many ideas for how best to supplement and aid DESI and LSST in order to access some of this remaining information and how surveys beyond Stage IV can fully exploit this regime. These ideas flow to potential projects that could start construction in the 2020s.
Precision measurements of the large scale structure of the Universe require large numbers of high fidelity mock catalogs to accurately assess, and account for, the presence of systematic effects. We introduce and test a scheme for generating mock catalogs rapidly using suitably derated N-body simulations. Our aim is to reproduce the large scale structure and the gross properties of dark matter halos with high accuracy, while sacrificing the details of the internal structure of the halos. By adjusting global and local time-steps in an N-body code, we demonstrate that we recover halo masses to better than 0.5% and the power spectrum to better than 1% in both real and redshift space out to k = 1 h/Mpc, while requiring a factor of 4 less CPU time. We also calibrate the redshift spacing of outputs required to generate simulated light cones. We find that outputs separated by Δz = 0.05 allow us to interpolate particle positions and velocities to reproduce the real and redshift space power spectra to better than 1% (out to k = 1 h/Mpc). We apply these ideas to generate a suite of simulations spanning a range of cosmologies, motivated by the Baryon Oscillation Spectroscopic Survey (BOSS) but broadly applicable to future large scale structure surveys including eBOSS and DESI. As an initial demonstration of the utility of such simulations, we calibrate the shift in the baryonic acoustic oscillation peak position as a function of galaxy bias with higher precision than has been possible so far. This paper also serves to document the simulations, which we make publicly available.
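The interpolation between outputs separated by Δz = 0.05 can be sketched as a simple blend of the particle state recorded at two adjacent snapshots. This is a minimal linear scheme for illustration only; the production code's interpolation may differ in detail:

```python
# Sketch (illustrative, not the paper's exact scheme): place a particle on a
# lightcone at scale factor a, lying between two stored snapshots at scale
# factors a0 and a1, by linearly blending its recorded state.

def interpolate_state(a, a0, x0, v0, a1, x1, v1):
    """Linearly interpolate position and velocity to scale factor a in [a0, a1]."""
    w = (a - a0) / (a1 - a0)         # interpolation weight, 0 at a0 and 1 at a1
    x = (1.0 - w) * x0 + w * x1      # blended comoving position
    v = (1.0 - w) * v0 + w * v1      # blended peculiar velocity
    return x, v

# Usage: a particle at x=0 moving to x=2 between a0=0.4 and a1=0.6
x, v = interpolate_state(0.5, 0.4, 0.0, 10.0, 0.6, 2.0, 20.0)
print(x, v)   # midpoint: 1.0 15.0
```

The paper's result that power spectra are recovered to better than 1% out to k = 1 h/Mpc indicates that, at this output spacing, such interpolation errors stay below the statistical requirements of the target surveys.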
Modeling large-scale sky survey observations is a key driver for the continuing development of high resolution, large-volume, cosmological simulations. We report the first results from the Q Continuum cosmological N-body simulation run carried out on the GPU-accelerated supercomputer Titan. The simulation encompasses a volume of (1300 Mpc)^3 and evolves more than half a trillion particles, leading to a particle mass resolution of ~1.5 × 10^8 M_sun. At this mass resolution, the Q Continuum run is currently the largest cosmology simulation available. It enables the construction of detailed synthetic sky catalogs, encompassing different modeling methodologies, including semi-analytic modeling and sub-halo abundance matching in a large, cosmological volume. Here we describe the simulation and outputs in detail and present first results for a range of cosmological statistics, such as mass power spectra, halo mass functions, and halo mass-concentration relations for different epochs. We also provide details on challenges connected to running a simulation on almost 90% of Titan, one of the fastest supercomputers in the world, including our usage of Titan's GPU accelerators.
Dark matter-dominated cluster-scale halos act as an important cosmological probe and provide a key testing ground for structure formation theory. Focusing on their mass profiles, we have carried out (gravity-only) simulations of the concordance LCDM cosmology, covering a mass range of 2×10^{12}-2×10^{15} solar mass/h and a redshift range of z=0-2, while satisfying the associated requirements of resolution and statistical control. When fitting to the Navarro-Frenk-White profile, our concentration-mass (c-M) relation differs in normalization and shape in comparison to previous studies that have limited statistics in the upper end of the mass range. We show that the flattening of the c-M relation with redshift is naturally expressed if c is viewed as a function of the peak height parameter, ν. Unlike the c-M relation, the slope of the c-ν relation is effectively constant over the redshift range z=0-2, while the amplitude varies by ~30% for massive clusters. This relation is, however, not universal: Using a simulation suite covering the allowed wCDM parameter space, we show that the c-ν relation varies by about +/- 20% as cosmological parameters are varied. At fixed mass, the c(M) distribution is well-fit by a Gaussian with sigma_c/c = 0.33, independent of the radius at which the concentration is defined, the halo dynamical state, and the underlying cosmology. We compare the LCDM predictions with observations of halo concentrations from strong lensing, weak lensing, galaxy kinematics, and X-ray data, finding good agreement for massive clusters (M > 4×10^{14} solar mass/h), but with some disagreements at lower masses. Because of uncertainty in observational systematics and modeling of baryonic physics, the significance of these discrepancies remains unclear.
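For reference, the quantities above carry their standard definitions (not restated in the abstract). The Navarro-Frenk-White profile, the concentration, and the peak height parameter are

```latex
\rho(r) \;=\; \frac{\rho_s}{(r/r_s)\,(1 + r/r_s)^2},
\qquad
c_\Delta \;=\; \frac{R_\Delta}{r_s},
\qquad
\nu \;=\; \frac{\delta_c}{\sigma(M, z)},
```

where $r_s$ is the scale radius, $R_\Delta$ the halo radius at the chosen overdensity $\Delta$, $\delta_c \approx 1.686$ the linear collapse threshold, and $\sigma(M,z)$ the rms linear density fluctuation smoothed on mass scale $M$. Casting concentration as $c(\nu)$ absorbs much of the redshift dependence because $\nu$ already encodes the growth of structure.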