
Cosmological neutrino simulations at extreme scale

Posted by: J.D. Emberson
Publication date: 2016
Research field: Physics
Paper language: English





Constraining neutrino mass remains an elusive challenge in modern physics. Precision measurements are expected from several upcoming cosmological probes of large-scale structure. Achieving this goal relies on an equal level of precision from theoretical predictions of neutrino clustering. Numerical simulations of the non-linear evolution of cold dark matter and neutrinos play a pivotal role in this process. We incorporate neutrinos into the cosmological N-body code CUBEP3M and discuss the challenges associated with pushing to the extreme scales demanded by the neutrino problem. We highlight code optimizations made to exploit modern high performance computing architectures and present a novel method of data compression that reduces the phase-space particle footprint from 24 bytes in single precision to roughly 9 bytes. We scale the neutrino problem to the Tianhe-2 supercomputer and provide details of our production run, named TianNu, which uses 86% of the machine (13,824 compute nodes). With a total of 2.97 trillion particles, TianNu is currently the world's largest cosmological N-body simulation and improves upon previous neutrino simulations by two orders of magnitude in scale. We finish with a discussion of the unanticipated computational challenges that were encountered during the TianNu runtime.
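The roughly 9-byte footprint quoted above is consistent with storing each position component as a 1-byte fixed-point offset within a mesh cell and each velocity component as a 2-byte fixed-point value (3 + 6 = 9 bytes per particle). The sketch below is a minimal illustration of that kind of encoding under these assumptions; it is not the exact scheme used in CUBEP3M/TianNu, and the cell size and velocity clamp are placeholder parameters.

```python
import numpy as np

# Assumed parameters: mesh cell size `cell` and velocity clamp `vmax`,
# chosen so the fixed-point ranges cover the particle data.
cell = 1.0      # cell size in simulation length units (placeholder)
vmax = 1000.0   # maximum |velocity| in simulation units (placeholder)

def compress(pos, vel, cell=cell, vmax=vmax):
    """Encode positions as 1-byte in-cell offsets plus a cell index, and
    velocities as 2-byte fixed-point values.  In a mesh-ordered particle
    store the cell index can be implicit, so the per-particle cost is
    3 + 6 = 9 bytes (an assumption of this sketch)."""
    cell_idx = np.floor(pos / cell).astype(np.int64)       # containing cell
    frac = pos / cell - cell_idx                            # offset in [0, 1)
    pos8 = np.round(frac * 255).astype(np.uint8)            # 1 byte per axis
    v = np.clip(vel, -vmax, vmax)
    vel16 = np.round(v / vmax * 32767).astype(np.int16)     # 2 bytes per axis
    return cell_idx, pos8, vel16

def decompress(cell_idx, pos8, vel16, cell=cell, vmax=vmax):
    """Invert the fixed-point encoding back to floating point."""
    pos = (cell_idx + pos8.astype(np.float64) / 255) * cell
    vel = vel16.astype(np.float64) / 32767 * vmax
    return pos, vel

# Round-trip example on random phase-space data.
rng = np.random.default_rng(0)
pos = rng.uniform(0, 64, size=(4, 3))
vel = rng.normal(0, 200, size=(4, 3))
ci, p8, v16 = compress(pos, vel)
pos2, vel2 = decompress(ci, p8, v16)
print(np.max(np.abs(pos - pos2)), np.max(np.abs(vel - vel2)))
```

The trade-off is a bounded quantization error (of order the cell size divided by 256 in position), which is why the compression level has to be weighed against the accuracy demanded by the neutrino clustering signal.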


Read also

Jia Liu (2017)
The non-zero mass of neutrinos suppresses the growth of cosmic structure on small scales. Since the level of suppression depends on the sum of the masses of the three active neutrino species, the evolution of large-scale structure is a promising tool to constrain the total mass of neutrinos and possibly shed light on the mass hierarchy. In this work, we investigate these effects via a large suite of N-body simulations that include massive neutrinos using an analytic linear-response approximation: the Cosmological Massive Neutrino Simulations (MassiveNuS). The simulations include the effects of radiation on the background expansion, as well as the clustering of neutrinos in response to the nonlinear dark matter evolution. We allow three cosmological parameters to vary: the neutrino mass sum M_nu in the range of 0-0.6 eV, the total matter density Omega_m, and the primordial power spectrum amplitude A_s. The rms density fluctuation in spheres of 8 comoving Mpc/h (sigma_8) is a derived parameter as a result. Our data products include N-body snapshots, halo catalogues, merger trees, ray-traced galaxy lensing convergence maps for four source redshift planes between z_s=1-2.5, and ray-traced cosmic microwave background lensing convergence maps. We describe the simulation procedures and code validation in this paper. The data are publicly available at http://columbialensing.org.
We describe the first major public data release from cosmological simulations carried out with Argonne's HACC code. This initial release covers a range of datasets from large gravity-only simulations. The data products include halo information for multiple redshifts, down-sampled particles, and lightcone outputs. We provide data from two very large LCDM simulations as well as beyond-LCDM simulations spanning eleven w0-wa cosmologies. Our release platform uses Petrel, a research data service located at the Argonne Leadership Computing Facility. Petrel offers fast data transfer mechanisms and authentication via Globus, enabling simple and efficient access to stored datasets. Easy browsing of the available data products is provided via a web portal that allows the user to navigate simulation products efficiently. The data hub will be extended by adding more types of data products and by enabling computational capabilities to allow direct interactions with simulation results.
To exploit the power of next-generation large-scale structure surveys, ensembles of numerical simulations are necessary to give accurate theoretical predictions of the statistics of observables. High-fidelity simulations come at a towering computational cost. Therefore, approximate but fast simulations, surrogates, are widely used to gain speed at the price of introducing model error. We propose a general method that exploits the correlation between simulations and surrogates to compute fast, reduced-variance statistics of large-scale structure observables without model error at the cost of only a few simulations. We call this approach Convergence Acceleration by Regression and Pooling (CARPool). In numerical experiments with intentionally minimal tuning, we apply CARPool to a handful of GADGET-III $N$-body simulations paired with surrogates computed using COmoving Lagrangian Acceleration (COLA). We find $\sim 100$-fold variance reduction even in the non-linear regime, up to $k_\mathrm{max} \approx 1.2\, h\,\mathrm{Mpc}^{-1}$ for the matter power spectrum. CARPool realises similar improvements for the matter bispectrum. In the nearly linear regime CARPool attains far larger sample variance reductions. By comparing to the 15,000 simulations from the Quijote suite, we verify that the CARPool estimates are unbiased, as guaranteed by construction, even though the surrogate misses the simulation truth by up to $60\%$ at high $k$. Furthermore, even with a fully configuration-space statistic like the non-linear matter density probability density function, CARPool achieves unbiased variance reduction factors of up to $\sim 10$, without any further tuning. Conversely, CARPool can be used to remove model error from ensembles of fast surrogates by combining them with a few high-accuracy simulations.
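The correlation-exploiting step described above is, in essence, a control-variates estimator: a few expensive simulations are corrected by their paired surrogates, whose mean is known cheaply from a large surrogate ensemble. The sketch below shows a scalar version under assumed simplifications (the paper works with vectors of statistics and a regression matrix, and estimating the coefficient from the same few pairs, as done here for brevity, is a shortcut relative to the paper's construction); the function name `carpool_estimate` is illustrative, not from the CARPool code.

```python
import numpy as np

def carpool_estimate(y, x, mu_x):
    """Reduced-variance estimate of E[y] from a few expensive measurements
    `y`, their paired cheap surrogates `x`, and the surrogate mean `mu_x`
    obtained from a large, inexpensive surrogate ensemble."""
    y = np.asarray(y, float)
    x = np.asarray(x, float)
    # Regression coefficient beta = Cov(y, x) / Var(x), fit on the pairs.
    beta = np.cov(y, x)[0, 1] / np.var(x, ddof=1)
    # Correct the small-sample mean of y using the surrogate discrepancy.
    return y.mean() - beta * (x.mean() - mu_x)

# Toy usage: the surrogate is correlated with, but biased relative to, the truth.
rng = np.random.default_rng(1)
truth = rng.normal(1.0, 0.1, size=5000)                 # stand-in for simulations
surrogate = 0.8 * truth + rng.normal(0, 0.02, size=5000)
mu_x = surrogate.mean()                                 # cheap to obtain in practice
few = slice(0, 5)                                       # only a few "expensive" runs
print(carpool_estimate(truth[few], surrogate[few], mu_x))
```

The key point, as in the abstract, is that the surrogate's model error cancels because only its fluctuations about a well-measured surrogate mean enter the correction term.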
We present the online MultiDark Database -- a Virtual Observatory-oriented, relational database for hosting various cosmological simulations. The data is accessible via an SQL (Structured Query Language) query interface, which also allows users to directly pose scientific questions, as shown in a number of examples in this paper. Further examples of the usage of the database are given in its extensive online documentation (www.multidark.org). The database is based on the same technology as the Millennium Database, a fact that will greatly facilitate the usage of both suites of cosmological simulations. The first release of the MultiDark Database hosts two 8.6 billion particle cosmological N-body simulations: the Bolshoi (250/h Mpc simulation box, 1/h kpc resolution) and the MultiDark Run1 simulation (MDR1, or BigBolshoi, 1000/h Mpc simulation box, 7/h kpc resolution). The extraction methods for halos/subhalos from the raw simulation data, and how these data are structured in the database, are explained in this paper. With the first data release, users get full access to halo/subhalo catalogs, various profiles of the halos at redshifts z=0-15, and raw dark matter data for one time-step of the Bolshoi and four time-steps of the MultiDark simulation. Later releases will also include galaxy mock catalogs and additional merging trees for both simulations, as well as new large-volume simulations with high resolution. This project is further proof of the viability of storing and presenting complex data using relational database technology. We encourage other simulators to publish their results in a similar manner.
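To illustrate the kind of programmatic access an SQL query interface enables, the snippet below posts a query over HTTP and reads a CSV response into a DataFrame. The endpoint URL, table name, and column names are placeholders invented for this sketch, not the MultiDark schema; the real interface and table layout are documented at www.multidark.org.

```python
import io
import urllib.parse
import urllib.request

import pandas as pd

# Placeholder endpoint and schema for illustration only.
ENDPOINT = "https://example.org/multidark/query"   # hypothetical URL
SQL = """
SELECT TOP 10 haloId, mass, x, y, z
FROM MDR1_halos                -- hypothetical table and column names
WHERE snapnum = 85 AND mass > 1e14
ORDER BY mass DESC
"""

def run_query(sql, endpoint=ENDPOINT):
    """POST an SQL string and parse the CSV response into a DataFrame."""
    data = urllib.parse.urlencode({"query": sql, "format": "csv"}).encode()
    with urllib.request.urlopen(endpoint, data=data) as resp:
        return pd.read_csv(io.BytesIO(resp.read()))

# df = run_query(SQL)   # uncomment once pointed at the real service
# print(df.head())
```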
We perform a suite of multimass cosmological zoom simulations of individual dark matter halos and explore how to best select Lagrangian regions for resimulation without contaminating the halo of interest with low-resolution particles. Such contamination can lead to significant errors in the gas distribution of hydrodynamical simulations, as we show. For a fixed Lagrange volume, we find that the chance of contamination increases systematically with the level of zoom. In order to avoid contamination, the Lagrangian volume selected for resimulation must increase monotonically with the resolution difference between the parent box and the zoom region. We provide a simple formula for selecting Lagrangian regions (in units of the halo virial volume) as a function of the level of zoom required. We also explore the degree to which a halo's Lagrangian volume correlates with other halo properties (concentration, spin, formation time, shape, etc.) and find no significant correlation. There is a mild correlation between Lagrange volume and environment, such that halos living in the most clustered regions have larger Lagrangian volumes. Nevertheless, selecting halos to be isolated is not the best way to ensure inexpensive zoom simulations. We explain how one can safely choose halos with the smallest Lagrangian volumes, which are the least expensive to resimulate, without biasing one's sample.