
The MultiDark Database: Release of the Bolshoi and MultiDark Cosmological Simulations

Posted by Harry Enke
Publication date: 2011
Research field: Physics
Paper language: English





We present the online MultiDark Database -- a Virtual Observatory-oriented, relational database for hosting various cosmological simulations. The data is accessible via an SQL (Structured Query Language) query interface, which also allows users to directly pose scientific questions, as shown in a number of examples in this paper. Further examples of the usage of the database are given in its extensive online documentation (www.multidark.org). The database is based on the same technology as the Millennium Database, a fact that will greatly facilitate the use of both suites of cosmological simulations. The first release of the MultiDark Database hosts two 8.6 billion particle cosmological N-body simulations: the Bolshoi (250/h Mpc simulation box, 1/h kpc resolution) and the MultiDark Run1 simulation (MDR1, or BigBolshoi; 1000/h Mpc simulation box, 7/h kpc resolution). This paper explains the methods used to extract halos/subhalos from the raw simulation data and how these data are structured in the database. With the first data release, users get full access to halo/subhalo catalogs, various profiles of the halos at redshifts z=0-15, and raw dark matter data for one time-step of the Bolshoi and four time-steps of the MultiDark simulation. Later releases will also include galaxy mock catalogs and additional merger trees for both simulations, as well as new large-volume, high-resolution simulations. This project is further proof of the viability of storing and presenting complex data using relational database technology. We encourage other simulators to publish their results in a similar manner.
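To illustrate the kind of scientific question that can be posed directly through the SQL interface, here is a minimal Python sketch that counts MDR1 FOF halos above a mass threshold in each stored snapshot. The table and column names (MDR1..FOF, mass, snapnum) follow the naming scheme described in the paper, but should be verified against the online documentation; the query URL and submission parameters below are hypothetical placeholders for the actual web query form at www.multidark.org.

    # Minimal sketch of querying the MultiDark Database from Python.
    # Assumptions (check against www.multidark.org before use):
    #   - the table MDR1..FOF exists with columns `mass` (Msun/h) and `snapnum`;
    #   - QUERY_URL and its parameters are HYPOTHETICAL stand-ins for the real
    #     web query endpoint, which requires a registered user name and password.
    import requests

    QUERY_URL = "https://www.multidark.org/MultiDark/MyDB"  # hypothetical endpoint

    # Count FOF halos above 10^12 Msun/h in each stored snapshot of MDR1.
    sql = """
    SELECT snapnum, COUNT(*) AS n_halos
    FROM MDR1..FOF
    WHERE mass > 1.0e12
    GROUP BY snapnum
    ORDER BY snapnum
    """

    response = requests.get(
        QUERY_URL,
        params={"action": "doQuery", "SQL": sql},  # Millennium-style parameters (assumed)
        auth=("username", "password"),             # placeholder credentials
    )
    response.raise_for_status()
    print(response.text)  # the server returns the result set as plain text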


See also

We present the public release of the MultiDark-Galaxies: three distinct galaxy catalogues derived from one of the Planck-cosmology MultiDark simulations (MDPL2, with a volume of $(1\,\mathrm{Gpc}/h)^{3}$ and a mass resolution of $1.5 \times 10^{9}\,M_{\odot}/h$) by applying to it the semi-analytic models GALACTICUS, SAG, and SAGE. We compare the three models and their conformity with observational data for a selection of fundamental galaxy properties, such as the stellar mass function, star formation rate, cold gas fraction, and metallicity, noting that they sometimes perform differently, reflecting their designs and calibrations. We have further selected galaxy subsamples of the catalogues by number density in stellar mass, cold gas mass, and star formation rate in order to study the clustering statistics of galaxies. We show that, despite the different treatment of orphan galaxies, i.e. galaxies that lost their dark-matter host halo due to the finite mass resolution of the N-body simulation or to tidal stripping, the clustering signal is comparable and reproduces the observations in all three models, in particular when selecting samples based upon stellar mass. Our catalogues provide a powerful tool to study galaxy formation within a volume comparable to those probed by ongoing and future photometric and redshift surveys. All model data, comprising a range of galaxy properties including broad-band SDSS magnitudes, are publicly available.
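The number-density selection used to build these subsamples can be made concrete: rank the galaxies by the chosen property and keep the top n*V objects, where n is the target number density and V the simulation volume. The function below is a hypothetical illustration of that step, not code from the released catalogues.

    import numpy as np

    def select_by_number_density(values, number_density, volume):
        """Return indices of the top-ranked galaxies at a target number density.

        values         : per-galaxy property to rank by (e.g. stellar mass, SFR)
        number_density : target density in objects per (Mpc/h)^3
        volume         : simulation volume in (Mpc/h)^3 (1000.0**3 for MDPL2)
        """
        n_keep = int(round(number_density * volume))
        order = np.argsort(values)[::-1]  # descending: most massive first
        return order[:n_keep]

    # Toy example: a density of 0.01 h^3/Mpc^3 in a (100 Mpc/h)^3 box keeps
    # the 10,000 toy galaxies with the largest stellar masses.
    rng = np.random.default_rng(0)
    mstar = rng.lognormal(mean=22.0, sigma=1.0, size=100_000)  # toy stellar masses
    idx = select_by_number_density(mstar, number_density=0.01, volume=100.0**3)
    print(len(idx), "galaxies selected")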
G. Favole 2019
We use three semi-analytic models (SAMs) of galaxy formation and evolution, run on the same $1\,h^{-1}$ Gpc MultiDark Planck2 cosmological simulation, to investigate the properties of [OII] emission line galaxies in the redshift range $0.6<z<1.2$. We compare model predictions with different observational data sets, including DEEP2-Firefly galaxies with absolute magnitudes. We estimate the [OII] luminosity, L[OII], using simple relations derived both from the models and from observations, and also using a public code. This code ideally takes instantaneous star formation rates (SFRs) as input, which are only provided by one of the SAMs under consideration. We use this SAM to study the feasibility of inferring L[OII] for models that only provide average SFRs. We find that the post-processing computation of L[OII] from average SFRs is accurate for model galaxies with dust-attenuated L[OII] $\lesssim 10^{42.2}$ erg s$^{-1}$ ($<5\%$ discrepancy). We also explore how to derive the [OII] luminosity from simple relations using the global properties usually output by SAMs. Besides the SFR, the model L[OII] is best correlated with the observed-frame $u$ and $g$ broad-band magnitudes. These correlations have coefficients (r-values) above 0.64 and a dispersion that varies with L[OII]. We use these correlations, and an observational one based on SFR and metallicity, to derive L[OII]. These relations result in [OII] luminosity functions and halo occupation distributions whose shapes vary depending on both the model and the method used. Nevertheless, for all the considered models, the amplitude of the clustering at scales above $1\,h^{-1}$ Mpc remains unchanged, independently of the method used to derive L[OII].
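The abstract does not reproduce the paper's own fitted relations. As a hedged illustration of the post-processing step, the sketch below converts an average SFR into L[OII] using the widely used Kennicutt (1998) calibration, SFR $\approx 1.4 \times 10^{-41}$ L[OII], which is not necessarily the relation adopted in the paper.

    # Hedged sketch: estimate L[OII] from an average SFR with the Kennicutt
    # (1998) calibration. The paper derives its own model- and observation-based
    # relations; this standard calibration merely illustrates the procedure.

    def loii_from_sfr(sfr_msun_per_yr: float) -> float:
        """L[OII] in erg/s from SFR in Msun/yr (Kennicutt 1998, no dust correction)."""
        return sfr_msun_per_yr / 1.4e-41

    if __name__ == "__main__":
        for sfr in (0.1, 1.0, 10.0):
            print(f"SFR = {sfr:5.1f} Msun/yr -> L[OII] ~ {loii_from_sfr(sfr):.2e} erg/s")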
We describe the first major public data release from cosmological simulations carried out with Argonne's HACC code. This initial release covers a range of datasets from large gravity-only simulations. The data products include halo information for multiple redshifts, down-sampled particles, and lightcone outputs. We provide data from two very large LCDM simulations as well as beyond-LCDM simulations spanning eleven $w_0$-$w_a$ cosmologies. Our release platform uses Petrel, a research data service located at the Argonne Leadership Computing Facility. Petrel offers fast data transfer mechanisms and authentication via Globus, enabling simple and efficient access to stored datasets. Easy browsing of the available data products is provided via a web portal that allows the user to navigate simulation products efficiently. The data hub will be extended by adding more types of data products and by enabling computational capabilities to allow direct interaction with simulation results.
Cosmological simulations are fundamental tools to study structure formation and the astrophysics of evolving structures, in particular clusters of galaxies. While hydrodynamical simulations cannot efficiently sample large volumes and explore different cosmologies at the same time, N-body simulations lack the baryonic physics that is crucial to determining the observed properties of clusters. One solution is to use (semi-)analytical models to implement the needed baryonic physics. In this way, we can generate the many mock universes that will be required to fully exploit future large sky surveys, such as the one from the upcoming eROSITA X-ray telescope. We developed a phenomenological model, based on observations of clusters, to assign gas density and temperature information to the dark-matter-only halos of the MultiDark simulations. We generate several full-sky mock light-cones of clusters for the WMAP and Planck cosmologies, adopting different parameters in our phenomenological model of the intra-cluster medium. For one of these simulations and models, we also generate 100 light-cones corresponding to 100 random observers and explore the variance among them in several quantities. In this first paper on MultiDark mock galaxy cluster light-cones, we focus on presenting our methodology and discuss predictions for eROSITA, in particular exploring the potential of angular power spectrum analyses of its detected (and undetected) cluster population to study X-ray scaling relations, the intra-cluster medium, and the composition of the cosmic X-ray background. We make publicly available online more than 400 GB of light-cones, which include the expected eROSITA count rates, on Skies & Universes (http://www.skiesanduniverses.org).
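As a schematic of what assigning gas properties to dark-matter-only halos involves, the sketch below gives each halo an intra-cluster-medium temperature from a self-similar mass-temperature scaling relation with lognormal scatter. The normalisation, slope, and scatter values here are illustrative placeholders, not the paper's calibrated phenomenological model.

    import numpy as np

    def icm_temperature_keV(m500_msun, z, rng=None,
                            t0_keV=5.0, m0_msun=4.0e14, slope=2.0 / 3.0,
                            scatter_dex=0.15, omega_m=0.31, omega_l=0.69):
        """Toy self-similar M-T relation with optional lognormal scatter.

        All parameter values are illustrative placeholders, NOT the calibrated
        phenomenological model of the paper.
        """
        e_z = np.sqrt(omega_m * (1.0 + z) ** 3 + omega_l)  # dimensionless Hubble rate
        t = t0_keV * (m500_msun / m0_msun) ** slope * e_z ** slope
        if rng is not None:  # scatter around the mean relation, in dex
            t *= 10.0 ** rng.normal(0.0, scatter_dex, size=np.shape(m500_msun))
        return t

    rng = np.random.default_rng(42)
    masses = np.array([1e14, 3e14, 1e15])               # halo masses in Msun
    print(icm_temperature_keV(masses, z=0.2, rng=rng))  # temperatures in keV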
The Dark Sky Simulations are an ongoing series of cosmological N-body simulations designed to provide a quantitative and accessible model of the evolution of the large-scale Universe. Such models are essential for many aspects of the study of dark matter and dark energy, since we lack a sufficiently accurate analytic model of non-linear gravitational clustering. In July 2014, we made our early data release available to the general community, consisting of over 55 Terabytes of simulation data products, including our largest simulation to date, which used $1.07 \times 10^{12}$ ($10240^3$) particles in a volume $8\,h^{-1}\,\mathrm{Gpc}$ across. Our simulations were performed with 2HOT, a purely tree-based adaptive N-body method, running on 200,000 processors of the Titan supercomputer, with data analysis enabled by yt. We provide an overview of the derived halo catalogs, mass function, power spectra, and light cone data. We show self-consistency in the mass function and mass power spectrum at the 1% level over a range of more than 1000 in particle mass. We also present a novel method to distribute and access very large datasets, based on an abstraction of the World Wide Web (WWW) as a file system, remote memory-mapped file access semantics, and a space-filling curve index. This method has been implemented for our data release, and provides a means not only to query stored results such as halo catalogs, but also to design and deploy new analysis techniques on large distributed datasets.
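The space-filling curve index mentioned above can be sketched as follows: quantise each particle coordinate onto a $2^b$ grid and interleave the bits into a single Morton (Z-order) key, so that points close in space tend to be close in key order, and hence close in file layout. This is a generic illustration of the technique, not the release's actual implementation.

    def morton_key(x, y, z, bits=21):
        """Interleave the bits of three `bits`-bit integers into one Z-order key."""
        key = 0
        for i in range(bits):
            key |= ((x >> i) & 1) << (3 * i)
            key |= ((y >> i) & 1) << (3 * i + 1)
            key |= ((z >> i) & 1) << (3 * i + 2)
        return key

    def particle_key(pos, box_size, bits=21):
        """Quantise a position (in box units) onto a 2**bits grid and encode it."""
        cells = [min(int(p / box_size * (1 << bits)), (1 << bits) - 1) for p in pos]
        return morton_key(*cells, bits=bits)

    # Sorting particles by key clusters nearby particles together on disk,
    # so a spatial query touches a small, contiguous range of the file.
    positions = [(1.0, 2.0, 3.0), (1.1, 2.1, 3.1), (7000.0, 100.0, 4000.0)]
    for key, pos in sorted((particle_key(p, box_size=8000.0), p) for p in positions):
        print(f"{key:020d}  {pos}")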