
Simulating cosmic reionization: How large a volume is large enough?

Added by Ilian Iliev
Publication date: 2013
Fields: Physics
Language: English





We present the largest-volume (425 Mpc/h = 607 Mpc on a side) full radiative transfer simulation of cosmic reionization to date. We show that there is significant additional power in density fluctuations at very large scales. We systematically investigate the effects this additional power has on the progress, duration and features of reionization, as well as on selected reionization observables. We find that a comoving simulation volume of ~100 Mpc/h per side is sufficient for deriving a convergent mean reionization history, but that the reionization patchiness is significantly underestimated. We use jackknife splitting to quantify the convergence of reionization properties with simulation volume for both mean-density and variable-density sub-regions. We find that sub-volumes of ~100 Mpc/h per side or larger yield convergent reionization histories, except at the earliest times, but smaller volumes of ~50 Mpc/h or less are not well converged at any redshift. Reionization history milestones show significant scatter between the sub-volumes, of Delta z = 0.6-1 for ~50 Mpc/h volumes, decreasing to Delta z = 0.3-0.5 for ~100 Mpc/h volumes, and Delta z ~ 0.1 for ~200 Mpc/h volumes. If we only consider mean-density sub-regions the scatter decreases, but remains at Delta z ~ 0.1-0.2 for the different-size sub-volumes. Consequently, many potential reionization observables, such as the 21-cm rms and the 21-cm PDF skewness and kurtosis, show good convergence for volumes of ~200 Mpc/h, but retain considerable scatter for smaller volumes. In contrast, the three-dimensional 21-cm power spectra at large scales (k < 0.25 h/Mpc) do not fully converge for any sub-volume size. These additional large-scale fluctuations significantly enhance the 21-cm fluctuations, which should improve the prospects of detection considerably, given the lower foregrounds and greater interferometer sensitivity at higher frequencies. (abridged)
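The sub-volume splitting behind a jackknife analysis of this kind can be sketched in a few lines; this is a minimal illustration assuming NumPy and a random toy cube standing in for an ionized-fraction field (the function name and grid size are illustrative, not the simulation pipeline):

```python
import numpy as np

def subvolume_means(field, n_split):
    """Split a cubic 3D field into n_split^3 equal sub-volumes and
    return the mean of each (e.g. the mean ionized fraction)."""
    n = field.shape[0]
    assert field.shape == (n, n, n) and n % n_split == 0
    s = n // n_split
    means = [field[i*s:(i+1)*s, j*s:(j+1)*s, k*s:(k+1)*s].mean()
             for i in range(n_split)
             for j in range(n_split)
             for k in range(n_split)]
    return np.array(means)

# Toy example: a random stand-in for an ionized-fraction cube.
rng = np.random.default_rng(0)
x_ion = rng.random((64, 64, 64))
means = subvolume_means(x_ion, 4)   # 4^3 = 64 sub-volumes of 16^3 cells
scatter = means.std()               # sub-volume-to-sub-volume scatter
```

Repeating this at each output redshift, and reading off the redshift at which each sub-volume crosses a milestone ionized fraction, gives the Delta z scatter quoted in the abstract.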




67 - Ilian T. Iliev 2005
We present the first large-scale radiative transfer simulations of cosmic reionization, in a simulation volume of (100/h Mpc)^3, while at the same time capturing the dwarf galaxies which are primarily responsible for reionization. We achieve this by combining the results from extremely large, cosmological, N-body simulations with a new, fast and efficient code for 3D radiative transfer, C^2-Ray. The resulting electron-scattering optical depth is in good agreement with the first-year WMAP polarization data. We show that reionization clearly proceeded in an inside-out fashion, with the high-density regions being ionized earlier, on average, than the voids. Ionization histories of smaller (5 to 10 comoving Mpc) subregions exhibit a large scatter about the mean and do not describe the global reionization history well. The minimum reliable volume size for such predictions is ~30 Mpc. We derive the power spectra of the neutral, ionized and total gas density fields and show that there is a significant boost of the density fluctuations in both the neutral and the ionized components relative to the total at arcminute and larger scales. We find two populations of HII regions according to their size: numerous, mid-sized (~10 Mpc) regions and a few, rare, very large regions tens of Mpc in size. We derive the statistical distributions of the ionized fraction and ionized gas density at various scales and for the first time show that both distributions are clearly non-Gaussian. (abridged)
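The spherically averaged power spectra discussed in these abstracts can be estimated from any gridded overdensity field with FFTs. A minimal NumPy sketch, assuming a periodic cubic box (this is a generic estimator, not the C^2-Ray pipeline; the normalization convention gives P(k) in units of volume):

```python
import numpy as np

def power_spectrum_3d(delta, box_size, n_bins=16):
    """Spherically averaged power spectrum P(k) of an overdensity field
    on a periodic cubic grid of physical side box_size (e.g. Mpc/h)."""
    n = delta.shape[0]
    delta_k = np.fft.fftn(delta)
    pk3d = np.abs(delta_k) ** 2 * box_size**3 / n**6   # P(k) in volume units
    k1d = 2 * np.pi * np.fft.fftfreq(n, d=box_size / n)
    kmag = np.sqrt(sum(ki**2 for ki in np.meshgrid(k1d, k1d, k1d, indexing="ij")))
    bins = np.linspace(0.0, kmag.max(), n_bins + 1)
    # Clip so the single mode at kmag.max() falls in the last bin.
    idx = np.minimum(np.digitize(kmag.ravel(), bins), n_bins)
    pk = np.array([pk3d.ravel()[idx == b].mean() if np.any(idx == b) else 0.0
                   for b in range(1, n_bins + 1)])
    return 0.5 * (bins[1:] + bins[:-1]), pk

k, pk = power_spectrum_3d(np.zeros((16, 16, 16)), box_size=100.0)
```

Linear k-bins keep the sketch short; production analyses usually bin logarithmically in k and quote the dimensionless power k^3 P(k) / (2 pi^2).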
109 - Ilian T. Iliev 2008
The Cosmic Dark Ages and the Epoch of Reionization constitute a crucial missing link in our understanding of the evolution of the intergalactic medium and the formation and evolution of galaxies. Due to the complex nature of this global process it is best studied through large-scale numerical simulations. This presents considerable computational challenges. The dominant contributors of ionizing radiation were dwarf galaxies. These tiny galaxies must be resolved in very large cosmological volumes in order to derive their clustering properties and the corresponding observational signatures correctly, which makes this one of the most challenging problems of numerical cosmology. We have recently performed the largest and most detailed simulations of the formation of early cosmological large-scale structures and their radiative feedback leading to cosmic reionization. This was achieved by running extremely large (up to 29 billion-particle) N-body simulations of the formation of the Cosmic Web, with enough particles and sufficient force resolution to resolve all the galactic halos with total masses larger than 10^8 Solar masses in computational volumes of up to (163 Mpc)^3. These results were then post-processed by propagating the ionizing radiation from all sources using a fast and accurate ray-tracing radiative transfer method. Both of our codes are parallelized using a combination of MPI and OpenMP and to date have run efficiently on up to 2048 cores (N-body) and up to 10000 cores (radiative transfer) on the newly-deployed Sun Constellation Linux Cluster at the Texas Advanced Computing Center. In this paper we describe our codes, parallelization strategies, scaling and some preliminary scientific results. (abridged)
161 - M. G. Santos 2009
While limited to low spatial resolution, the next generation low-frequency radio interferometers that target 21 cm observations during the era of reionization and prior will have instantaneous fields-of-view that are many tens of square degrees on the sky. Predictions related to various statistical measurements of the 21 cm brightness temperature must then be pursued with numerical simulations of reionization with correspondingly large volume box sizes, of order 1000 Mpc on a side. We pursue a semi-numerical scheme to simulate the 21 cm signal during and prior to reionization by extending a hybrid approach where simulations are performed by first laying down the linear dark matter density field, accounting for the non-linear evolution of the density field using the Zeldovich approximation, and then specifying the location and mass of collapsed dark matter halos using the excursion-set formalism. The locations of ionizing sources and the time-evolving ionization field are also specified using an excursion-set algorithm. We account for the brightness temperature evolution through the coupling between spin and gas temperature due to collisions, radiative coupling in the presence of Lyman-alpha photons, and heating of the intergalactic medium, such as that due to a background of X-ray photons. The hybrid simulation method we present is capable of producing the required large volume simulations with adequate resolution in a reasonable time, so a large number of realizations can be obtained with variations in the assumptions related to astrophysics and background cosmology that govern the 21 cm signal.
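The Zeldovich step of such a semi-numerical scheme moves particles from their Lagrangian positions q by a displacement field psi with psi_k = i k delta_k / k^2 (so that div psi = -delta), then advects them as x = q + D(z) psi(q). A minimal NumPy sketch, assuming a periodic cubic grid (names are illustrative, not the paper's code):

```python
import numpy as np

def zeldovich_displacement(delta_lin, box_size):
    """Zeldovich (first-order Lagrangian) displacement field from a linear
    overdensity on a periodic cubic grid: psi_k = i * k * delta_k / k^2."""
    n = delta_lin.shape[0]
    delta_k = np.fft.fftn(delta_lin)
    k1d = 2 * np.pi * np.fft.fftfreq(n, d=box_size / n)
    KX, KY, KZ = np.meshgrid(k1d, k1d, k1d, indexing="ij")
    k2 = KX**2 + KY**2 + KZ**2
    k2[0, 0, 0] = 1.0          # avoid 0/0; the k=0 mode carries no displacement
    psi = [np.fft.ifftn(1j * K * delta_k / k2).real for K in (KX, KY, KZ)]
    return np.stack(psi)       # shape (3, n, n, n)

# Demo: a single plane wave delta = sin(2*pi*x/L) should give
# psi_x = cos(2*pi*x/L) / (2*pi/L) and no y/z displacement.
n, L = 32, 1.0
x = np.arange(n) * L / n
delta = np.broadcast_to(np.sin(2 * np.pi * x / L)[:, None, None], (n, n, n))
psi = zeldovich_displacement(delta, L)
```

The plane-wave demo is exact on the discrete grid, which makes the sign and normalization conventions easy to verify before applying the transform to a realistic Gaussian random field.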
280 - F.Vazza 2019
The growth of large-scale cosmic structure is a beautiful exemplification of how complexity can emerge in our Universe, starting from simple initial conditions and simple physical laws. Using Enzo cosmological numerical simulations, I applied tools from Information Theory (namely, statistical complexity) to quantify the amount of complexity in the simulated cosmic volume, as a function of cosmic epoch and environment. This analysis quantifies how difficult it is to predict, at least in a statistical sense, the evolution of the thermal, kinetic and magnetic energy of the dominant component of ordinary matter in the Universe (the intergalactic medium plasma). The most complex environment in the simulated cosmic web is generally found to be the periphery of large-scale structures (e.g. galaxy clusters and filaments), where the complexity is on average ~10-10^2 times larger than in more rarefied regions, even if the latter dominate the volume-integrated complexity of the simulated Universe. If the energy evolution of gas in the cosmic web is measured at ~100 kpc/h resolution and over a ~200 Myr timescale, its total complexity is in the range ~10^16-10^17 bits, with little dependence on the assumed gas physics, cosmology or cosmic variance.
Traditional large-scale models of reionization usually employ simple deterministic relations between halo mass and luminosity to predict how reionization proceeds. We here examine the impact on modelling reionization of using more detailed models for the ionizing sources as identified within the 100 Mpc/h cosmological hydrodynamic simulation Simba, coupled with post-processed radiative transfer. Comparing with simple (one-to-one) models, the main difference of using Simba sources is the scatter in the relation between dark matter halos and star formation, and hence ionizing emissivity. We find that, at the power spectrum level, the ionization morphology remains mostly unchanged, regardless of the variability in the number of sources or escape fraction. Our results show that simplified models of ionizing sources remain viable to efficiently model the structure of reionization on cosmological scales, although the precise progress of reionization requires accounting for the scatter induced by astrophysical effects.
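The contrast between a one-to-one source model and a scattered one can be mimicked by adding lognormal scatter to the halo-to-emissivity relation. A toy sketch assuming NumPy; the f_esc value, the linear mass scaling, and the 0.3 dex scatter are illustrative placeholders, not Simba's calibration:

```python
import numpy as np

def ionizing_emissivity(m_halo, f_esc=0.1, scatter_dex=0.3, rng=None):
    """Toy source model: ionizing emissivity proportional to halo mass,
    with lognormal scatter (in dex) mimicking variable star formation.
    scatter_dex=0 recovers the deterministic one-to-one relation."""
    rng = np.random.default_rng(0) if rng is None else rng
    base = f_esc * np.asarray(m_halo, dtype=float)
    return base * 10.0 ** (scatter_dex * rng.standard_normal(base.shape))

m = np.logspace(8, 12, 5)                        # toy halo masses
det = ionizing_emissivity(m, scatter_dex=0.0)    # one-to-one model
sto = ionizing_emissivity(m, scatter_dex=0.3)    # scattered model
```

Feeding both emissivity fields through the same radiative transfer step is the kind of comparison the abstract describes: the mean emissivity is similar, but individual halos can differ by factors of a few.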