
The Complexity and Information Content of Simulated Universes

Authors: Franco Vazza
Publication date: 2020
Field: Physics
Language: English





The emergence of a complex, large-scale organisation of cosmic matter into the Cosmic Web is a beautiful exemplification of how complexity can be produced by simple initial conditions and simple physical laws. In the epoch of Big Data in astrophysics, connecting the stunning variety of multi-messenger observations to the complex interplay of fundamental physical processes is an open challenge. In this contribution, I discuss a few relevant applications of Information Theory to the task of objectively measuring the complexity of modern numerical simulations of the Universe. When applied to cosmological simulations, complexity analysis makes it possible to measure the total information necessary to model the cosmic web. It also allows us to monitor which physical processes are most responsible for the emergence of complex dynamical behaviour across cosmic epochs and environments, and possibly to improve mesh refinement strategies in the future.
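As a minimal, purely illustrative sketch of the kind of measurement involved (not the specific estimators used in the paper), the Python snippet below computes the Shannon entropy of a binned scalar field; the lognormal random cube standing in for a simulated gas-density field, the grid size, and the number of bins are all placeholder assumptions.

import numpy as np

def shannon_entropy(field, n_bins=64):
    """Shannon entropy (in bits) of a scalar field, estimated by
    histogramming its values into n_bins equal-width bins."""
    counts, _ = np.histogram(field.ravel(), bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins (0*log(0) -> 0)
    return -np.sum(p * np.log2(p))

# Placeholder stand-in for a simulated density cube: a lognormal random field.
rng = np.random.default_rng(42)
density = np.exp(rng.normal(size=(64, 64, 64)))

print(f"Entropy of the log-density distribution: "
      f"{shannon_entropy(np.log10(density)):.2f} bits per cell")

Tracking estimators of this kind across cosmic epochs, environments, and physical modules is what turns them into a diagnostic of where complex dynamical behaviour emerges in a simulation.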



Related research

Julien Carron (2014)
A large fraction of this thesis is dedicated to the study of the information content of random fields with heavy tails, in particular the lognormal field, a model for the matter density fluctuation field. It is well known that in the nonlinear regime of structure formation, the matter fluctuation field develops such large tails. It has also been suggested that fields with large tails are not necessarily well described by the hierarchy of $N$-point functions. In this thesis, we are able to make this last statement precise and, with the help of the lognormal model, to quantify precisely its implications for inference on cosmological parameters: we find as our main result that only a tiny fraction of the total Fisher information of the field is still contained in the hierarchy of $N$-point moments in the nonlinear regime, rendering parameter inference from such moments very inefficient. We show that the hierarchy fails to capture the information that is contained in the underdense regions, which at the same time are found to be the richest in information. We further find our results to be very consistent with numerical analysis using $N$-body simulations. We also discuss these issues with the help of explicit families of fields with the same hierarchy of $N$-point moments defined in this work. A similar analysis is then applied to the convergence field, the weighted projection of the matter density fluctuation field along the line of sight, with similar conclusions. We also show how simple mappings can correct for this inadequacy, consistently with previous findings in the literature. (Abridged)
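For reference, two standard definitions underlying this line of work (not equations quoted from the thesis itself): the Fisher information matrix for parameters $\theta_\alpha$ with likelihood $\mathcal{L}$, and the zero-mean lognormal model for the density contrast $\delta$ built from a Gaussian field $A$.

\begin{align}
  F_{\alpha\beta} &= \left\langle -\,\frac{\partial^{2}\ln\mathcal{L}}
      {\partial\theta_\alpha\,\partial\theta_\beta} \right\rangle , \\
  1+\delta &= \exp\!\left(A-\tfrac{1}{2}\sigma_A^{2}\right),
  \qquad A\sim\mathcal{N}\!\left(0,\sigma_A^{2}\right),
\end{align}

so that $\langle\delta\rangle=0$ while the tail of $\delta$ is much heavier than that of the Gaussian field $A$.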
We use analytic computations to predict the power spectrum as well as the bispectrum of Cosmic Infrared Background (CIB) anisotropies. Our approach is based on the halo model and takes into account the mean luminosity-mass relation. The model is used to forecast the possibility of simultaneously constraining cosmological, CIB and halo occupation distribution (HOD) parameters in the presence of foregrounds. For the analysis we use wavelengths in eight frequency channels between 200 and 900 GHz, with survey specifications given by Planck and LiteBird. We explore the sensitivity to the model parameters up to multipoles of $\ell = 1000$ using auto- and cross-correlations between the different frequency bands. With this setting, cosmological, HOD and CIB parameters can be constrained to a few percent. Galactic dust is modeled by a power law, and the shot noise contribution as a frequency-dependent amplitude; both are marginalized over. We find that dust residuals in the CIB maps only marginally influence constraints on standard cosmological parameters. Furthermore, the bispectrum yields tighter constraints (by a factor of four in $1\sigma$ errors) on almost all model parameters, while the degeneracy directions are very similar to those of the power spectrum. The increase in sensitivity is most pronounced for the sum of the neutrino masses. Due to the similarity of degeneracies, a combination of both analyses is not needed for most parameters. This, however, might be due to the simplified bias description generally adopted in such halo model approaches.
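As a rough sketch of how such forecasts are usually assembled (the actual covariances, frequency channels, and parameter set of the paper are not reproduced here), a Gaussian Fisher matrix over multi-frequency angular power spectra can be coded as below; the sky fraction, toy covariance, and parameter derivatives are placeholders.

import numpy as np

def fisher_matrix(cov, dcov, ells, f_sky=0.4):
    """Gaussian Fisher matrix for multi-frequency angular power spectra.

    cov  : (n_ell, n_freq, n_freq) covariance C_ell per multipole
    dcov : (n_par, n_ell, n_freq, n_freq) derivatives dC_ell/dtheta
    """
    n_par = dcov.shape[0]
    F = np.zeros((n_par, n_par))
    for k, ell in enumerate(ells):
        inv_c = np.linalg.inv(cov[k])
        weight = 0.5 * (2 * ell + 1) * f_sky
        for i in range(n_par):
            for j in range(n_par):
                F[i, j] += weight * np.trace(inv_c @ dcov[i, k] @ inv_c @ dcov[j, k])
    return F

# Toy example: 2 frequency channels, 2 parameters, made-up derivatives.
ells = np.arange(2, 1001)
cov = np.array([np.eye(2) * (1.0 + 1.0e3 / ell**2) for ell in ells])
dcov = np.stack([
    np.array([np.eye(2) * (1.0e3 / ell**2) for ell in ells]),  # placeholder dC/dtheta_1
    np.array([np.full((2, 2), 0.05) for _ in ells]),           # placeholder dC/dtheta_2
])
errors = np.sqrt(np.diag(np.linalg.inv(fisher_matrix(cov, dcov, ells))))
print("1-sigma errors:", errors)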
Cosmologists aim to model the evolution of initially low-amplitude Gaussian density fluctuations into the highly non-linear cosmic web of galaxies and clusters, and to compare simulations of this structure formation process with observations of large-scale structure traced by galaxies in order to infer the properties of the dark energy and dark matter that make up 95% of the universe. Such ensembles of simulations of billions of galaxies are computationally demanding, so more efficient approaches to tracing the non-linear growth of structure are needed. We build a V-Net based model that transforms fast linear predictions into fully nonlinear predictions from numerical simulations. Our NN model learns to emulate the simulations down to small scales and is both faster and more accurate than the current state-of-the-art approximate methods. It also achieves comparable accuracy when tested on universes with cosmological parameters significantly different from those used in training. This suggests that our model generalizes well beyond our training set.
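To make the architecture idea concrete, here is a deliberately tiny PyTorch sketch of a 3D encoder-decoder with a residual connection, in the spirit of a V-Net; the real emulator, its depth, loss function, and training data are not reproduced here, and every size below is a placeholder.

import torch
import torch.nn as nn

class TinyVNet3D(nn.Module):
    """Minimal 3D encoder-decoder with a residual connection; the actual
    emulator described in the paper is deeper and more elaborate."""

    def __init__(self, channels=16):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv3d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv3d(channels, channels, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(
            nn.ConvTranspose3d(channels, channels, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(channels, 1, 3, padding=1),
        )

    def forward(self, linear_field):
        # Residual connection: the network learns the nonlinear
        # correction on top of the fast linear prediction.
        return linear_field + self.dec(self.enc(linear_field))

# Toy usage: a batch of one 32^3 linear density field.
x = torch.randn(1, 1, 32, 32, 32)
y = TinyVNet3D()(x)
print(y.shape)  # torch.Size([1, 1, 32, 32, 32])

Learning the correction on top of the linear field, rather than predicting the nonlinear field from scratch, is a common design choice for this kind of emulator, since the residual is smaller and easier to fit.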
Anisotropies in the cosmic microwave background (CMB) are primarily generated by Thomson scattering of photons off free electrons. Around recombination, the Thomson scattering probability quickly diminishes as the free electrons combine with protons to form neutral hydrogen, off which CMB photons can scatter through Rayleigh scattering. Unlike Thomson scattering, Rayleigh scattering is frequency dependent, resulting in the generation of anisotropies with a different spectral dependence. Unfortunately, the Rayleigh scattering efficiency rapidly decreases with the expansion of the neutral universe, with the result that only a small percentage of photons are scattered by neutral hydrogen. Although the effect is very small, future CMB missions with higher sensitivity and improved frequency coverage are poised to measure the Rayleigh scattering signal. The uncorrelated component of the Rayleigh anisotropies contains unique information on the primordial perturbations that could potentially be leveraged to expand our knowledge of the early universe. In this paper we explore whether measurements of Rayleigh scattering anisotropies can be used to constrain primordial non-Gaussianity (NG) and examine the hints of anomalies found by the WMAP and \textit{Planck} satellites. We show that the additional Rayleigh information has the potential to improve primordial NG constraints by $30\%$ or more. Primordial bispectra that are not of the local type benefit the most from these additional scatterings, which we attribute to the different scale dependence of the Rayleigh anisotropies. Unfortunately, this different scaling means that Rayleigh measurements cannot be used to constrain anomalies or features on large scales. On the other hand, constraints on anomalies that may persist to smaller scales, such as the potential power asymmetry seen in WMAP and \textit{Planck}, could be improved by the addition of Rayleigh measurements.
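For context, the leading-order frequency scaling that distinguishes Rayleigh scattering from (frequency-independent) Thomson scattering can be written as follows, with $\sigma_{\rm T}$ the Thomson cross-section and $\nu_{\rm eff}$ an effective frequency of the order of the hydrogen Lyman-$\alpha$ frequency; higher-order terms are omitted here.

\begin{equation}
  \sigma_{\rm R}(\nu) \;\simeq\; \sigma_{\rm T}
  \left(\frac{\nu}{\nu_{\rm eff}}\right)^{4},
  \qquad \nu \ll \nu_{\rm eff},
\end{equation}

so the Rayleigh signal grows steeply towards the highest CMB frequency channels.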
Aims: We evaluate the radial velocity (RV) information content and achievable precision on M0-M9 spectra covering the ZYJHK bands. We do so while considering both a perfect atmospheric transmission correction and the discarding of areas polluted by deep telluric features, as done in previous works. Methods: To simulate the M-dwarf spectra, PHOENIX-ACES model spectra were employed; they were convolved with rotational kernels and instrumental profiles to reproduce stars with a $v\sin{i}$ of 1.0, 5.0, and 10.0 km/s when observed at resolutions of 60 000, 80 000, and 100 000. We considered the RV precision as calculated on the whole spectrum, after discarding strongly polluted areas, and after applying a perfect telluric correction. In our simulations we paid particular attention to the details of the convolution and sampling of the spectra, and we discuss their impact on the final spectra. Results: Our simulations show that the most important parameter ruling the difference in attainable precision between the considered bands is the spectral type. For M0-M3 stars, the bands that deliver the most precise RV measurements are the Z, Y, and H bands, with relative merits depending on the parameters of the simulation. For M6-M9 stars, the bands show a difference in precision that is within a factor of $\sim$2 and does not clearly depend on the band; this difference is reduced to a factor smaller than $\sim$1.5 if we consider a non-rotating star seen at high resolution. We also show that an M6-M9 spectrum will deliver a precision about two times better than an M0-M3 spectrum with the same signal-to-noise ratio. Finally, we note that the details of modelling the Earth's atmosphere and interpreting the results have a significant impact on which wavelength regions are discarded when setting a limit threshold at 2-3%. (abridged)
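As an illustration of how RV information content is commonly quantified (a sketch based on the standard photon-noise, optimum-weight formalism of Bouchy, Pepe & Queloz 2001, not code from the paper), the snippet below computes the photon-limited RV uncertainty of a toy spectrum; the wavelength range, line shape, and flux level are placeholders.

import numpy as np
from scipy.constants import c  # speed of light [m/s]

def rv_precision(wavelength, flux):
    """Photon-limited RV uncertainty (m/s) of a spectrum, using
    per-pixel optimum weights W = lambda^2 (dA/dlambda)^2 / A."""
    dA_dl = np.gradient(flux, wavelength)      # spectral derivative
    W = wavelength**2 * dA_dl**2 / flux        # per-pixel RV weight
    return c / np.sqrt(np.sum(W))              # sigma_RV in m/s

# Toy spectrum: one Gaussian absorption line on a flat continuum.
wl = np.linspace(1000.0, 1001.0, 500)          # nm (placeholder band)
flux = 1e4 * (1.0 - 0.5 * np.exp(-0.5 * ((wl - 1000.5) / 0.02) ** 2))
print(f"sigma_RV ~ {rv_precision(wl, flux):.1f} m/s")

Because the weights scale with the square of the spectral derivative, sharp and deep lines dominate the attainable precision, which is why spectral type and rotational broadening matter so much in the comparison above.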