By utilizing large-scale graph analytic tools implemented on the modern Big Data platform Apache Spark, we investigate the topological structure of gravitational clustering in five different universes produced by cosmological $N$-body simulations with varying parameters: (1) a WMAP 5-year compatible $\Lambda$CDM cosmology, (2) two different dark energy equation of state variants, and (3) two different cosmic matter density variants. For the Big Data calculations, we use a custom-built stand-alone Spark/Hadoop cluster at the Korea Institute for Advanced Study (KIAS) and the Dataproc Compute Engine on the Google Cloud Platform (GCP), with sample sizes ranging from 7 million to 200 million. We find that, among the many possible graph-topological measures, three simple ones, namely (1) the average number of neighbors (the so-called average vertex degree) $\alpha$, (2) the closed-to-connected triple fraction (the so-called transitivity) $\tau_\Delta$, and (3) the cumulative number density $n_{s\ge 5}$ of subcomponents with connected component size $s \ge 5$, can effectively discriminate among the five model universes. Since these graph-topological measures are directly related to the usual $n$-point correlation functions of the cosmic density field, graph-topological statistics powered by Big Data computational infrastructure open a new, intuitive, and computationally efficient window into the dark Universe.
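The three graph statistics named above can be illustrated at toy scale with standard Python tools (the paper itself computes them with Apache Spark on $10^7$--$10^8$ points). The point count, linking length, and friends-of-friends-style edge construction below are illustrative assumptions, not the paper's actual pipeline:

```python
# Toy-scale sketch of the three graph-topological measures:
# average vertex degree alpha, transitivity tau, and the count of
# connected components with size s >= 5 (per unit volume here).
import numpy as np
import networkx as nx
from scipy.spatial import cKDTree

rng = np.random.default_rng(42)
points = rng.random((500, 3))      # toy "galaxy" positions in a unit box
linking_length = 0.06              # illustrative choice, not the paper's

# Link every pair closer than the linking length (friends-of-friends style).
pairs = cKDTree(points).query_pairs(r=linking_length)
G = nx.Graph()
G.add_nodes_from(range(len(points)))
G.add_edges_from(pairs)

# (1) average vertex degree alpha
alpha = 2 * G.number_of_edges() / G.number_of_nodes()
# (2) transitivity tau: fraction of connected triples that are closed
tau = nx.transitivity(G)
# (3) cumulative number of components with size s >= 5
n_s5 = sum(1 for c in nx.connected_components(G) if len(c) >= 5)

print(alpha, tau, n_s5)
```

The same three quantities are cheap to accumulate in a distributed setting, which is what makes them attractive at the sample sizes quoted above.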
We reexamine big bang nucleosynthesis with large-scale baryon density inhomogeneities when the length scale of the density fluctuations exceeds the neutron diffusion length ($\sim 10^7$--$10^8$ cm at BBN), and the amplitude of the fluctuations is sufficiently small to prevent gravitational collapse. In this limit, the final light element abundances can be determined by simply mixing the abundances from regions with different baryon/photon ratios without interactions. We examine Gaussian, lognormal, and gamma distributions for the baryon/photon ratio, $\eta$. We find that the deuterium and lithium-7 abundances increase with the RMS fluctuation in $\eta$, while the effect on helium-4 is much smaller. We show that these increases in the deuterium and lithium-7 abundances are a consequence of Jensen's inequality, and we derive analytic approximations for these abundances in the limit of small RMS fluctuations. Observational upper limits on the primordial deuterium abundance constrain the RMS fluctuation in $\eta$ to be less than $17\%$ of the mean value of $\eta$. This provides us with a new limit on the graininess of the early universe.
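The Jensen's-inequality effect invoked above is easy to demonstrate numerically: for a convex abundance function $A(\eta)$, averaging over regions gives $\langle A(\eta)\rangle > A(\langle\eta\rangle)$. The sketch below uses an assumed power-law scaling $\mathrm{D/H} \propto \eta^{-1.6}$, a commonly quoted approximation, as a stand-in for the paper's full BBN calculation:

```python
# Demonstrate that mixing regions with fluctuating eta raises the mean
# deuterium abundance relative to the homogeneous case (Jensen's inequality,
# since the toy abundance function below is convex and decreasing in eta).
import numpy as np

rng = np.random.default_rng(0)
eta_mean = 6.1e-10                 # roughly the observed baryon/photon ratio
sigma = 0.17 * eta_mean            # RMS fluctuation at the quoted 17% limit

def dh_abundance(eta):
    """Toy D/H abundance: assumed power law, convex and decreasing in eta."""
    return 2.5e-5 * (eta / eta_mean) ** -1.6

eta_samples = rng.normal(eta_mean, sigma, 1_000_000)
eta_samples = eta_samples[eta_samples > 0]    # keep the distribution physical

mixed = dh_abundance(eta_samples).mean()      # <A(eta)>, mixed regions
homog = dh_abundance(eta_mean)                # A(<eta>), homogeneous universe
print(mixed > homog)
```

For $A \propto \eta^{-\gamma}$ and small fluctuations, a Taylor expansion gives $\langle A\rangle \approx A(\bar\eta)\,[1 + \tfrac{1}{2}\gamma(\gamma+1)\,\sigma^2/\bar\eta^2]$, the kind of analytic approximation the abstract refers to (with $\gamma = 1.6$ and $\sigma/\bar\eta = 0.17$, an increase of roughly 6%).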
We study models in which neutrino masses are generated dynamically at cosmologically late times. Our study is purely phenomenological and parameterized in terms of three effective parameters characterizing the redshift of mass generation, the width of the transition region, and the present-day neutrino mass. We also study the possibility that neutrinos become strongly self-interacting at the time when the mass is generated. We find that in a number of cases, models with large present-day neutrino masses are allowed by current CMB, BAO and supernova data. The increase in the allowed mass range makes it possible that a non-zero neutrino mass could be measured in direct detection experiments such as KATRIN. Intriguingly, we also find that there are allowed models in which neutrinos become strongly self-interacting around the epoch of recombination.
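A three-parameter transition of the kind described (transition redshift, width, present-day mass) can be sketched with a smooth switch-on function. The tanh form below is a hypothetical choice for illustration; the abstract does not specify the paper's actual functional form, and the default parameter values are placeholders:

```python
# Hypothetical smooth parametrization of a late-time neutrino mass:
# m(z) rises from ~0 at high redshift to m0 today, with the transition
# centred at z_s and spread over a width dz (all values illustrative).
import numpy as np

def neutrino_mass(z, m0=0.5, z_s=2000.0, dz=100.0):
    """Neutrino mass in eV as a function of redshift z.
    m0: present-day mass; z_s: transition redshift; dz: transition width."""
    return 0.5 * m0 * (1.0 + np.tanh((z_s - z) / dz))

# Early universe: massless; today: full mass m0.
print(neutrino_mass(1e5), neutrino_mass(0.0))
```

Any monotone interpolation with the same three parameters would serve the same phenomenological role; what matters for the CMB/BAO/supernova fits is when and how quickly the mass turns on.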
Upcoming surveys such as LSST and Euclid will significantly improve the power of weak lensing as a cosmological probe. To maximise the information that can be extracted from these surveys, it is important to explore novel statistics that complement standard weak lensing statistics such as the shear-shear correlation function and peak counts. In this work, we use a recently proposed weak lensing (WL) observable -- weak lensing voids -- to make parameter constraint forecasts for an LSST-like survey. We use the cosmoSLICS $w$CDM simulation suite to measure void statistics as a function of cosmological parameters. The simulation data are used to train a Gaussian process regression emulator that we use to generate likelihood contours and provide parameter constraints from mock observations. We find that the void abundance is more constraining than the tangential shear profiles, though the combination of the two gives additional constraining power. We forecast that without tomographic decomposition, these void statistics can constrain the matter fluctuation amplitude, $S_8$, within 0.3% (68% confidence interval), while offering 1.5%, 1.5%, and 2.7% precision on the matter density parameter, $\Omega_{\rm m}$, the reduced Hubble constant, $h$, and the dark energy equation of state parameter, $w_0$, respectively. These results are tighter than the constraints from the shear-shear correlation function with the same observational specifications for $\Omega_{\rm m}$, $S_8$ and $w_0$. The constraints from the WL voids also have complementary parameter degeneracy directions to the shear 2PCF for all combinations of parameters that include $h$, making weak lensing void statistics a promising cosmological probe.
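The emulation step described above, mapping cosmological parameters to a measured statistic via Gaussian process regression, can be sketched with scikit-learn. The training data, kernel choice, and parameter ranges below are toy assumptions standing in for the cosmoSLICS measurements:

```python
# Minimal Gaussian-process emulator sketch: fit parameter -> statistic
# mappings at simulated "nodes", then predict (with uncertainty) at a
# trial cosmology. All numbers here are fake placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)
# Toy nodes: (Omega_m, S_8, h, w0) -> a single void-abundance value.
X = rng.uniform([0.1, 0.6, 0.6, -2.0], [0.5, 0.9, 0.8, -0.5], size=(26, 4))
y = 100.0 * X[:, 0] - 40.0 * X[:, 1] + rng.normal(0.0, 0.1, 26)  # fake statistic

gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[1.0] * 4),
    normalize_y=True,
)
gp.fit(X, y)

theta = np.array([[0.3, 0.8, 0.7, -1.0]])        # trial cosmology
mean, std = gp.predict(theta, return_std=True)    # emulated statistic + error
print(mean[0], std[0])
```

In a likelihood analysis the emulator replaces expensive simulations inside the sampler: each proposed parameter point is evaluated through `gp.predict` rather than a new $N$-body run.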
We discuss experimental constraints on the free parameter of the nonextensive kinetic theory from measurements of the thermal dispersion relation in a collisionless plasma. For electrostatic plane-wave propagation, we show through a statistical analysis that a good agreement between theory and experiment is possible if the allowed values of the $q$-parameter are restricted by $q = 0.77 \pm 0.03$ at 95% confidence level (or equivalently, $2-q = 1.23$, in the widely adopted convention for the entropic index $q$). Such a result rules out (by a large statistical margin) the standard Bohm-Gross dispersion relation, which is derived by assuming that the stationary Maxwellian distribution ($q=1$) is the unperturbed solution.
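For reference, the standard Bohm-Gross dispersion relation (the Maxwellian, $q=1$ case that the abstract says is ruled out) reads, for $k\lambda_D \ll 1$,
$$\omega^2 = \omega_p^2\left(1 + 3k^2\lambda_D^2\right),$$
where $\omega_p$ is the electron plasma frequency and $\lambda_D$ the Debye length. The nonextensive ($q \ne 1$) generalization modifies the thermal correction term; the abstract does not give its explicit form, so it is not reproduced here.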
We investigate the idea that current cosmic acceleration could be the consequence of gravitational leakage into extra dimensions on cosmological scales rather than the result of a non-zero cosmological constant, and consider the ability of future gravitational-wave siren observations to probe this phenomenon and constrain the parameters of phenomenological models of this gravitational leakage. In theories that include additional non-compact spacetime dimensions, the gravitational leakage into extra dimensions leads to a reduction in the amplitude of observed gravitational waves and thereby a systematic discrepancy between the distances inferred to such sources from GW and EM observations. We investigate the capability of a space-based gravitational-wave interferometer such as LISA to probe this modified gravity on large scales. We find that the extent to which LISA will be able to place limits on the number of spacetime dimensions and other cosmological parameters characterising modified gravity will strongly depend on the actual number and redshift distribution of sources, together with the uncertainty on the GW measurements. A relatively small number of sources ($\sim 1$) and high measurement uncertainties would strongly restrict the ability of LISA to place meaningful constraints on the parameters in cosmological scenarios where gravity is only five-dimensional and modified at scales larger than $\sim 4$ times the Hubble radius. Conversely, if the number of sources observed amounts to a four-year average of $\sim 27$, then in the most favourable cosmological scenarios LISA has the potential to place meaningful constraints on the cosmological parameters with a precision of $\sim 1\%$ on the number of dimensions and $\sim 7.5\%$ on the scale beyond which gravity is modified, thereby probing the late expansion of the universe up to a redshift of $\sim 8$.
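The GW/EM distance discrepancy from leakage is often modelled in such siren studies with a phenomenological damping law of the form $d_L^{\rm GW} = d_L^{\rm EM}\,[1 + (d_L^{\rm EM}/R_c)^n]^{(D-4)/(2n)}$, with $D$ spacetime dimensions, crossover scale $R_c$, and transition steepness $n$. Whether this matches the paper's exact convention is an assumption; the sketch below only illustrates the qualitative behaviour:

```python
# Assumed phenomenological parametrization of GW amplitude damping from
# leakage into D-4 extra dimensions beyond a crossover scale R_c.
# d_em and R_c share the same units (e.g. Hubble radii); n sets how
# sharply the transition turns on.
def gw_luminosity_distance(d_em, D=5, R_c=4.0, n=1.0):
    """GW-inferred luminosity distance for a source at EM distance d_em."""
    return d_em * (1.0 + (d_em / R_c) ** n) ** ((D - 4) / (2.0 * n))

# D=4 recovers general relativity (no discrepancy); for D=5 the GW
# distance exceeds the EM distance once d_em approaches R_c.
print(gw_luminosity_distance(10.0, D=4), gw_luminosity_distance(10.0, D=5))
```

Constraining $D$, $R_c$, and $n$ then amounts to fitting this relation to the observed spread between siren distances and the EM distance-redshift relation, which is why the source count and measurement uncertainties quoted above dominate the forecast.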