
Constraining the dark energy models with H(z) data: an approach independent of $H_{0}$

Added by Spyros Basilakos
Publication date: 2017
Fields: Physics
Language: English





We study the performance of the latest $H(z)$ data in constraining the cosmological parameters of different cosmological models, including the Chevallier-Polarski-Linder $w_{0}w_{1}$ parametrization. First, we introduce a statistical procedure in which the chi-square estimator is not affected by the value of the Hubble constant. As a result, we find that the $H(z)$ data do not rule out the possibility of either non-flat models or dynamical dark energy cosmological models. However, we verify that the time-varying equation-of-state parameter $w(z)$ is not constrained by the current expansion data. Combining the $H(z)$ and Type Ia supernova data, we find that the $H(z)$/SNIa overall statistical analysis provides a substantial improvement of the cosmological constraints with respect to those of the $H(z)$ analysis alone. Moreover, the $w_{0}-w_{1}$ parameter space provided by the $H(z)$/SNIa joint analysis is in very good agreement with that of Planck 2015, which confirms that the present analysis with the $H(z)$ and SNIa probes correctly traces the expansion of the Universe as found by the Planck team. Finally, we generate sets of Monte Carlo realizations in order to quantify the ability of the $H(z)$ data to provide strong constraints on the dark energy model parameters. The Monte Carlo approach shows a significant improvement of the constraints when the sample is increased to 100 $H(z)$ measurements. Such a goal can be achieved in the future, especially in light of the next generation of surveys.
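The $H_0$-independent chi-square described above can be illustrated with a standard analytic trick: since $H(z)=H_0 E(z)$, the chi-square is quadratic in $H_0$ and can be minimized over it in closed form, leaving an estimator that depends only on the shape of $E(z)$. The paper's exact estimator may differ; the sketch below (function and parameter names are illustrative, flat $\Lambda$CDM assumed) shows the idea.

```python
import numpy as np

def E_lcdm(z, Om):
    """Dimensionless expansion rate E(z) = H(z)/H0 for flat LambdaCDM."""
    return np.sqrt(Om * (1.0 + z)**3 + (1.0 - Om))

def chi2_marg_H0(z, H_obs, sigma, Om):
    """Chi-square with H0 minimized out analytically.

    chi2(H0) = sum((H_obs - H0*E)^2 / sigma^2) is quadratic in H0,
    so its minimum equals A - B^2/C with the sums below; the result
    no longer depends on any assumed value of the Hubble constant.
    """
    E = E_lcdm(z, Om)
    A = np.sum(H_obs**2 / sigma**2)
    B = np.sum(H_obs * E / sigma**2)
    C = np.sum(E**2 / sigma**2)
    return A - B**2 / C
```

Monte Carlo realizations like those in the abstract can then be generated by drawing mock $H(z)$ points from a fiducial model plus Gaussian noise and refitting each realization with this estimator.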




The differential age data of astrophysical objects that have evolved passively during the history of the universe (e.g. red galaxies) allow us to test theoretical cosmological models through the predicted Hubble function expressed in terms of the redshift $z$, $H(z)$. We use the observational $H(z)$ data to test unified scenarios for dark matter and dark energy. Specifically, we focus our analysis on the Generalized Chaplygin Gas (GCG) and viscous fluid (VF) models. For the GCG model, it is shown that the unified scenario for dark energy and dark matter requires some priors. For the VF model we obtain estimates of the free parameters that may be compared with further analysis, mainly at the perturbative level.
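For concreteness, a flat-universe GCG fit of this kind can be sketched as follows. This assumes the standard unified-GCG expansion rate $E^2(z)=[A_s+(1-A_s)(1+z)^{3(1+\alpha)}]^{1/(1+\alpha)}$ with baryons neglected; the parameter names $A_s$ and $\alpha$ follow the usual GCG convention and are not necessarily the paper's notation.

```python
import numpy as np

def E_gcg(z, As, alpha):
    """E(z) = H(z)/H0 for a flat universe filled with Generalized
    Chaplygin Gas, p = -A / rho^alpha (baryons neglected -- an
    assumption of this sketch, not of the paper)."""
    return (As + (1.0 - As) * (1.0 + z)**(3.0 * (1.0 + alpha)))**(0.5 / (1.0 + alpha))

def chi2_gcg(z, H_obs, sigma, H0, As, alpha):
    """Standard chi-square of the model H(z) = H0 * E(z) against data."""
    H_mod = H0 * E_gcg(z, As, alpha)
    return np.sum((H_obs - H_mod)**2 / sigma**2)
```

Setting `alpha = 0` recovers flat $\Lambda$CDM with $\Omega_m = 1 - A_s$, which is a quick sanity check on the implementation.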
Ming Zhang, Bo Wang, Peng-Ju Wu (2021)
We forecast constraints on cosmological parameters in the interacting dark energy models using the mock data generated for neutral hydrogen intensity mapping (IM) experiments. In this work, we only consider the interacting dark energy models with energy transfer rate $Q=\beta H\rho_{\rm c}$, and take BINGO, FAST, SKA1-MID, and Tianlai as typical examples of the 21 cm IM experiments. We find that the Tianlai cylinder array will play an important role in constraining the interacting dark energy model. Assuming perfect foreground removal and calibration, and using the Tianlai-alone data, we obtain $\sigma(H_0)=0.19$ km s$^{-1}$ Mpc$^{-1}$, $\sigma(\Omega_{\rm m})=0.0033$ and $\sigma(\sigma_8)=0.0033$ in the I$\Lambda$CDM model, which are much better than the results of Planck+optical BAO (i.e. optical galaxy surveys). However, the Tianlai-alone data cannot provide a very tight constraint on the coupling parameter $\beta$ compared with Planck+optical BAO, while the Planck+Tianlai data can give a rather tight constraint of $\sigma(\beta)=0.00023$ due to the parameter degeneracies being well broken by the data combination. In the I$w$CDM model, we obtain $\sigma(\beta)=0.00079$ and $\sigma(w)=0.013$ from Planck+Tianlai. In addition, we also make a detailed comparison among BINGO, FAST, SKA1-MID, and Tianlai in constraining the interacting dark energy models. We show that future 21 cm IM experiments will provide a useful tool for exploring the nature of dark energy and play a significant role in measuring the coupling between dark energy and dark matter.
An axion-like field comprising $\sim 10\%$ of the energy density of the universe near matter-radiation equality is a candidate to resolve the Hubble tension; this is the early dark energy (EDE) model. However, as shown in Hill et al. (2020), the model fails to simultaneously resolve the Hubble tension and maintain a good fit to both cosmic microwave background (CMB) and large-scale structure (LSS) data. Here, we use redshift-space galaxy clustering data to sharpen constraints on the EDE model. We perform the first EDE analysis using the full-shape power spectrum likelihood from the Baryon Oscillation Spectroscopic Survey (BOSS), based on the effective field theory (EFT) of LSS. The inclusion of this likelihood in the EDE analysis yields a $25\%$ tighter error bar on $H_0$ compared to primary CMB data alone, yielding $H_0 = 68.54^{+0.52}_{-0.95}$ km/s/Mpc ($68\%$ CL). In addition, we constrain the maximum fractional energy density contribution of the EDE to $f_{\rm EDE} < 0.072$ ($95\%$ CL). We explicitly demonstrate that the EFT BOSS likelihood yields much stronger constraints on EDE than the standard BOSS likelihood. Including further information from photometric LSS surveys, the constraints narrow by an additional $20\%$, yielding $H_0 = 68.73^{+0.42}_{-0.69}$ km/s/Mpc ($68\%$ CL) and $f_{\rm EDE}<0.053$ ($95\%$ CL). These bounds are obtained without including local-universe $H_0$ data, which is in strong tension with the CMB and LSS, even in the EDE model. We also refute claims that MCMC analyses of EDE that omit SH0ES from the combined dataset yield misleading posteriors. Finally, we demonstrate that upcoming Euclid/DESI-like spectroscopic galaxy surveys can greatly improve the EDE constraints. We conclude that current data preclude the EDE model as a resolution of the Hubble tension, and that future LSS surveys can close the remaining parameter space of this model.
$Om(z)$ is a diagnostic approach to distinguish dark energy models. However, few articles discuss what the distinguishing criterion is. In this paper, we first smooth the latest observational $H(z)$ data using a model-independent method, Gaussian processes, and then reconstruct $Om(z)$ and its first-order derivative $\mathcal{L}^{(1)}_m$. Such reconstructions not only provide distinguishing criteria, but can also be used to assess the viability of models. We study some popular models: $\Lambda$CDM, the generalized Chaplygin gas (GCG) model, the Chevallier-Polarski-Linder (CPL) parametrization and the Jassal-Bagla-Padmanabhan (JBP) parametrization. We plot the trajectories of $Om(z)$ and $\mathcal{L}^{(1)}_m$ with $1\sigma$ confidence level for these models, and compare them to the reconstruction from the $H(z)$ data set. The result indicates that the $H(z)$ data do not favor the CPL and JBP models at the $1\sigma$ confidence level. Interestingly, in the high-redshift range the reconstructed $\mathcal{L}^{(1)}_m$ tends to deviate from the theoretical value, which suggests these models are in tension with the high-redshift $H(z)$ data. This result supports the conclusions of Sahni et al. (2014) and Ding et al. (2015) that $\Lambda$CDM may not be the best description of our universe.
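The $Om(z)$ diagnostic itself is simple to evaluate once $H(z)$ has been reconstructed: $Om(z)=[E^2(z)-1]/[(1+z)^3-1]$ with $E=H/H_0$, which is constant and equal to $\Omega_m$ exactly when the expansion follows flat $\Lambda$CDM. A minimal sketch (the Gaussian-process reconstruction step is omitted; names are illustrative):

```python
import numpy as np

def Om_diagnostic(z, H, H0):
    """Om(z) = (E^2 - 1) / ((1+z)^3 - 1), with E = H/H0.

    For flat LambdaCDM this is constant and equals Omega_m;
    any z-dependence signals a departure from a cosmological constant.
    Valid for z > 0 (the expression is 0/0 at z = 0).
    """
    E2 = (H / H0)**2
    return (E2 - 1.0) / ((1.0 + z)**3 - 1.0)
```

In practice one evaluates this on the GP-reconstructed $H(z)$ and propagates the reconstruction uncertainty into the $Om(z)$ band; the first-order-derivative diagnostic is obtained by differentiating the same reconstruction.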
V. C. Busti, R. C. Santos (2011)
In this Comment we discuss a recent analysis by Yu et al. [RAA 11, 125 (2011)] of constraints on the smoothness parameter $\alpha$ and dark energy models using observational $H(z)$ data. It is argued here that their procedure is conceptually inconsistent with the basic assumptions underlying the adopted Dyer-Roeder approach. In order to properly quantify the influence of the $H(z)$ data on the smoothness parameter $\alpha$, a $\chi^2$-test involving a sample of SNe Ia and $H(z)$ data in the context of a flat $\Lambda$CDM model is reanalyzed. This result is confronted with an earlier approach discussed by Santos et al. (2008) without $H(z)$ data. In the ($\Omega_m, \alpha$) plane, it is found that such parameters are now restricted to the intervals $0.66 \leq \alpha \leq 1.0$ and $0.27 \leq \Omega_m \leq 0.37$ at the 95.4% confidence level ($2\sigma$), and are therefore fully compatible with the homogeneous case. The basic conclusion is that a joint analysis involving $H(z)$ data can indirectly improve our knowledge about the influence of the inhomogeneities. However, this happens only because the $H(z)$ data provide tighter constraints on the matter density parameter $\Omega_m$.
