
Do we need soft cosmology?

Posted by Emmanuil Saridakis
Publication date: 2021
Research field: Physics
Paper language: English





We examine the possibility of soft cosmology, namely small deviations from the usual framework due to the effective appearance of soft-matter properties in the Universe sectors. One effect of such a case would be that dark energy exhibits a different equation-of-state parameter at large scales (which determine the universe expansion) and at intermediate scales (which determine the sub-horizon clustering and the large-scale structure formation). Concerning soft dark matter, we show that it can effectively arise due to dark-energy clustering, even if dark energy itself is not soft. We propose a novel parametrization introducing the softness parameters of the dark sectors. Although the background evolution remains unaffected, due to the extreme sensitivity and significant effects on the global properties, even a slightly non-trivial softness parameter can improve the clustering behavior and alleviate e.g. the $f\sigma_8$ tension. Lastly, an extension of cosmological perturbation theory and a detailed statistical-mechanical analysis, in order to incorporate complexity and estimate the scale-dependent behavior from first principles, are necessary and would provide robust argumentation in favour of soft cosmology.
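To make the clustering sensitivity concrete, here is a minimal toy sketch in Python, not the parametrization proposed in the paper: it keeps a fixed $w$CDM background (the values Om0 = 0.3, w_bg = -1, sigma8 = 0.8 and the factor (1 + eps_soft) are illustrative assumptions) and multiplies only the source term of the standard sub-horizon linear growth equation by (1 + eps_soft), mimicking a dark sector whose clustering-level behaviour departs slightly from its background-level one, and then compares the resulting $f\sigma_8$ today.

# Toy illustration (not the paper's parametrization): fix the background
# expansion and perturb only the clustering source term of the linear
# growth equation by a small "softness" factor eps_soft.
import numpy as np
from scipy.integrate import solve_ivp

Om0, w_bg, sigma8 = 0.3, -1.0, 0.8   # assumed fiducial values

def E(a):
    """Dimensionless Hubble rate for a constant-w background."""
    return np.sqrt(Om0 * a**-3 + (1.0 - Om0) * a**(-3.0 * (1.0 + w_bg)))

def dlnE_da(a):
    dE2 = -3.0 * Om0 * a**-4 - 3.0 * (1.0 + w_bg) * (1.0 - Om0) * a**(-3.0 * (1.0 + w_bg) - 1.0)
    return dE2 / (2.0 * E(a)**2)

def growth(eps_soft, a_grid):
    """Solve D'' + (3/a + dlnE/da) D' = (3/2) (1 + eps_soft) Om0 D / (a^5 E^2)."""
    def rhs(a, y):
        D, dD = y
        return [dD,
                -(3.0 / a + dlnE_da(a)) * dD
                + 1.5 * (1.0 + eps_soft) * Om0 * D / (a**5 * E(a)**2)]
    a0 = 1e-3                          # deep in matter domination: D ~ a
    sol = solve_ivp(rhs, (a0, 1.0), [a0, 1.0], t_eval=a_grid, rtol=1e-8)
    D, dD = sol.y
    return D, dD

a_grid = np.linspace(1e-3, 1.0, 500)
for eps in (0.0, 0.02):               # 2% toy softness in the clustering sector
    D, dD = growth(eps, a_grid)
    fs8 = sigma8 * a_grid * dD / D[-1]     # f*sigma8 = sigma8 * a D'(a) / D(a=1)
    print(f"eps_soft={eps:+.2f}:  fsigma8(z=0) = {fs8[-1]:.4f}")

Even this percent-level toy softness shifts $f\sigma_8(z=0)$ at roughly the percent level while leaving the expansion history $E(a)$ untouched, which illustrates the kind of sensitivity the abstract appeals to.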


Read also

Tomohiro Nakama, Yi Wang (2018)
Recently, the formation of primordial black holes (PBHs) from the collapse of primordial fluctuations has received much attention. The abundance of PBHs formed during radiation domination is sensitive to the tail of the probability distribution of primordial fluctuations. We quantify the level of fine-tuning due to this sensitivity. For example, if the main source of dark matter is PBHs with mass $10^{-12}M_\odot$, then anthropic reasoning suggests that the dark matter to baryon ratio should range between 1 and 300. For this to happen, the root-mean-square amplitude of the curvature perturbation has to be fine-tuned within a $7.1\%$ range. As another example, if the recently detected gravitational-wave events are to be explained by PBHs, the corresponding degree of fine-tuning is $3.8\%$. We also find, however, that these fine-tunings can be relaxed if the primordial fluctuations are highly non-Gaussian, or if the PBHs are formed during an early-matter-dominated phase. We also note that no fine-tuning is needed for the scenario of a reheating of the universe by evaporated PBHs with Planck-mass relics left to serve as dark matter.
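The quoted sensitivity can be illustrated with the standard Gaussian tail estimate of the PBH formation fraction. The sketch below is a hedged toy in Python: the collapse threshold delta_c = 0.45 and the fiducial rms amplitude sigma0 are common illustrative choices, not values taken from the paper.

# Toy tail estimate of the PBH formation fraction for a Gaussian field:
# beta(sigma) = integral from delta_c to infinity of N(0, sigma) = erfc(delta_c / (sqrt(2) sigma)) / 2.
import numpy as np
from scipy.special import erfc

delta_c = 0.45          # assumed collapse threshold during radiation domination
sigma0 = 0.0505         # toy rms amplitude, chosen so beta is tiny but non-zero

def beta(sigma):
    return 0.5 * erfc(delta_c / (np.sqrt(2.0) * sigma))

for shift in (0.0, 0.02, 0.05, -0.05):     # fractional changes in the rms amplitude
    s = sigma0 * (1.0 + shift)
    print(f"sigma shifted by {shift:+5.0%}:  beta = {beta(s):.3e}  "
          f"(x {beta(s)/beta(sigma0):.2f} relative to fiducial)")
# Since the present PBH dark-matter fraction scales roughly linearly with beta
# at fixed mass, percent-level changes in sigma translate into order-of-magnitude
# changes in the PBH abundance -- the fine-tuning quantified in the abstract.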
Accurate inference of cosmology from weak lensing shear requires an accurate shear power spectrum covariance matrix. Here, we investigate this accuracy requirement and quantify the relative importance of the Gaussian (G), super-sample covariance (SSC) and connected non-Gaussian (cNG) contributions to the covariance. Specifically, we forecast cosmological parameter constraints for future wide-field surveys and study how different covariance matrix components affect parameter bounds. Our main result is that the cNG term represents only a small and potentially negligible contribution to statistical parameter errors: the errors obtained using the G+SSC subset are within $\lesssim 5\%$ of those obtained with the full G+SSC+cNG matrix for a Euclid-like survey. This result also holds for the shear two-point correlation function, variations in survey specifications and for different analytical prescriptions of the cNG term. The cNG term is the one that is often tackled using numerically expensive ensembles of survey realizations. Our results suggest, however, that the accuracy of analytical or approximate numerical methods to compute the cNG term is likely to be sufficient for cosmic shear inference from the next generation of surveys.
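The Gaussian part of the covariance discussed above has a simple closed form, which the following Python sketch evaluates; the power-law C_l, the survey numbers f_sky, n_arcmin2, sigma_eps and the band width delta_l are placeholder assumptions rather than Euclid specifications, and the SSC and cNG pieces are deliberately omitted.

# Diagonal Gaussian covariance of a binned shear power spectrum:
#   Cov_G(l) = 2 (C_l + N_l)^2 / ((2l + 1) f_sky delta_l),  with shape noise N_l = sigma_eps^2 / nbar.
import numpy as np

f_sky = 0.36                       # assumed sky fraction
n_arcmin2 = 30.0                   # assumed source density [gal / arcmin^2]
sigma_eps = 0.26                   # assumed per-component shape noise
delta_l = 50                       # assumed multipole band width

nbar = n_arcmin2 / (np.pi / 180.0 / 60.0) ** 2     # convert to gal / steradian
N_l = sigma_eps**2 / nbar

ells = np.arange(100, 3000, delta_l)
C_l = 1.0e-9 * (ells / 100.0) ** -1.2              # toy convergence spectrum

cov_gauss = 2.0 * (C_l + N_l) ** 2 / ((2.0 * ells + 1.0) * f_sky * delta_l)
snr2 = np.sum(C_l**2 / cov_gauss)                  # Gaussian-only signal-to-noise^2
print(f"shape noise N_l = {N_l:.3e},  cumulative S/N (Gaussian only) = {np.sqrt(snr2):.1f}")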
Intent recognition is an essential algorithm of any conversational AI application. It is responsible for classifying an input message into meaningful classes. In many bot development platforms, we can configure the NLU pipeline. Several intent recognition services are currently available as APIs, and there are also many open-source alternatives to choose from. However, there is no comparison of intent recognition services and open-source algorithms. Many factors make the selection of the right approach to intent recognition challenging in practice. In this paper, we suggest criteria for choosing the best intent recognition algorithm for an application. We present a dataset for evaluation. Finally, we compare selected public NLU services with selected open-source algorithms for intent recognition.
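As a point of reference for such comparisons, a minimal open-source baseline can be set up in a few lines. The sketch below uses Python with scikit-learn; the toy utterances and intent labels are made up for illustration and stand in for a real evaluation dataset.

# Minimal open-source intent-recognition baseline: TF-IDF features + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Toy training utterances and intent labels (placeholders for a real dataset).
texts = [
    "what is the weather like today", "will it rain tomorrow",
    "set an alarm for 7 am", "wake me up at six",
    "play some jazz music", "put on my workout playlist",
]
intents = ["weather", "weather", "alarm", "alarm", "music", "music"]

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(texts, intents)

for query in ("is it going to be sunny", "start my running songs"):
    print(query, "->", model.predict([query])[0])

Public NLU services would then be queried through their own APIs and scored on the same held-out utterances as this baseline.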
We discuss the possibility to implement a viscous cosmological model, attributing to the dark matter component a behaviour described by bulk viscosity. Since bulk viscosity implies negative pressure, this raises the possibility to unify the dark sector. At the same time, the presence of dissipative effects may alleviate the so-called small-scale problems of the $\Lambda$CDM model. While the unified viscous description for the dark sector does not lead to consistent results, the non-linear behaviour indeed improves the situation with respect to the standard cosmological model.
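A minimal background-level sketch of the idea is given below in Python; the constant-viscosity parametrization, the value of the dimensionless viscosity xi0 and the density parameters are illustrative assumptions, not the model analysed in the paper. It integrates the continuity equation for a single bulk-viscous dark fluid with effective pressure $p_{\rm eff} = -3H\xi$ alongside baryons and prints the effective equation of state the fluid develops.

# Toy bulk-viscous dark fluid: effective pressure p_eff = -3 H xi with xi = const,
# so that the fluid behaves like pressureless matter early on and develops
# negative pressure (dark-energy-like behaviour) at late times.
import numpy as np
from scipy.integrate import solve_ivp

Ob, xi0 = 0.05, 0.25          # assumed baryon fraction and dimensionless viscosity
xv0 = 1.0 - Ob                # viscous-fluid density parameter today (flat universe)

def E(a, xv):
    """Dimensionless Hubble rate: viscous fluid + baryons."""
    return np.sqrt(xv + Ob * a**-3)

def rhs(N, y):
    """Continuity equation in N = ln a:  dx_v/dN = -3 (x_v + p_eff), p_eff = -3 E xi0."""
    a, xv = np.exp(N), y[0]
    return [-3.0 * (xv - 3.0 * E(a, xv) * xi0)]

# Integrate backwards from today (N = 0) into the past.
sol = solve_ivp(rhs, (0.0, np.log(1e-3)), [xv0],
                t_eval=np.log([1.0, 0.5, 0.1, 1e-2, 1e-3]), rtol=1e-8)

for N, xv in zip(sol.t, sol.y[0]):
    a = np.exp(N)
    w_eff = -3.0 * E(a, xv) * xi0 / xv      # effective equation of state of the fluid
    print(f"a = {a:7.3f}:  w_eff = {w_eff:+.3f}")
# w_eff is close to 0 at early times (dark-matter-like) and negative today,
# which is the unification of the dark sector that the bulk viscosity provides.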
We consider reciprocal metasurfaces with engineered reflection and transmission coefficients and study the role of normal (with respect to the metasurface plane) electric and magnetic polarizations in shaping the reflection and transmission responses. We demonstrate, in general and on a representative example, that the presence of normal components of the polarization vectors does not add extra degrees of freedom in engineering the reflection and transmission characteristics of metasurfaces. Furthermore, we discuss advantages and disadvantages of equivalent volumetric and fully planar realizations of the same properties of functional metasurfaces.
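The reflection and transmission engineering discussed above can be illustrated with the usual transmission-line sheet model. The Python sketch below is a hedged toy, not the configuration analysed in the paper: the shunt/series circuit topology, the element values and the normal-incidence restriction are assumptions. It shows that tangential electric and magnetic sheet responses already provide two independent knobs for the reflection and transmission coefficients, consistent with the conclusion that normal polarization components add no extra degrees of freedom.

# Transmission-line sheet model of a metasurface at normal incidence:
# tangential electric response -> shunt admittance Ye, tangential magnetic
# response -> series impedance Zm.  Cascading the ABCD matrices and converting
# to S-parameters gives the reflection (S11) and transmission (S21).
import numpy as np

ETA0 = 376.73                      # free-space wave impedance [ohm]

def abcd_shunt(Y):
    return np.array([[1.0, 0.0], [Y, 1.0]], dtype=complex)

def abcd_series(Z):
    return np.array([[1.0, Z], [0.0, 1.0]], dtype=complex)

def s_params(abcd, z0=ETA0):
    """Convert a 2x2 ABCD matrix to (S11, S21) for equal port impedances z0."""
    (A, B), (C, D) = abcd
    denom = A + B / z0 + C * z0 + D
    return (A + B / z0 - C * z0 - D) / denom, 2.0 / denom

# Example: purely electric sheet vs a combined electric+magnetic ("Huygens-like")
# sheet; the element values below are arbitrary illustrations.
omega = 2.0 * np.pi * 10e9                     # 10 GHz
Ye = 1j * omega * 40e-15                       # 40 fF of sheet capacitance
Zm = 1j * omega * 5e-9                         # 5 nH of sheet inductance

for label, abcd in [("electric only    ", abcd_shunt(Ye)),
                    ("electric+magnetic", abcd_shunt(Ye) @ abcd_series(Zm))]:
    S11, S21 = s_params(abcd)
    print(f"{label}: |r| = {abs(S11):.3f}, |t| = {abs(S21):.3f}")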