
Do we need to know the temperature in prestellar cores?

Posted by Dmitri Wiebe
Publication date: 2007
Research field: Physics
Paper language: English





Molecular line observations of starless (prestellar) cores, combined with chemical evolution modeling and radiative transfer calculations, are a powerful tool for studying the earliest stages of star formation. However, conclusions drawn from such modeling may depend noticeably on the assumed thermal structure of the cores. The assumption of isothermality, which may work well in chemo-dynamical studies, becomes a critical factor in molecular line formation simulations. We argue that even small temperature variations, which are likely to exist in starless cores, can have a non-negligible effect on the interpretation of molecular line data and the derived core properties. In particular, ``chemically pristine'' isothermal cores (low depletion) can have centrally peaked C$^{18}$O and C$^{34}$S radial intensity profiles, while models with a colder center and/or warmer envelope produce ring-like intensity distributions for the same underlying chemical structure. Therefore, molecular abundances derived from oversimplified thermal models may lead to a misinterpretation of the line data.
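To make the thermal sensitivity concrete, here is a deliberately simplified toy model, not the paper's chemo-dynamical and radiative-transfer machinery: optically thin LTE emission of C$^{18}$O(2-1) from a spherical core with constant abundance, integrated along lines of sight. The density profile, the 5-15 K temperature range, and the core size are illustrative assumptions chosen only to show how a colder centre alone can turn a centrally peaked profile into a shallow ring.

```python
import numpy as np

# Toy sketch (assumptions throughout, not the paper's calculation):
# optically thin, LTE C18O(2-1) emission from a spherical core with a
# constant abundance, to show how the assumed temperature profile alone
# can reshape the radial integrated-intensity profile.

H_B_OVER_K = 2.63    # hB/k for C18O in Kelvin (B ~ 54.9 GHz)
E_U_OVER_K = 15.8    # energy of the J=2 level over k, in Kelvin

def frac_upper(T):
    """LTE fractional population of J=2 (rigid-rotor partition function)."""
    Q = T / H_B_OVER_K + 1.0 / 3.0
    return 5.0 * np.exp(-E_U_OVER_K / T) / Q

def density(r):
    """Plummer-like profile with an assumed 0.15 pc flat inner region (arbitrary units)."""
    return 1.0 / (1.0 + (r / 0.15) ** 2)

def intensity_profile(T_of_r, impact_params, rmax=0.2, n_s=801):
    """Integrate n(r) * f_u(T(r)) along the line of sight for each impact parameter."""
    s = np.linspace(-rmax, rmax, n_s)
    ds = s[1] - s[0]
    out = []
    for b in impact_params:
        r = np.sqrt(b**2 + s**2)
        out.append(np.sum(density(r) * frac_upper(T_of_r(r))) * ds)
    return np.array(out)

b = np.linspace(0.0, 0.15, 16)                                   # impact parameter, pc
iso = intensity_profile(lambda r: np.full_like(r, 10.0), b)      # isothermal, 10 K
cold = intensity_profile(lambda r: 5.0 + 50.0 * np.clip(r, 0.0, 0.2), b)  # ~5 K centre, ~15 K edge

# The isothermal model peaks at b = 0; with the (exaggerated) cold centre the
# Boltzmann factor suppresses the J=2 population there, and the same density
# and abundance structure yields a flattened, shallowly ring-like profile.
print(np.round(iso / iso.max(), 3))
print(np.round(cold / cold.max(), 3))
```

The sketch isolates only the Boltzmann-factor part of the argument; the paper itself quantifies the effect with full chemical modeling and radiative transfer.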


Read also

Jiri J. Mares (2016)
Temperature, the central concept of thermal physics, is one of the most frequently employed physical quantities in common practice. Even though operative methods of temperature measurement are described in detail in various practical instructions and textbooks, a rigorous treatment of the concept is almost lacking in the current literature. As a result, the answer to the simple question ``what is temperature?'' is by no means trivial or unambiguous. There is, in particular, an appreciable gap between temperature as introduced in the framework of statistical theory and the only experimentally observable quantity related to this concept, the phenomenological temperature. The logical and epistemological analysis of the present concept of phenomenological temperature is the kernel of this contribution.
We examine the possibility of soft cosmology, namely small deviations from the usual framework due to the effective appearance of soft-matter properties in the sectors of the Universe. One effect of such a scenario would be that dark energy exhibits a different equation-of-state parameter at large scales (which determine the universe's expansion) and at intermediate scales (which determine sub-horizon clustering and large-scale structure formation). Concerning soft dark matter, we show that it can effectively arise due to dark-energy clustering, even if dark energy itself is not soft. We propose a novel parametrization introducing the softness parameters of the dark sectors. Although the background evolution remains unaffected, even a slightly non-trivial softness parameter can improve the clustering behavior and alleviate, e.g., the $f\sigma_8$ tension, because the global properties are extremely sensitive to it. Lastly, an extension of cosmological perturbation theory and a detailed statistical-mechanical analysis, in order to incorporate complexity and estimate the scale-dependent behavior from first principles, are necessary and would provide a robust argument in favour of soft cosmology.
In the present paper, we investigate the cosmographic problem using the bias-variance trade-off. We find that both the $z$-redshift and the $y=z/(1+z)$-redshift can yield estimates with small bias, which means that cosmography can describe the supernova data quite accurately. Minimizing the risk suggests that cosmography up to second order is the best approximation. Forecasting the constraints from future measurements, we find that future supernova and redshift-drift data can significantly improve the constraints and thus have the potential to solve the cosmographic problem. We also exploit the cosmographic values of the deceleration parameter and the dark-energy equation of state $w(z)$. We find that supernova cosmography cannot give stable estimates of them; nevertheless, much useful information is obtained, such as that cosmography favors a complicated dark energy with varying $w(z)$, with the derivative $dw/dz<0$ at low redshift. Cosmography is thus helpful for modeling dark energy.
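For orientation, the "second-order cosmography" referred to here is the standard Taylor expansion of the luminosity distance in terms of the deceleration parameter $q_0$; the $y$-redshift form follows by substituting $z = y/(1-y)$. These are the generic textbook expressions, not coefficients fitted in the paper:

```latex
d_L(z) = \frac{c}{H_0}\left[\, z + \frac{1 - q_0}{2}\, z^2 + \mathcal{O}(z^3) \right],
\qquad
d_L(y) = \frac{c}{H_0}\left[\, y + \frac{3 - q_0}{2}\, y^2 + \mathcal{O}(y^3) \right],
\quad y = \frac{z}{1+z}.
```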
Qiang Sun (2021)
This paper studies robust mean estimators for distributions with only finite variances. We propose a new loss function that depends on the mean parameter and a robustification parameter. By simultaneously optimizing the empirical loss with respect to both parameters, we show that the resulting estimator of the robustification parameter can automatically adapt to the data and the unknown variance, so that the resulting mean estimator achieves near-optimal finite-sample performance. Compared with prior work, our method is computationally efficient and user-friendly: it does not need cross-validation to tune the robustification parameter.
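As a rough illustration of the kind of estimator being discussed, below is a Huber-type mean estimate in which the robustification parameter is set from the data. The tuning rule and the test distribution are generic assumptions for the sketch, not the paper's joint-optimization procedure.

```python
import numpy as np

# Illustrative sketch only: a Huber-type location estimate with a data-driven
# robustification parameter tau (heuristic tuning, not the paper's estimator).

def huber_mean(x, n_iter=50):
    x = np.asarray(x, dtype=float)
    n = x.size
    mu = np.median(x)                                   # robust starting point
    for _ in range(n_iter):
        resid = x - mu
        scale = np.median(np.abs(resid)) / 0.6745 + 1e-12        # robust scale via MAD
        tau = scale * np.sqrt(n / np.log(n))                      # heuristic tuning rule (assumption)
        w = np.minimum(1.0, tau / np.maximum(np.abs(resid), 1e-12))  # Huber IRLS weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < 1e-10:
            break
        mu = mu_new
    return mu

rng = np.random.default_rng(0)
data = rng.standard_t(df=2.5, size=500) + 3.0   # heavy-tailed sample with finite variance, true mean 3
print(np.mean(data), huber_mean(data))          # compare the sample mean with the robust estimate
```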
Intent recognition is an essential component of any conversational AI application. It is responsible for classifying an input message into meaningful classes. In many bot development platforms we can configure the NLU pipeline: several intent recognition services are currently available as APIs, or we can choose from many open-source alternatives. However, there is no systematic comparison of intent recognition services and open-source algorithms, and many factors make selecting the right approach to intent recognition challenging in practice. In this paper, we suggest criteria for choosing the best intent recognition algorithm for an application, present a dataset for evaluation, and compare selected public NLU services with selected open-source algorithms for intent recognition.
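A typical open-source baseline that such comparisons include is a bag-of-words text classifier. The snippet below is an illustrative sketch only; the intents and utterances are invented and are not the paper's dataset.

```python
# Minimal intent-classification baseline: TF-IDF features + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training utterances and intent labels (illustration only).
train_texts = [
    "what is the weather tomorrow", "will it rain today",
    "book a table for two", "reserve a table tonight",
    "play some jazz music", "put on my workout playlist",
]
train_intents = ["weather", "weather", "booking", "booking", "music", "music"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(train_texts, train_intents)

# Classify unseen messages into the configured intents.
print(clf.predict(["is it going to be sunny", "reserve dinner for four"]))
```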