
Shannon Information Entropy in Heavy-ion Collisions

Posted by Chun-Wang Ma
Publication date: 2018
Research field: Informatics Engineering
Paper language: English





The general idea of information entropy introduced by C. E. Shannon underlies a wide range of analyses and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. The Shannon information entropy essentially quantifies the information carried by a quantity through its distribution, and information-entropy-based methods have been developed extensively in many scientific areas, including physics. The dynamical nature of the heavy-ion collision (HIC) process makes it difficult and complex to study nuclear matter and its evolution; Shannon information entropy theory can provide new methods and observables to understand the physical phenomena both theoretically and experimentally. To better understand the processes of HICs, the main characteristics of typical models, including quantum molecular dynamics models, thermodynamical models, and statistical models, are briefly introduced. The typical applications of Shannon information theory in HICs are collected, covering the chaotic behavior in the branching process of hadron collisions, the liquid-gas phase transition in HICs, and the isobaric difference scaling phenomenon for intermediate-mass fragments produced in HICs of neutron-rich systems. Even though the present applications in heavy-ion collision physics are still relatively simple, they shed light on key questions being pursued. It is suggested to further develop information entropy methods in nuclear reaction models, as well as new analysis methods to study the properties of nuclear matter in HICs, especially the evolution of the dynamical system.
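As a concrete illustration of the basic quantity discussed above, the sketch below computes the Shannon information entropy of a measured distribution, here a hypothetical fragment yield distribution; the numbers are purely illustrative and are not taken from the paper.

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon information entropy H = -sum_i p_i ln p_i of a distribution,
    estimated from event counts; empty bins contribute nothing."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()      # normalize counts to probabilities
    p = p[p > 0]                   # drop empty bins (0 * ln 0 -> 0)
    return -np.sum(p * np.log(p))  # natural log, so H is in nats

# Hypothetical fragment yields binned by mass number (illustrative values only).
yields = [120, 80, 45, 30, 18, 10, 5, 2]
print(f"H = {shannon_entropy(yields):.3f} nats")
```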




Read also

Feature selection, in the context of machine learning, is the process of separating the highly predictive features from those that might be irrelevant or redundant. Information theory has been recognized as a useful concept for this task, as the prediction power stems from the correlation, i.e., the mutual information, between features and labels. Many feature-selection algorithms in the literature have adopted the Shannon-entropy-based mutual information. In this paper, we explore the possibility of using Renyi min-entropy instead. In particular, we propose an algorithm based on a notion of conditional Renyi min-entropy that has recently been adopted in the field of security and privacy, and which is strictly related to the Bayes error. We prove that in general the two approaches are incomparable, in the sense that we can construct datasets on which the Renyi-based algorithm performs better than the corresponding Shannon-based one, and datasets on which the situation is reversed. In practice, however, when considering datasets of real data, the Renyi-based algorithm tends to outperform the other one. We have carried out several experiments on the BASEHOCK, SEMEION, and GISETTE datasets, and in all of them we have indeed observed that the Renyi-based algorithm gives better results.
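The two scoring quantities contrasted in this abstract can be sketched as follows; the joint count table, the binning, and the exact selection criterion used by the paper's algorithm are assumptions for illustration, not a reproduction of its method.

```python
import numpy as np

def shannon_mi(joint):
    """Shannon mutual information I(X;Y) in bits, from a joint count table
    whose rows index feature values x and whose columns index labels y."""
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

def renyi_min_leakage(joint):
    """Min-entropy leakage I_inf(X;Y) = H_inf(Y) - H_inf(Y|X) in bits, with the
    conditional Renyi min-entropy H_inf(Y|X) = -log2 sum_x max_y p(x, y);
    the Bayes error given X is 1 - 2**(-H_inf(Y|X))."""
    p = joint / joint.sum()
    h_y = -np.log2(p.sum(axis=0).max())          # H_inf(Y)
    h_y_given_x = -np.log2(p.max(axis=1).sum())  # H_inf(Y|X)
    return float(h_y - h_y_given_x)

# Toy joint counts (rows: values of one candidate feature, columns: class labels);
# purely illustrative, not data from BASEHOCK, SEMEION, or GISETTE.
joint = np.array([[30.,  5.],
                  [ 4., 40.],
                  [10., 11.]])
print("Shannon MI       :", round(shannon_mi(joint), 3), "bits")
print("Renyi min leakage:", round(renyi_min_leakage(joint), 3), "bits")
```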
We propose a model for isotropization and the corresponding thermalization in a nucleon system created in the collision of two nuclei. The model is based on the assumption that, during the fireball evolution, two-particle elastic and inelastic collisions give rise to a randomization of the nucleon momentum transfer, which is driven by a suitable distribution. As a first approximation, we assume a homogeneous distribution in which the momentum transfer is bounded from above. These features have been shown to result in a smearing of the particle momenta about their initial values and, as a consequence, in their partial isotropization and thermalization. The nonequilibrium single-particle distribution function and single-particle spectrum, which carry a memory of the initial state of the nuclei, have been obtained.
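A minimal numerical sketch of the mechanism described above, under simplifying assumptions: nucleons start with purely longitudinal momenta and receive, at each collision step, an independent momentum-transfer kick drawn from a homogeneous (uniform) distribution bounded from above. Independent kicks do not conserve total momentum the way pairwise exchanges would, so this only illustrates the smearing and partial isotropization; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N_NUCLEONS = 5000
Q_MAX = 0.3    # upper bound on each momentum-transfer component (GeV/c), illustrative
N_STEPS = 50

# Purely longitudinal initial momenta along z (arbitrary magnitude, GeV/c).
p = np.zeros((N_NUCLEONS, 3))
p[:, 2] = 1.0

def anisotropy(p):
    """Crude anisotropy measure <p_z^2> / (<p_T^2>/2); tends toward 1 as the
    momentum distribution becomes isotropic."""
    pz2 = np.mean(p[:, 2] ** 2)
    pt2 = np.mean(p[:, 0] ** 2 + p[:, 1] ** 2)
    return pz2 / (pt2 / 2 + 1e-12)

for step in range(1, N_STEPS + 1):
    # Homogeneous, bounded momentum-transfer kick for every nucleon.
    p += rng.uniform(-Q_MAX, Q_MAX, size=p.shape)
    if step % 10 == 0:
        print(f"step {step:3d}: anisotropy = {anisotropy(p):.2f}")
```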
We study the event-by-event generation of flow vorticity in RHIC Au + Au collisions and LHC Pb + Pb collisions by using the HIJING model. Different definitions of the vorticity field and velocity field are considered. A variety of properties of the vorticity are explored, including the impact parameter dependence, the collision energy dependence, the spatial distribution, the event-by-event fluctuation of the magnitude and azimuthal direction, and the time evolution. In addition, the spatial distribution of the flow helicity is also studied.
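For orientation, one commonly used non-relativistic (kinematic) definition of the vorticity and of the flow helicity density mentioned above is given below; conventions in the literature differ by factors such as the 1/2, and the specific definitions compared in the paper are not reproduced here.

```latex
% Kinematic vorticity of the flow velocity field v(x, t) (one common convention)
\boldsymbol{\omega}(\mathbf{x}, t) = \tfrac{1}{2}\,\nabla \times \mathbf{v}(\mathbf{x}, t),
\qquad
% local flow helicity density
h(\mathbf{x}, t) = \mathbf{v}(\mathbf{x}, t) \cdot \boldsymbol{\omega}(\mathbf{x}, t).
```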
Mike Lisa, 2016
The study of high energy collisions between heavy nuclei is a field unto itself, distinct from nuclear and particle physics. A defining aspect of heavy ion physics is the importance of a bulk, self-interacting system with a rich space-time substructure. I focus on the issue of timescales in heavy ion collisions, starting with proof from low-energy collisions that femtoscopy can, indeed, measure very long timescales. I then discuss the relativistic case, where detailed measurements over three orders of magnitude in energy reveal a timescale increase that might be due to a first-order phase transition. I also discuss the consistency of evolution timescales as determined from traditional longitudinal sizes and from a novel analysis using shape information.
We study charm production in ultra-relativistic heavy-ion collisions by using the Parton-Hadron-String Dynamics (PHSD) transport approach. The initial charm quarks are produced by the PYTHIA event generator tuned to fit the transverse momentum spectrum and rapidity distribution of charm quarks from Fixed-Order Next-to-Leading Logarithm (FONLL) calculations. The produced charm quarks scatter in the quark-gluon plasma (QGP) with the off-shell partons whose masses and widths are given by the Dynamical Quasi-Particle Model (DQPM), which reproduces the lattice QCD equation of state in thermal equilibrium. The relevant cross sections are calculated in a consistent way by employing the effective propagators and couplings from the DQPM. Close to the critical energy density of the phase transition, the charm quarks are hadronized into $D$ mesons through coalescence and/or fragmentation. The hadronized $D$ mesons then interact with the various hadrons in the hadronic phase with cross sections calculated in an effective Lagrangian approach with heavy-quark spin symmetry. The nuclear modification factor $R_{AA}$ and the elliptic flow $v_2$ of $D^0$ mesons from PHSD are compared with the experimental data from the STAR Collaboration for Au+Au collisions at $\sqrt{s_{NN}}=200$ GeV and to the ALICE data for Pb+Pb collisions at $\sqrt{s_{NN}}=2.76$ TeV. We find that in PHSD the energy loss of $D$ mesons at high $p_T$ can be dominantly attributed to partonic scattering, while the actual shape of $R_{AA}$ versus $p_T$ reflects the heavy-quark hadronization scenario, i.e. coalescence versus fragmentation. Also, the hadronic rescattering is important for the $R_{AA}$ at low $p_T$ and enhances the $D$-meson elliptic flow $v_2$.
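For reference, the two observables compared with the STAR and ALICE data above are the nuclear modification factor and the elliptic flow coefficient; the expressions below are the standard definitions, not formulas specific to the PHSD approach.

```latex
% Nuclear modification factor of D mesons (N_coll: number of binary nucleon-nucleon collisions)
R_{AA}(p_T) = \frac{\mathrm{d}N_{AA}/\mathrm{d}p_T}
                   {\langle N_{\mathrm{coll}} \rangle\, \mathrm{d}N_{pp}/\mathrm{d}p_T},
\qquad
% elliptic flow coefficient with respect to the event-plane angle \Psi_2
v_2 = \big\langle \cos\!\big[ 2(\varphi - \Psi_2) \big] \big\rangle .
```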