
Gibbs Paradox in the View of Information Entropy

Posted by: Xiao Xu
Publication date: 2021
Research field: Physics
Paper language: English
Author: Xiao Xu





This paper introduces the basic concepts of information theory. Based on these concepts, we treat the states in the state space and the types of ideal gas molecules as symbols in a symbol set in order to calculate the mixing entropy of the ideal gases involved in the Gibbs paradox. This discussion reveals that the absence of any need to distinguish the gases resolves the contradiction of the Gibbs paradox, implying that the introduction of indistinguishability is not necessary. Further analysis shows that the information entropy of the gas molecular types does not correlate directly with the energy of a gas system, so it should not be used in calculating thermodynamic and statistical-mechanical entropies. Therefore, the mixing entropy of the ideal gas is independent of the molecular types and is much smaller than the value commonly assumed.
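For orientation, the quantities at stake can be written in conventional notation (the symbols below are ours, not the paper's). For two samples of $N$ molecules each, mixed at equal temperature and pressure, the textbook result assigns a mixing entropy
$$\Delta S_{\mathrm{mix}} = -2Nk_B\sum_i x_i\ln x_i = 2Nk_B\ln 2 \qquad (x_1 = x_2 = \tfrac{1}{2})$$
when the species are treated as different, and zero when they are identical. In information-theoretic terms, the Shannon entropy of a molecule's full description splits as $H_{\mathrm{total}} = H_{\mathrm{state}} + H_{\mathrm{type}}$; one way to read the abstract's argument is that only $H_{\mathrm{state}}$ couples to the energy of the system, so the per-molecule contribution $H_{\mathrm{type}} = \ln 2$ should not enter the thermodynamic entropy, which is why the resulting mixing entropy comes out much smaller than $2Nk_B\ln 2$.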


Read also

We study the propagation of entanglement after quantum quenches in the non-integrable paramagnetic quantum Ising spin chain. Tuning the parameters of the system, we observe a sudden increase in the entanglement production rate, which we show to be related to the appearance of new quasi-particle excitations in the post-quench spectrum. We argue that the phenomenon is the non-equilibrium version of the well-known Gibbs paradox related to mixing entropy and demonstrate that its characteristics fit the expectations derived from the quantum resolution of the paradox in systems with a non-trivial quasi-particle spectrum.
The Gibbs entropy of a macroscopic classical system is a function of a probability distribution over phase space, i.e., of an ensemble. In contrast, the Boltzmann entropy is a function on phase space, and is thus defined for an individual system. Our aim is to discuss and compare these two notions of entropy, along with the associated ensemblist and individualist views of thermal equilibrium. Using the Gibbsian ensembles for the computation of the Gibbs entropy, the two notions yield the same (leading order) values for the entropy of a macroscopic system in thermal equilibrium. The two approaches do not, however, necessarily agree for non-equilibrium systems. For those, we argue that the Boltzmann entropy is the one that corresponds to thermodynamic entropy, in particular in connection with the second law of thermodynamics. Moreover, we describe the quantum analog of the Boltzmann entropy, and we argue that the individualist (Boltzmannian) concept of equilibrium is supported by the recent works on thermalization of closed quantum systems.
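For reference, the two notions compared here are conventionally written as follows (notation ours, not the authors'): the Gibbs entropy is the ensemble functional
$$S_{\mathrm{G}}[\rho] = -k_B\int \rho(X)\,\ln\rho(X)\,dX,$$
defined for a probability density $\rho$ over phase space, while the Boltzmann entropy is the function on phase space
$$S_{\mathrm{B}}(X) = k_B\ln\big|\Gamma_{M(X)}\big|,$$
where $\big|\Gamma_{M(X)}\big|$ is the phase-space volume of the macrostate $M(X)$ containing the individual microstate $X$. The agreement claimed in the abstract is between these two quantities, to leading order, when $\rho$ is an equilibrium Gibbsian ensemble.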
Bo-Bo Wei, 2017
In this work, we show that the dissipation in a many-body system under an arbitrary non-equilibrium process is related to the Rényi divergences between two states along the forward and reversed dynamics under a very general family of initial conditions. This relation generalizes the links between dissipated work and Rényi divergences to quantum systems with conserved quantities whose equilibrium state is described by the generalized Gibbs ensemble. The relation is applicable to quantum systems with conserved quantities and can be applied to protocols driving the system between integrable and chaotic regimes. We demonstrate our ideas by considering the one-dimensional transverse-field quantum Ising model, which is driven out of equilibrium by the instantaneous switching of the transverse magnetic field.
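As a pointer to the objects named above (standard definitions, not quoted from the paper): the quantum Rényi divergence of order $\alpha$ is usually written
$$D_\alpha(\rho\,\|\,\sigma) = \frac{1}{\alpha-1}\ln\mathrm{Tr}\,\rho^{\alpha}\sigma^{1-\alpha},$$
and a generalized Gibbs ensemble built from conserved charges $Q_k$ takes the form
$$\rho_{\mathrm{GGE}} = \frac{1}{Z}\exp\Big(-\sum_k\lambda_k Q_k\Big), \qquad Z = \mathrm{Tr}\,\exp\Big(-\sum_k\lambda_k Q_k\Big).$$
The relation announced in the abstract connects dissipation along the forward and reversed protocols to divergences of this type between the corresponding states.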
We generalize the convex duality symmetry in the Gibbs statistical ensemble formulation between Massieu's free entropy $\Phi_{V,N}(\beta)$ and the Gibbs entropy $\varphi_{V,N}(u)$ as a function of the mean internal energy $u$. The duality tells us that Gibbs thermodynamic entropy is to the law of large numbers (LLN) for arithmetic sample means what Shannon's information entropy is to the LLN for empirical counting frequencies. Following the same logic, we identify $u$ as the conjugate variable to counting frequency, obtain a Hamilton-Jacobi equation for Shannon entropy as an equation of state, and suggest an eigenvalue problem for modeling statistical frequencies of correlated data.
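The convex duality invoked here is, in one conventional form (signs and conventions may differ from the paper's),
$$\Phi_{V,N}(\beta) = \sup_{u}\big[\varphi_{V,N}(u) - \beta u\big], \qquad \varphi_{V,N}(u) = \inf_{\beta}\big[\Phi_{V,N}(\beta) + \beta u\big],$$
i.e. the Massieu free entropy and the Gibbs entropy as a function of mean internal energy are Legendre-Fenchel transforms of one another, in the same way that a large-deviation rate function for empirical frequencies is dual to the corresponding cumulant generating function.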
In this work, the Gibbs paradox is discussed from the viewpoint of the observer. The limitations of a real observer are analyzed quantitatively. The entropy of mixing is found to be determined both by the identification ability of the observer and by the information the observer already has at hand.