
Gibbs Paradox in the View of Information Entropy

Added by Xiao Xu
Publication date: 2021
Field: Physics
Language: English
Authors: Xiao Xu





This paper introduces the basic concepts of information theory. Based on these concepts, we regard the states in the state space and the types of ideal gases as symbols in a symbol set, and use them to calculate the mixing entropy of the ideal gas involved in the Gibbs Paradox. This discussion reveals that the non-need for distinguishing resolves the contradiction of the Gibbs Paradox, implying that the introduction of indistinguishability is not necessary. Further analysis shows that the information entropy of gas molecular types does not directly correlate with the energy of a gas system, so it should not be used for calculating thermodynamic and statistical-mechanical entropies. Therefore, the mixing entropy of the ideal gas is independent of the molecular types and is much smaller than the value commonly thought.
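The symbol-counting view described in the abstract can be sketched numerically. The following minimal illustration (not the paper's own code; the function names are hypothetical) treats each molecule's type as a symbol and computes the Shannon entropy of the type distribution per molecule, i.e., the type-information term that the paper argues does not correlate with the energy of the gas:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return sum(-p * log2(p) for p in probs if p > 0)

def type_entropy_per_molecule(n_a, n_b):
    """Entropy of the molecular-type 'symbol set' for a mixture of
    n_a molecules of type A and n_b molecules of type B."""
    total = n_a + n_b
    return shannon_entropy([n_a / total, n_b / total])

# Equal amounts of two distinct gases: 1 bit of type information per molecule.
print(type_entropy_per_molecule(1, 1))  # 1.0
# A single gas carries no type information, so this term vanishes.
print(type_entropy_per_molecule(2, 0))  # 0.0
```

On the paper's account, this per-molecule type entropy exists as information but should not be added to the thermodynamic entropy, which is why the mixing entropy comes out independent of the molecular types.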



Related Research

We study the propagation of entanglement after quantum quenches in the non-integrable paramagnetic quantum Ising spin chain. Tuning the parameters of the system, we observe a sudden increase in the entanglement production rate, which we show to be related to the appearance of new quasi-particle excitations in the post-quench spectrum. We argue that the phenomenon is the non-equilibrium version of the well-known Gibbs paradox related to mixing entropy, and demonstrate that its characteristics fit the expectations derived from the quantum resolution of the paradox in systems with a non-trivial quasi-particle spectrum.
The Gibbs entropy of a macroscopic classical system is a function of a probability distribution over phase space, i.e., of an ensemble. In contrast, the Boltzmann entropy is a function on phase space, and is thus defined for an individual system. Our aim is to discuss and compare these two notions of entropy, along with the associated ensemblist and individualist views of thermal equilibrium. Using the Gibbsian ensembles for the computation of the Gibbs entropy, the two notions yield the same (leading order) values for the entropy of a macroscopic system in thermal equilibrium. The two approaches do not, however, necessarily agree for non-equilibrium systems. For those, we argue that the Boltzmann entropy is the one that corresponds to thermodynamic entropy, in particular in connection with the second law of thermodynamics. Moreover, we describe the quantum analog of the Boltzmann entropy, and we argue that the individualist (Boltzmannian) concept of equilibrium is supported by the recent works on thermalization of closed quantum systems.
Bo-Bo Wei (2017)
In this work, we show that the dissipation in a many-body system under an arbitrary non-equilibrium process is related to the Rényi divergences between two states along the forward and reversed dynamics, under a very general family of initial conditions. This relation generalizes the links between dissipated work and Rényi divergences to quantum systems with conserved quantities whose equilibrium state is described by the generalized Gibbs ensemble. The relation is applicable for quantum systems with conserved quantities and can be applied to protocols driving the system between integrable and chaotic regimes. We demonstrate our ideas by considering the one-dimensional transverse quantum Ising model, which is driven out of equilibrium by the instantaneous switching of the transverse magnetic field.
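For reference, the quantum Rényi divergence appearing here is usually taken, in the standard (Petz) form for density matrices $\rho$ and $\sigma$ and order $\alpha \neq 1$ (conventions assumed, not quoted from the paper), to be:

```latex
D_{\alpha}(\rho \,\|\, \sigma)
  = \frac{1}{\alpha - 1}\,
    \log \operatorname{Tr}\!\left(\rho^{\alpha}\,\sigma^{1-\alpha}\right),
```

which recovers the quantum relative entropy $\operatorname{Tr}\,\rho(\log\rho - \log\sigma)$ in the limit $\alpha \to 1$.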
We generalize the convex duality symmetry in the Gibbs statistical ensemble formulation, between Massieu's free entropy $\Phi_{V,N}(\beta)$ and the Gibbs entropy $\varphi_{V,N}(u)$ as a function of the mean internal energy $u$. The duality tells us that Gibbs thermodynamic entropy is to the law of large numbers (LLN) for arithmetic sample means what Shannon's information entropy is to the LLN for empirical counting frequencies. Following the same logic, we identify $u$ as the conjugate variable to counting frequency, a Hamilton-Jacobi equation for Shannon entropy as an equation of state, and suggest an eigenvalue problem for modeling statistical frequencies of correlated data.
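The convex duality mentioned here can be written, in one standard convention (assumed for illustration, with $k_B = 1$; the paper's sign and normalization conventions may differ), as the Legendre-Fenchel pair:

```latex
\Phi_{V,N}(\beta) = \sup_{u}\,\bigl[\varphi_{V,N}(u) - \beta u\bigr],
\qquad
\varphi_{V,N}(u) = \inf_{\beta}\,\bigl[\Phi_{V,N}(\beta) + \beta u\bigr],
```

so that each function is recovered from the other, which is the sense in which the free entropy and the Gibbs entropy are dual.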
In this work, the Gibbs paradox is discussed from the point of view of the observer. The limitations of a real observer are analyzed quantitatively. The entropy of mixing is found to be determined both by the observer's identification ability and by the information the observer already has in hand.