
Property of Tsallis entropy and principle of entropy increase

Published by: Jiulin Du
Publication date: 2008
Research field: Physics
Paper language: English
Author: Jiulin Du





The property of Tsallis entropy is examined for two systems at different temperatures that are brought into contact with each other and allowed to reach thermal equilibrium. It is verified that the total Tsallis entropy of the two systems cannot decrease after the contact. We derive an inequality for the change of Tsallis entropy in this example, which leads to a generalization of the principle of entropy increase in the framework of nonextensive statistical mechanics.
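The nonextensivity that makes such a generalization necessary is visible already in the standard Tsallis definition, $S_q = (1 - \sum_i p_i^q)/(q-1)$ (with $k_B = 1$): for two independent systems the joint entropy is pseudo-additive rather than additive. A minimal numerical check of this pseudo-additivity (illustrative only; it is not the derivation carried out in the paper):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), with k_B = 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if q == 1.0:
        # q -> 1 recovers the Boltzmann-Gibbs-Shannon entropy.
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Two independent subsystems A and B with arbitrary distributions.
pA = np.array([0.5, 0.5])
pB = np.array([0.25, 0.75])
pAB = np.outer(pA, pB).ravel()  # joint distribution of the composite system

q = 1.5
SA, SB, SAB = (tsallis_entropy(x, q) for x in (pA, pB, pAB))

# Pseudo-additivity: S_q(A+B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B)
assert np.isclose(SAB, SA + SB + (1 - q) * SA * SB)
```

The extra cross term $(1-q)\,S_q(A)\,S_q(B)$ vanishes only at $q = 1$, which is why the ordinary (additive) statement of the entropy-increase principle has to be re-derived in the nonextensive setting.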


Read also

We apply the Principle of Maximum Entropy to the study of a general class of deterministic fractal sets. The scaling laws peculiar to these objects are accounted for by means of a constraint concerning the average content of information in those patterns. This constraint allows for a new statistical characterization of fractal objects and of the fractal dimension.
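As a generic illustration of the Maximum Entropy Principle with an average-value constraint (a hypothetical sketch, not the fractal construction used in this paper): maximizing Shannon entropy subject to normalization and a fixed mean yields the exponential family $p_i \propto e^{-\lambda x_i}$, with the multiplier $\lambda$ fixed by the constraint. The values of `x` and `target_mean` below are arbitrary choices for demonstration.

```python
import numpy as np

# Hypothetical states x_i and an assumed average <x> to be enforced.
x = np.array([0.0, 1.0, 2.0, 3.0])
target_mean = 1.2

def mean_at(lam):
    """Mean <x> under the maxent distribution p_i ~ exp(-lam * x_i)."""
    w = np.exp(-lam * x)
    p = w / w.sum()
    return p @ x

# <x> decreases monotonically in lam, so bisect on a wide bracket.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_at(mid) > target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)

w = np.exp(-lam * x)
p = w / w.sum()  # the maximum-entropy distribution satisfying the constraint
```

Different constraints select different distributions from the same variational principle; the fractal case in the abstract replaces the simple mean by a constraint on the average information content of the patterns.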
A principle of hierarchical entropy maximization is proposed for generalized superstatistical systems, which are characterized by the existence of three levels of dynamics. If a generalized superstatistical system comprises a set of superstatistical subsystems, each made up of a set of cells, then the Boltzmann-Gibbs-Shannon entropy should be maximized first for each cell, second for each subsystem, and finally for the whole system. Hierarchical entropy maximization naturally reflects the sufficient time-scale separation between different dynamical levels and allows one to find the distribution of both the intensive parameter and the control parameter for the corresponding superstatistics. The hierarchical maximum entropy principle is applied to fluctuations of the photon Bose-Einstein condensate in a dye microcavity. This principle provides an alternative to the master equation approach recently applied to this problem. The possibility of constructing generalized superstatistics based on a statistics different from the Boltzmann-Gibbs statistics is pointed out.
There are three ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure of information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial Bernoulli processes (Jaynes' maximum entropy principle). Even though these notions are fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate: $H(p)=-\sum_i p_i\log p_i$. For many complex systems, which are typically history-dependent, non-ergodic and non-multinomial, this is no longer the case. Here we show that for such processes the three entropy concepts lead to different functional forms of entropy. We explicitly compute these entropy functionals for three concrete examples. For Polya urn processes, which are simple self-reinforcing processes, the source information rate is $S_{\rm IT}=\frac{1}{1-c}\frac{1}{N}\log N$, the thermodynamical (extensive) entropy is the $(c,d)$-entropy, $S_{\rm EXT}=S_{(c,0)}$, and the entropy in the maxent principle (MEP) is $S_{\rm MEP}(p)=-\sum_i \log p_i$. For sample space reducing (SSR) processes, which are simple path-dependent processes associated with power-law statistics, the information rate is $S_{\rm IT}=1+\frac{1}{2}\log W$, the extensive entropy is $S_{\rm EXT}=H(p)$, and the maxent result is $S_{\rm MEP}(p)=H(p/p_1)+H(1-p/p_1)$. Finally, for multinomial mixture processes, the information rate is given by the conditional entropy $\langle H\rangle_f$ with respect to the mixing kernel $f$, the extensive entropy is given by $H$, and the MEP functional corresponds one-to-one to the logarithm of the mixing kernel.
Ian J. Ford (2015)
The selection of an equilibrium state by maximising the entropy of a system, subject to certain constraints, is often powerfully motivated as an exercise in logical inference, a procedure where conclusions are reached on the basis of incomplete information. But such a framework can be more compelling if it is underpinned by dynamical arguments, and we show how this can be provided by stochastic thermodynamics, where an explicit link is made between the production of entropy and the stochastic dynamics of a system coupled to an environment. The separation of entropy production into three components allows us to select a stationary state by maximising the change, averaged over all realisations of the motion, in the principal relaxational or nonadiabatic component, equivalent to requiring that this contribution to the entropy production should become time independent for all realisations. We show that this recovers the usual equilibrium probability density function (pdf) for a conservative system in an isothermal environment, as well as the stationary nonequilibrium pdf for a particle confined to a potential under nonisothermal conditions, and for a particle subject to a constant nonconservative force under isothermal conditions. The two remaining components of entropy production account for a recently discussed thermodynamic anomaly between over- and underdamped treatments of the dynamics in the nonisothermal stationary state.
We study the nonextensive thermodynamics of open systems. On the basis of the maximum entropy principle, the dual power-law q-distribution functions are re-derived by using the dual particle-number definitions and assuming that the chemical potential is constant in the two sets of parallel formalisms, where the fundamental thermodynamic equations with dual interpretations of thermodynamic quantities are derived for the open systems. By introducing parallel structures of Legendre transformations, other thermodynamic equations with dual interpretations of quantities are also deduced for the open systems, and several dual thermodynamic relations are then inferred. One readily finds correlations between the dual relations, from which an equivalence rule emerges: the Tsallis factor is invariant in partial derivatives taken at constant volume or constant entropy. Using this rule, further correlations can be found, and the statistical expressions for the Lagrange internal energy and the pressure are easily obtained.