We propose a generalized entropy maximization procedure that takes into account the generalized averaging procedures and information-gain definitions underlying the generalized entropies. This procedure is then applied to the Rényi and Tsallis entropies. For the Rényi entropies, the generalized maximization procedure yields the exponential stationary distribution asymptotically for q in the interval [0,1], in contrast to the inverse-power-law stationary distribution obtained through the ordinary entropy maximization procedure. A further result of the generalized procedure is that all possible stationary distributions associated with the Tsallis entropies are obtained naturally by employing either the ordinary or the q-generalized Fourier transform in the averaging procedure.
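For context, a minimal restatement in LaTeX of the quantities involved; the notation ($p_i$ for the probabilities, $q$ for the entropic index, $\beta$ for the Lagrange multiplier) is assumed here rather than taken from the abstract:

    \[
    S^{R}_{q} = \frac{1}{1-q}\,\ln\Big(\sum_i p_i^{\,q}\Big),
    \qquad
    S^{T}_{q} = \frac{1-\sum_i p_i^{\,q}}{q-1},
    \]

Both reduce to the Shannon entropy $S=-\sum_i p_i \ln p_i$ as $q \to 1$. The ordinary maximization of these entropies under a linear constraint yields stationary distributions of the $q$-exponential (inverse power law) form $p(\varepsilon)\propto\bigl[1-(1-q)\beta\varepsilon\bigr]^{1/(1-q)}$, whereas the generalized procedure described above is reported to recover the ordinary exponential $p(\varepsilon)\propto e^{-\beta\varepsilon}$ asymptotically for the Rényi case with $q$ in $[0,1]$.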
A single-bit memory system is made with a Brownian particle held by an optical tweezer in a double-well potential, and the work necessary to erase the memory is measured. We show that the minimum of this work is close to Landauer's bound only for v…
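As a quick numerical check of the bound referenced here, a short Python sketch evaluating Landauer's minimum erasure work $k_B T \ln 2$; the room-temperature value is an assumption chosen for illustration:

    import math

    # Boltzmann constant in J/K (CODATA value)
    k_B = 1.380649e-23

    def landauer_bound(T: float) -> float:
        """Minimum average work (in joules) needed to erase one bit at temperature T."""
        return k_B * T * math.log(2)

    if __name__ == "__main__":
        T = 300.0  # assumed room temperature in kelvin, for illustration only
        W_min = landauer_bound(T)
        print(f"Landauer bound at T = {T} K: {W_min:.3e} J "
              f"({W_min / 1.602176634e-19:.3e} eV)")

At 300 K this gives roughly 2.9e-21 J per bit, the scale against which the measured erasure work is compared.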
We study a generalization of the voter model on complex networks, focusing on the scaling of mean exit time. Previous work has defined the voter model in terms of an initially chosen node and a randomly chosen neighbor, which makes it difficult to di…
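As a rough illustration of the node-then-neighbor update rule and the exit-time (consensus-time) measurement discussed here, a Python sketch of the standard voter model; the networkx graph, its size, and the number of runs are assumptions, and this is the conventional rule rather than the generalization studied in the paper:

    import random
    import networkx as nx

    def voter_exit_time(G, p_init=0.5, rng=None):
        """Standard voter model: pick a node uniformly, then copy the state of
        one of its randomly chosen neighbors; return updates until consensus."""
        rng = rng or random.Random(0)
        state = {v: 1 if rng.random() < p_init else 0 for v in G}
        nodes = list(G)
        ones = sum(state.values())
        t = 0
        while 0 < ones < len(nodes):
            v = rng.choice(nodes)               # initially chosen node
            u = rng.choice(list(G[v]))          # randomly chosen neighbor
            if state[v] != state[u]:
                ones += state[u] - state[v]
                state[v] = state[u]             # node copies the neighbor
            t += 1
        return t

    # Mean exit time over a handful of runs on an assumed scale-free network.
    G = nx.barabasi_albert_graph(200, 3, seed=1)
    runs = [voter_exit_time(G, rng=random.Random(s)) for s in range(10)]
    print("mean exit time:", sum(runs) / len(runs))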
Comb geometry, consisting of a backbone and fingers, is one of the simplest paradigms of a two-dimensional structure in which anomalous diffusion can be realized in the framework of Markov processes. However, the intrinsic properties of the structur…
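A minimal random-walk sketch on a comb lattice (backbone along x, fingers along y), illustrating the subdiffusive backbone spreading, with mean-squared displacement growing roughly as the square root of time, that comb geometry is known for; the step counts and number of walkers are assumptions chosen for illustration:

    import random

    def comb_walk_x(steps, rng):
        """Random walk on a 2D comb lattice: steps along x are possible only on
        the backbone (y == 0); steps along y (the fingers) are always possible.
        Returns the final backbone coordinate x."""
        x, y = 0, 0
        for _ in range(steps):
            if y == 0 and rng.random() < 0.5:
                x += rng.choice((-1, 1))    # move along the backbone
            else:
                y += rng.choice((-1, 1))    # move along a finger
        return x

    rng = random.Random(0)
    walkers = 200
    for steps in (4_000, 16_000, 64_000):
        msd = sum(comb_walk_x(steps, rng) ** 2 for _ in range(walkers)) / walkers
        print(steps, round(msd, 1))   # expected to grow roughly like steps**0.5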
Systems with long-range interactions display a short-time relaxation towards Quasi-Stationary States (QSS) whose lifetime increases with the system size. In the paradigmatic Hamiltonian Mean-Field (HMF) model, out-of-equilibrium phase transitions are…
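For reference, the standard HMF Hamiltonian and the magnetization order parameter commonly used to monitor the relaxation through the QSS; these are the textbook definitions rather than expressions quoted from the abstract:

    \[
    H=\sum_{i=1}^{N}\frac{p_i^{2}}{2}
      +\frac{1}{2N}\sum_{i,j=1}^{N}\bigl[1-\cos(\theta_i-\theta_j)\bigr],
    \qquad
    M=\frac{1}{N}\,\Bigl|\sum_{i=1}^{N}e^{\,i\theta_i}\Bigr|.
    \]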
Entropy plays a key role in the statistical physics of complex systems, which in general exhibit diverse aspects of emergence on different scales. However, it remains not fully resolved how entropy varies with the coarse-graining level and the desc…
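As a toy illustration of how entropy depends on the coarse-graining level, a short Python sketch that bins samples from a fixed distribution at several resolutions and evaluates the Shannon entropy of the resulting histogram; the Gaussian source, sample size, and bin counts are assumptions chosen for illustration:

    import math
    import random

    def shannon_entropy(counts):
        """Shannon entropy (in nats) of an empirical histogram."""
        n = sum(counts)
        return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

    rng = random.Random(0)
    samples = [rng.gauss(0.0, 1.0) for _ in range(100_000)]

    for bins in (8, 32, 128, 512):          # coarse -> fine description levels
        lo, hi = min(samples), max(samples)
        width = (hi - lo) / bins
        hist = [0] * bins
        for x in samples:
            hist[min(int((x - lo) / width), bins - 1)] += 1
        # Entropy of the coarse-grained (binned) description grows with resolution.
        print(bins, round(shannon_entropy(hist), 3))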