
Large deviation approach to nonequilibrium systems

Posted by: Rosemary Harris
Publication date: 2011
Research field: Physics
Paper language: English





The theory of large deviations has been applied successfully in the last 30 years or so to study the properties of equilibrium systems and to put the foundations of equilibrium statistical mechanics on a clearer and more rigorous footing. A similar approach has been followed more recently for nonequilibrium systems, especially in the context of interacting particle systems. We review here the basis of this approach, emphasizing the similarities and differences that exist between the application of large deviation theory for studying equilibrium systems on the one hand and nonequilibrium systems on the other. Of particular importance are the notions of macroscopic, hydrodynamic, and long-time limits, which are analogues of the equilibrium thermodynamic limit, and the notion of statistical ensembles which can be generalized to nonequilibrium systems. For the purpose of illustrating our discussion, we focus on applications to Markov processes, in particular to simple random walks.
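As a minimal concrete illustration of the long-time limit mentioned above (a standard textbook example, not taken from the abstract itself): for a simple random walk $S_N = X_1 + \dots + X_N$ whose i.i.d. steps take the values $\pm 1$ with probabilities $p$ and $1-p$, Cramér's theorem gives $P(S_N/N \approx x) \asymp e^{-N I(x)}$, with rate function $I(x) = \sup_k \{ kx - \ln(p e^{k} + (1-p) e^{-k}) \}$, the Legendre transform of the scaled cumulant generating function. The rate function vanishes only at the typical velocity $x = 2p - 1$, which plays the role of the equilibrium value, while fluctuations away from it are exponentially suppressed in $N$.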



Read also

Stefano Gherardini, 2019
The exact statistics of an arbitrary quantum observable is analytically obtained. Due to the probabilistic nature of a sequence of intermediate measurements and stochastic fluctuations induced by the interaction with the environment, the measurement outcomes at the end of the system's evolution are random variables. Here, we provide the exact large-deviation form of their probability distribution, which is given by an exponentially decaying profile in the number of measurements. The most probable distribution of the measurement outcomes in a single realization of the system transformation is then derived, thus achieving predictions beyond the expectation value. The theoretical results are confirmed by numerical simulations of an experimentally reproducible two-level system with a stochastic Hamiltonian.
The standard Large Deviation Theory (LDT) represents the mathematical counterpart of the Boltzmann-Gibbs factor which describes the thermal equilibrium of short-range Hamiltonian systems, the velocity distribution of which is Maxwellian. It is generically applicable to systems satisfying the Central Limit Theorem (CLT). When we focus instead on stationary states of typical complex systems (e.g., classical long-range Hamiltonian systems), both the CLT and LDT need to be generalized. Specifically, when the $N\to\infty$ attractor in the space of distributions is a $Q$-Gaussian related to a $Q$-generalized CLT ($Q=1$ recovers Gaussian attractors), we expect the LDT probability distribution to approach a $q$-exponential (where $q=f(Q)$ with $f(1)=1$, thus recovering the standard LDT exponential distribution) with an argument proportional to $N$, consistently with thermodynamics. We numerically verify this conjectural scenario for the standard map, the coherent noise model for biological extinctions and earthquakes, the Ehrenfest dog-flea model, and the random-walk avalanches.
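For reference, the $q$-exponential invoked here is, in its standard form (an addition for clarity, not a quotation from the abstract), $e_q^z \equiv [1+(1-q)z]^{1/(1-q)}$, which reduces to the ordinary exponential $e^z$ as $q \to 1$; a $q$-exponential decay with argument proportional to $N$ therefore corresponds, for $q>1$, to a power-law tail $\propto 1/N^{1/(q-1)}$ rather than an exponential one.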
The paper commented on by Touchette contains a computational study which opens the door to a desirable generalization of the standard large deviation theory (applicable to a set of $N$ nearly independent random variables) to systems belonging to a special, though ubiquitous, class of strong correlations. It focuses on three inter-related aspects, namely (i) we exhibit strong numerical indications which suggest that the standard exponential probability law is asymptotically replaced by a power law as its dominant term for large $N$; (ii) the subdominant term appears to be consistent with the $q$-exponential behavior typical of systems following $q$-statistics, thus reinforcing the thermodynamically extensive entropic nature of the exponent of the $q$-exponential, basically $N$ times the $q$-generalized rate function; (iii) the class of strong correlations that we have focused on corresponds to attractors, in the sense of the Central Limit Theorem, which are $Q$-Gaussian distributions (in principle $1 < Q < 3$); these relevantly differ from (symmetric) Levy distributions, with the unique exception of the Cauchy-Lorentz distribution (which corresponds to $Q = 2$), where, as is well known, they coincide. In his Comment, Touchette has agreeably discussed point (i), but, unfortunately, points (ii) and (iii) have, as we detail here, visibly escaped his analysis. Consequently, his conclusion claiming the absence of a special connection with $q$-exponentials is unjustified.
We analyse dynamical large deviations of quantum trajectories in Markovian open quantum systems in their full generality. We derive a {\em quantum level-2.5 large deviation principle} for these systems, which describes the joint fluctuations of time-averaged quantum jump rates and of the time-averaged quantum state for long times. Like its level-2.5 counterpart for classical continuous-time Markov chains (which it contains as a special case), this description is both {\em explicit and complete}, as the statistics of arbitrary time-extensive dynamical observables can be obtained by contraction from the explicit level-2.5 rate functional we derive. Our approach uses an unravelled representation of the quantum dynamics which allows these statistics to be obtained by analysing a classical stochastic process in the space of pure states. For quantum reset processes we show that the unravelled dynamics is semi-Markov, and derive bounds on the asymptotic variance of the number of quantum jumps which generalise classical thermodynamic uncertainty relations. We finish by discussing how our level-2.5 approach can be used to study large deviations of non-linear functions of the state, such as measures of entanglement.
The theory of large deviations constitutes a mathematical cornerstone in the foundations of Boltzmann-Gibbs statistical mechanics, based on the additive entropy $S_{BG}=-k_B\sum_{i=1}^W p_i \ln p_i$. Its optimization under appropriate constraints yields the celebrated BG weight $e^{-\beta E_i}$. An elementary large-deviation connection is provided by $N$ independent binary variables, which, in the $N\to\infty$ limit, yields a Gaussian distribution. The probability of having $n \ne N/2$ out of $N$ throws is governed by the exponential decay $e^{-N r}$, where the rate function $r$ is directly related to the relative BG entropy. To deal with a wide class of complex systems, nonextensive statistical mechanics has been proposed, based on the nonadditive entropy $S_q=k_B\frac{1-\sum_{i=1}^W p_i^q}{q-1}$ ($q \in \mathcal{R}$; $S_1=S_{BG}$). Its optimization yields the generalized weight $e_q^{-\beta_q E_i}$ ($e_q^z \equiv [1+(1-q)z]^{1/(1-q)}$; $e_1^z=e^z$). We numerically study large deviations for a strongly correlated model which depends on the indices $Q \in [1,2)$ and $\gamma \in (0,1)$. This model provides, in the $N\to\infty$ limit ($\forall \gamma$), $Q$-Gaussian distributions, ubiquitously observed in nature ($Q\to 1$ recovers the independent binary model). We show that its corresponding large deviations are governed by $e_q^{-N r_q}$ ($\propto 1/N^{1/(q-1)}$ if $q>1$), where $q = \frac{Q-1}{\gamma(3-Q)}+1 \ge 1$. This $q$-generalized illustration opens wide the door towards a desirable large-deviation foundation of nonextensive statistical mechanics.
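The elementary binary example in the last abstract can be spelled out explicitly (a standard computation, sketched here rather than quoted): for $N$ independent fair binary variables, the probability of observing a fraction $x = n/N$ of one outcome decays as $e^{-N r(x)}$ with $r(x) = \ln 2 + x \ln x + (1-x)\ln(1-x)$, which is exactly the relative BG entropy (Kullback-Leibler divergence) between the empirical frequencies $(x, 1-x)$ and the equiprobable distribution $(1/2, 1/2)$; $r$ vanishes only at $x = 1/2$, around which one recovers the Gaussian fluctuations of the Central Limit Theorem.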