
Mutual information change in feedback processes driven by measurement

Posted by Chulan Kwon
Publication date: 2016
Research field: Physics
Paper language: English
Author: Chulan Kwon





We investigate the thermodynamics of feedback processes driven by measurement. Regarding the system and the memory device as a composite system, the mutual information, as a measure of correlation between the two constituents, contributes to the entropy of the composite system, which makes the generalized total entropy of the joint system and reservoir satisfy the second law of thermodynamics. We examine the thermodynamics of the Szilard engine for an intermediate period before the completion of the cycle and show that the second law holds, resolving the paradox of Maxwell's demon independently of the period taken into account. We also investigate a feedback process that confines a particle tightly within a trap, operated by repeated feedback over a finite time interval. We derive the stability condition for multi-step feedback and find the condition for confinement below the thermal fluctuation present in the absence of feedback. The results depend on the interval between feedback steps and the intensity of the feedback protocol, which are expected to be important parameters in real experiments.
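As a rough illustration of the kind of measurement-driven confinement described above, the following sketch simulates an overdamped Brownian particle in a harmonic trap whose centre is shifted, at regular intervals, opposite to the measured position. The protocol, parameter names, and values are illustrative assumptions, not the paper's; the point is only that the steady-state variance can fall below the thermal value $k_BT/k$, and that the outcome depends on the feedback interval and intensity, the two knobs the abstract highlights.

```python
import numpy as np

# Illustrative sketch only (not the paper's protocol): an overdamped Brownian
# particle in a harmonic trap of stiffness k; every tau_fb the position is
# measured and the trap centre is shifted opposite to the measurement by a
# gain lambda_fb, pushing the particle back toward the origin.
rng = np.random.default_rng(0)
kT, gamma, k = 1.0, 1.0, 1.0            # temperature, friction, trap stiffness
dt, n_steps = 1e-3, 200_000
tau_fb, lambda_fb = 0.05, 0.8           # feedback interval and intensity (assumed)
steps_per_fb = int(tau_fb / dt)

x, centre = 0.0, 0.0
xs = np.empty(n_steps)
for i in range(n_steps):
    if i % steps_per_fb == 0:
        centre = -lambda_fb * x         # feedback step: overshoot past the origin
    x += (-(k / gamma) * (x - centre) * dt
          + np.sqrt(2.0 * kT * dt / gamma) * rng.standard_normal())
    xs[i] = x

print("variance with feedback :", xs[n_steps // 10:].var())
print("thermal variance kT/k  :", kT / k)
```

Making `lambda_fb` too large or `tau_fb` too long destabilizes the loop, which is the qualitative content of the stability condition mentioned in the abstract.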




Read also

Jaegon Um, Hyunggyu Park, 2012
The quantum entanglement $E$ of a bipartite quantum Ising chain is compared with the mutual information $I$ between the two parts after a local measurement of the classical spin configuration. As the model is conformally invariant, the entanglement measured in its ground state at the critical point is known to obey a certain scaling form. Surprisingly, the mutual information of classical spin configurations is found to obey the same scaling form, although with a different prefactor. Moreover, we find that the mutual information and the entanglement obey the inequality $I \leq E$ in the ground state as well as in a dynamically evolving situation. This inequality holds for general bipartite systems in a pure state and can be proven using techniques similar to those used for Holevo's bound.
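For the classical side of such a comparison, the mutual information between the two halves can be estimated directly from sampled spin configurations via $I(A;B) = H(A) + H(B) - H(AB)$. The sketch below is a generic plug-in estimator applied to toy bit strings, not the critical Ising data of the paper; the entanglement side requires the quantum state itself and is omitted.

```python
import numpy as np
from collections import Counter

def mutual_information(samples, cut):
    """Plug-in estimate of I(A;B) = H(A) + H(B) - H(AB) from sampled
    spin configurations, with the chain split at position `cut`."""
    def entropy(keys):
        counts = np.array(list(Counter(keys).values()), dtype=float)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()
    left  = [tuple(s[:cut]) for s in samples]
    right = [tuple(s[cut:]) for s in samples]
    whole = [tuple(s) for s in samples]
    return entropy(left) + entropy(right) - entropy(whole)

# Toy data: independent spins give I close to 0; copied halves give I = cut bits.
rng = np.random.default_rng(1)
indep = rng.integers(0, 2, size=(5000, 8))
copied = np.hstack([indep[:, :4], indep[:, :4]])
print(mutual_information(indep, 4), mutual_information(copied, 4))
```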
We study dynamical reversibility in stationary stochastic processes from an information-theoretic perspective. Extending earlier work on the reversibility of Markov chains, we focus on finitary processes with arbitrarily long conditional correlations. In particular, we examine stationary processes represented or generated by edge-emitting, finite-state hidden Markov models. Surprisingly, we find pervasive temporal asymmetries in the statistics of such stationary processes, with the consequence that the computational resources necessary to generate a process in the forward and reverse temporal directions are generally not the same. In fact, an exhaustive survey indicates that most stationary processes are irreversible. We study in detail the ensuing relations between model topology in different representations, the process's statistical properties, and its reversibility. A process's temporal asymmetry is efficiently captured using two canonical unifilar representations of the generating model, the forward-time and reverse-time epsilon-machines. We analyze example irreversible processes whose epsilon-machine presentations change size under time reversal, including one that has a finite number of recurrent causal states in one direction but an infinite number in the opposite. From the forward-time and reverse-time epsilon-machines, we are able to construct a symmetrized, but nonunifilar, generator of a process: the bidirectional machine. Using the bidirectional machine, we show how to directly calculate a process's fundamental information properties, many of which are otherwise only poorly approximated via process samples. The tools we introduce and the insights we offer provide a better understanding of the many facets of reversibility and irreversibility in stochastic processes.
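A toy way to see temporal asymmetry in word statistics, far simpler than the finitary hidden Markov processes and epsilon-machine constructions studied in the paper, is a three-symbol Markov chain with a cyclic bias: a word and its reversal can then have very different probabilities. The transition matrix below is made up purely for illustration.

```python
import numpy as np
from collections import Counter

# Toy irreversible process: a 3-symbol Markov chain biased toward the cycle
# 0 -> 1 -> 2 -> 0, so forward and reversed word statistics differ markedly.
rng = np.random.default_rng(2)
P = np.array([[0.2, 0.7, 0.1],
              [0.1, 0.2, 0.7],
              [0.7, 0.1, 0.2]])

def generate(n, state=0):
    seq = np.empty(n, dtype=int)
    for i in range(n):
        state = rng.choice(3, p=P[state])
        seq[i] = state
    return seq

seq, L = generate(200_000), 2
words = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
total = sum(words.values())
for w in [(0, 1), (0, 2)]:
    print(w, "forward", round(words[w] / total, 3),
          "reversed", round(words[w[::-1]] / total, 3))
```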
We study the finite-temperature behavior of the Lipkin-Meshkov-Glick model, with a focus on correlation properties as measured by the mutual information. The latter, which quantifies the amount of both classical and quantum correlations, is computed exactly in the two limiting cases of vanishing magnetic field and vanishing temperature. For all other situations, numerical results provide evidence of a finite mutual information at all temperatures except at criticality. There, it diverges as the logarithm of the system size, with a prefactor that can take only two values, depending on whether the critical temperature vanishes or not. Our work provides a simple example in which the mutual information appears as a powerful tool to detect finite-temperature phase transitions, in contrast to entanglement measures such as the concurrence.
We study the statistical properties of jump processes in a bounded domain that are driven by Poisson white noise. We derive the corresponding Kolmogorov-Feller equation and provide a general representation for its stationary solutions. Exact stationary solutions of this equation are found and analyzed in two particular cases. All our analytical findings are confirmed by numerical simulations.
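As a numerical counterpart to such an analytical treatment, the stationary density of a Poisson-driven jump process can be estimated by direct simulation. The sketch below assumes symmetric, exponentially distributed jump amplitudes and truncates jumps at the domain boundary; both are arbitrary illustrative choices, not necessarily the noise statistics or boundary condition analyzed in the paper.

```python
import numpy as np

# Illustrative simulation of a jump process on the bounded domain [-1, 1]
# driven by Poisson white noise: exponentially distributed waiting times,
# symmetric exponentially distributed jump amplitudes (assumed), and jumps
# truncated at the boundary (one arbitrary choice of boundary condition).
rng = np.random.default_rng(3)
rate, jump_scale, n_jumps = 5.0, 0.3, 200_000

x, positions, holds = 0.0, [], []
for _ in range(n_jumps):
    holds.append(rng.exponential(1.0 / rate))   # time spent at the current position
    positions.append(x)
    x += rng.exponential(jump_scale) * rng.choice([-1.0, 1.0])
    x = min(max(x, -1.0), 1.0)                  # keep the particle inside the domain

# Time-weighted histogram approximates the stationary probability density.
density, edges = np.histogram(positions, bins=40, range=(-1, 1),
                              weights=holds, density=True)
print(density[:5])
```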
We quantify the emergent complexity of quantum states near quantum critical points on regular 1D lattices, via complex network measures based on quantum mutual information as the adjacency matrix, in direct analogy to quantifying the complexity of EEG/fMRI measurements of the brain. Using matrix product state methods, we show that network density, clustering, disparity, and Pearson's correlation locate the critical point for both the quantum Ising and Bose-Hubbard models to a high degree of accuracy under finite-size scaling, for three classes of quantum phase transitions: $Z_2$, mean-field superfluid/Mott insulator, and a BKT crossover.
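The network measures named above have standard weighted-graph definitions once the mutual-information matrix $I_{ij}$ is treated as an adjacency matrix. The sketch below uses common textbook forms of density, disparity, and global clustering on a random symmetric matrix as filler; in the study the matrix would come from two-site reduced density matrices of the lattice ground state, and the exact conventions (and the Pearson correlation measure, omitted here) may differ.

```python
import numpy as np

# Weighted-network measures computed from a mutual-information matrix used
# as an adjacency matrix.  The matrix here is random filler, for illustration.
rng = np.random.default_rng(4)
A = rng.random((10, 10))
A = (A + A.T) / 2
np.fill_diagonal(A, 0.0)

strength = A.sum(axis=1)                               # node strengths
density = A.sum() / (len(A) * (len(A) - 1))            # mean edge weight
disparity = ((A**2).sum(axis=1) / strength**2).mean()  # Y_i averaged over nodes
triangles = np.trace(A @ A @ A)                        # weighted closed triples
triples = (A @ A).sum() - np.trace(A @ A)              # weighted connected triples
clustering = triangles / triples                       # global clustering coefficient

print(density, disparity, clustering)
```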