
Maxwell's demon and the management of ignorance in stochastic thermodynamics

Submitted by Ian Ford
Publication date: 2015
Research field: Physics
Paper language: English
Author: Ian J. Ford





It is nearly 150 years since Maxwell challenged the validity of the second law of thermodynamics by imagining a tiny creature who could sort the molecules of a gas in such a way as to decrease entropy without exerting any work. The demon has been discussed largely through thought experiments, but it has recently become possible to exert control over nanoscale systems, just as Maxwell imagined, and the status of the second law has become a more practical matter, raising the issue of how measurements manage our ignorance in a way that can be exploited. The framework of stochastic thermodynamics extends macroscopic concepts such as heat, work, entropy and irreversibility to small systems and allows us to explore the matter. Some arguments against a successful demon imply a second law that can be suspended indefinitely until we dissipate energy in order to remove the records of his operations. In contrast, under stochastic thermodynamics the demon fails because on average more work is performed upfront in making a measurement than can later be extracted by exploiting the outcome. This requires us to exclude systems and a demon that evolve under what might be termed self-sorting dynamics, and we reflect on the constraints on control that this implies while still working within a thermodynamic framework.
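For orientation, the bookkeeping behind this argument can be written in the standard information-thermodynamic form (the Sagawa-Ueda convention; these inequalities are conventional background, not equations taken from the paper itself):

\[
\langle W_{\mathrm{ext}} \rangle \;\le\; -\Delta F + k_{B}T\, I ,
\qquad
\langle W_{\mathrm{meas}} \rangle + \langle W_{\mathrm{erase}} \rangle \;\ge\; k_{B}T\, I ,
\]

where $I$ is the mutual information acquired by the measurement. Adding the two bounds gives a net work balance over a complete cycle no better than $-\Delta F$, consistent with the claim that, on average, the cost of acquiring the information is paid upfront and is at least as large as the work its exploitation can return.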




Read also

The first direct experimental replication of the Maxwell Demon thought experiment is outlined. The experiment determines the velocity/kinetic energy distribution of the particles in a sample by a novel interpretation of the results from a standard time-of-flight (TOF) small angle neutron scattering (SANS) procedure. Perspex at 293 K was subjected to neutrons at 82.2 K. The key result is a TOF velocity distribution curve that is a direct spatial and time-dependent microscopic probe of the velocity distribution of the Perspex nuclei at 293 K. Having this curve, one can duplicate the Demon's approach by selecting neutrons at known kinetic energies. One example is given: namely, two reservoirs -- hot and cold -- were generated from the 293 K source without disturbing its original 293 K energy distribution.
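As a toy illustration of the energy-selection step described in the abstract above (a sketch with arbitrary sample size and threshold, not the experiment's actual TOF analysis), one can split a Maxwell-Boltzmann ensemble at 293 K into "hot" and "cold" sub-populations by a kinetic-energy cut and check that each sub-population retains a well-defined mean energy:

import numpy as np

kB = 1.380649e-23    # Boltzmann constant, J/K
m_n = 1.674927e-27   # neutron mass, kg
T = 293.0            # source temperature, K
N = 1_000_000        # sample size (arbitrary, for illustration)

rng = np.random.default_rng(0)
# Each Cartesian velocity component is Gaussian with variance kB*T/m.
v = rng.normal(0.0, np.sqrt(kB * T / m_n), size=(N, 3))
E = 0.5 * m_n * np.sum(v**2, axis=1)   # kinetic energies, J

E_cut = 1.5 * kB * T                   # threshold = mean thermal energy
hot, cold = E[E > E_cut], E[E <= E_cut]

for name, sub in (("hot", hot), ("cold", cold)):
    T_eff = 2.0 * sub.mean() / (3.0 * kB)   # from <E> = (3/2) kB * T_eff
    print(f"{name:4s}: {sub.size} particles, effective temperature ~ {T_eff:.0f} K")

The "effective temperatures" here are simply mean energies rescaled by 3kB/2; the truncated sub-ensembles are not themselves Maxwellian, which is why the selection acts as a demon-like sorting rather than a true thermal split.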
In the case of fully chaotic systems the distribution of the Poincaré recurrence times is an exponential whose decay rate is the Kolmogorov-Sinai (KS) entropy. We address the same problem, the connection between dynamics and thermodynamics, in the case of sporadic randomness, using the Manneville map as a prototype of this class of processes. We explore the possibility of relating the distribution of Poincaré recurrence times to `thermodynamics', in the sense of the KS entropy, also in the case of an inverse power law. This is the dynamic property that Zaslavsky [Phys. Today (8), 39 (1999)] finds to be responsible for a striking deviation from ordinary statistical mechanics in the form of the Maxwell's Demon effect. We show that this way of establishing a connection between thermodynamics and dynamics is valid only in the case of strong chaos. In the case of sporadic randomness, resulting at long times in Lévy diffusion processes, the sensitivity to initial conditions is initially an inverse power law, but it becomes exponential on the long-time scale, whereas the distribution of Poincaré times keeps its inverse power law forever. We show that a nonextensive thermodynamics would imply the Maxwell's Demon effect to be determined by memory and thus to be temporary, in conflict with the dynamic approach to Lévy statistics. Heuristic arguments indicate that this effect is possible, as a form of genuine equilibrium, after completion of the process of memory erasure.
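Written schematically, the two behaviours contrasted in this abstract are (a paraphrase of its statement, not equations reproduced from the paper):

\[
\psi_{\mathrm{chaos}}(t) \;\propto\; e^{-h_{KS}\, t},
\qquad
\psi_{\mathrm{sporadic}}(t) \;\propto\; t^{-\mu}, \quad \mu > 1 ,
\]

where $\psi(t)$ is the Poincaré recurrence-time distribution and $h_{KS}$ the Kolmogorov-Sinai entropy; the claim is that reading thermodynamics off the decay of $\psi(t)$ is legitimate only in the exponential, strongly chaotic case.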
We propose and analyze a Maxwell's demon based on a single qubit with avoided level crossing. Its operation cycle consists of adiabatic drive to the point of minimum energy separation, measurement of the qubit state, and conditional feedback. We show that the heat extracted from the bath at temperature $T$ can ideally approach the Landauer limit of $k_B T \ln 2$ per cycle even in the quantum regime. Practical demon efficiency is limited by the interplay of Landau-Zener transitions and coupling to the bath. We suggest that an experimental demonstration of the demon is fully feasible using one of the standard superconducting qubits.
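To put a number on the Landauer limit quoted above, here is a quick back-of-the-envelope evaluation (the temperatures are illustrative choices, not values from the paper):

import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def landauer_heat(T):
    """Ideal heat extracted per demon cycle at bath temperature T: kB*T*ln(2)."""
    return kB * T * math.log(2)

# Illustrative temperatures: a millikelvin-range cryostat bath and room temperature.
for T in (0.05, 300.0):
    print(f"T = {T:7.2f} K  ->  kB*T*ln2 = {landauer_heat(T):.3e} J per cycle")

At room temperature this comes to roughly 3e-21 J per cycle, which makes clear why single-cycle demon experiments require very sensitive calorimetry or charge detection.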
J. Bergli, Y. M. Galperin, 2013
We study the entropy and information flow in a Maxwell demon device based on a single-electron transistor with controlled gate potentials. We construct the protocols for measuring the charge states and manipulating the gate voltages which minimize irreversibility for (i) constant input power from the environment or (ii) given energy gain. Charge measurement is modeled by a series of detector readouts for time-dependent gate potentials, and the amount of information obtained is determined. The protocols optimize the irreversibility that arises due to (i) enlargement of the configuration space on opening the barriers, and (ii) the finite rate of operation. These optimal protocols are general and apply to all systems where barriers between different regions can be manipulated.
David H. Wolpert, 2019
One of the major resource requirements of computers - ranging from biological cells to human brains to high-performance (engineered) computers - is the energy used to run them. Those costs of performing a computation have long been a focus of research in physics, going back to the early work of Landauer. One of the most prominent aspects of computers is that they are inherently nonequilibrium systems. However, the early research was done when nonequilibrium statistical physics was in its infancy, which meant the work was formulated in terms of equilibrium statistical physics. Since then there have been major breakthroughs in nonequilibrium statistical physics, which are allowing us to investigate the myriad aspects of the relationship between statistical physics and computation, extending well beyond the issue of how much work is required to erase a bit. In this paper I review some of this recent work on the `stochastic thermodynamics of computation'. After reviewing the salient parts of information theory, computer science theory, and stochastic thermodynamics, I summarize what has been learned about the entropic costs of performing a broad range of computations, extending from bit erasure to loop-free circuits to logically reversible circuits to information ratchets to Turing machines. These results reveal new, challenging engineering problems for how to design computers to have minimal thermodynamic costs. They also allow us to start to combine computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both.
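A concrete instance of the erasure-cost bookkeeping mentioned above is Landauer's bound for a biased bit. The sketch below uses the textbook generalization W_min = kB*T*ln2*H(p), with H(p) the Shannon entropy in bits, which is standard material rather than a result specific to this review:

import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def erasure_work_bound(p, T=300.0):
    """Minimum average work (J) to erase a bit that reads 1 with probability p,
    at bath temperature T: kB*T*ln(2) times the Shannon entropy H(p) in bits."""
    if p in (0.0, 1.0):
        H = 0.0  # a deterministic bit carries no entropy to erase
    else:
        H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return kB * T * math.log(2) * H

for p in (0.5, 0.9, 0.99):
    print(f"p = {p:4.2f}:  W_min = {erasure_work_bound(p):.3e} J")

The point of the example is that the thermodynamic cost tracks the statistical state of the bit, not the logical operation alone; a nearly deterministic bit is nearly free to erase.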