
Optimal Probes for Global Quantum Thermometry

Added by Wai-Keong Mok
Publication date: 2020
Field: Physics
Language: English





Quantum thermodynamics has emerged as a separate sub-discipline, revising the concepts and laws of thermodynamics at the quantum scale. In particular, there has been a disruptive shift in the way thermometry and thermometers are perceived and designed. Currently, we face two major challenges in quantum thermometry. First, all of the existing optimally precise temperature probes are local, meaning their operation is optimal only for a narrow range of temperatures. Second, the aforesaid optimal local probes mandate a complex energy spectrum with immense degeneracy, rendering them impractical. Here, we address these challenges by formalizing the notion of global thermometry, leading to the development of optimal temperature sensors that operate over a wide range of temperatures. We observe the emergence of different phases for such optimal probes as the temperature interval is increased. In addition, we show how the best approximation of optimal global probes can be realized in spin chains, implementable in ion traps and quantum dots.
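
For orientation, here is a small numerical sketch (ours, not from the paper; the function name and two-level spectrum are purely illustrative). For a probe at thermal equilibrium, the quantum Fisher information for temperature is $F(T) = \mathrm{Var}(H)/T^4$ (units with $k_B = 1$), so the Cramer-Rao bound gives $\delta T^2 \geq 1/F(T)$. The sketch shows why a single-gap probe, however degenerate, is "local": its sensitivity peaks near one temperature set by the gap.

import numpy as np

# Illustrative sketch: temperature sensitivity of an equilibrium probe.
# For a Gibbs state, the quantum Fisher information for temperature is
# F(T) = Var(H) / T^4 (units with k_B = 1).
def thermal_fisher_info(energies, degeneracies, T):
    E = np.asarray(energies, dtype=float)
    w = np.asarray(degeneracies, dtype=float) * np.exp(-E / T)
    p = w / w.sum()                      # Gibbs populations
    var_E = p @ E**2 - (p @ E) ** 2      # energy variance
    return var_E / T**4

# A unit gap with a g-fold degenerate excited level: sensitivity peaks near
# one temperature fixed by the gap, i.e. the probe is only locally optimal.
Ts = np.linspace(0.1, 3.0, 300)
for g in (1, 64):
    F = [thermal_fisher_info([0.0, 1.0], [1.0, g], T) for T in Ts]
    print(f"degeneracy g={g:3d}: peak sensitivity near T = {Ts[np.argmax(F)]:.2f}")
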



Related research

Quantum illumination is the task of determining the presence of an object in a noisy environment. We determine the optimal continuous-variable states for quantum illumination in the limit of zero object reflectivity. We prove that the optimal single-mode state is a coherent state, while the optimal two-mode state is the two-mode squeezed vacuum state. We find that these probes are not optimal at non-zero reflectivity, but remain near-optimal. This demonstrates the viability of the continuous-variable platform for an experimentally accessible, near-optimal quantum illumination implementation.
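
For reference, a standard textbook fact about the two-mode squeezed vacuum named in this abstract (the squeezing value below is chosen arbitrarily): it has the Schmidt form $|\mathrm{TMSV}\rangle = \sqrt{1-x}\,\sum_n x^{n/2} |n,n\rangle$ with $x = \tanh^2 r$, so tracing out one mode leaves a thermal photon-number distribution. A quick sketch checking its statistics:

import numpy as np

# Sketch: photon-number statistics of the two-mode squeezed vacuum (TMSV).
# Tracing out one mode leaves the thermal distribution p_n = (1 - x) x^n with
# x = tanh(r)^2, so the signal mode carries N_s = sinh(r)^2 mean photons.
r = 0.5                                   # squeezing parameter (arbitrary)
x = np.tanh(r) ** 2
n = np.arange(200)                        # truncated Fock space (ample here)
p = (1 - x) * x**n
print("mean photon number:", p @ n, "vs sinh(r)^2 =", np.sinh(r) ** 2)
print("entanglement entropy (nats):", -(p @ np.log(p)))
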
Precise thermometry for quantum systems is important to the development of new technology, and understanding the ultimate limits to precision presents a fundamental challenge. It is well known that optimal thermometry requires projective measurements of the total energy of the sample. However, this is infeasible in even moderately sized systems, where realistic energy measurements will necessarily involve some coarse graining. Here, we explore the precision limits for temperature estimation when only coarse-grained measurements are available. Utilizing tools from signal processing, we derive the structure of optimal coarse-grained measurements and find that good temperature estimates can generally be attained even with a small number of outcomes. We apply our results to many-body systems and nonequilibrium thermometry. For the former, we focus on interacting spin lattices, both at and away from criticality, and find that the Fisher-information scaling with system size is unchanged after coarse-graining. For the latter, we consider a probe of given dimension interacting with the sample, followed by a measurement of the probe. We derive an upper bound on arbitrary, nonequilibrium strategies for such probe-based thermometry and illustrate it for thermometry on a Bose-Einstein condensate using an atomic quantum-dot probe.
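
A toy numerical illustration of the coarse-graining idea (ours; the paper derives the optimal measurement structure, whereas this sketch simply bins a Gibbs energy distribution into equal-probability outcomes): for a thermal state, $dp_n/dT = p_n (E_n - \langle E \rangle)/T^2$, so the Fisher information of any binned measurement follows from bin sums.

import numpy as np

# Toy model: Fisher information for temperature retained after coarse-graining
# an energy measurement.  For a Gibbs state, dp_n/dT = p_n (E_n - <E>) / T^2,
# and binned outcome probabilities (and their T-derivatives) are bin sums.
T = 1.0
E = np.linspace(0.0, 10.0, 1000)          # toy dense spectrum
p = np.exp(-E / T); p /= p.sum()
dp = p * (E - p @ E) / T**2

F_full = np.sum(dp**2 / p)                # = Var(E)/T^4, the optimal value

for n_bins in (2, 4, 8, 16):
    # Equal-probability (quantile) bins -- simple, not claimed to be optimal.
    bins = np.minimum((np.cumsum(p) * n_bins).astype(int), n_bins - 1)
    P  = np.bincount(bins, weights=p,  minlength=n_bins)
    dP = np.bincount(bins, weights=dp, minlength=n_bins)
    print(f"{n_bins:2d} outcomes retain {np.sum(dP**2 / P) / F_full:.1%} of F(T)")
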
What is the minimum time required to take the temperature? In this paper, we solve this question for any process where temperature is inferred by measuring a probe (the thermometer) weakly coupled to the sample of interest, so that the probe's evolution is well described by a quantum Markovian master equation. Considering the most general control strategy on the probe (adaptive measurements, arbitrary control of the probe's state and Hamiltonian), we provide bounds on the achievable measurement precision in a finite amount of time, and show that in many scenarios these fundamental limits can be saturated with a relatively simple experiment. We find that for a general class of sample-probe interactions the scaling of the measurement uncertainty is inversely proportional to the time of the process, a shot-noise-like behaviour that arises due to the dissipative nature of thermometry. As a side result, we show that the Lamb shift induced by the probe-sample interaction can play a relevant role in thermometry, allowing for finite measurement resolution in the low-temperature regime (more precisely, the measurement uncertainty decays polynomially with the temperature as $T \rightarrow 0$, in contrast to the usual exponential decay with $T^{-1}$). We illustrate these general results for (i) a qubit probe interacting with a bosonic sample, where the role of the Lamb shift is highlighted, and (ii) a collective superradiant coupling between an $N$-qubit probe and a sample, which enables a quadratic ($1/N^2$) decay of the measurement uncertainty.
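
A minimal toy instance of the time scaling (an assumed model, far simpler than the paper's general adaptive setting): a single qubit probe relaxing toward equilibrium at a fixed rate $\gamma$, read out after every interrogation of length $\tau$, yields $t/\tau$ independent shots in total time $t$, so the error variance obeys $\delta T^2 \geq \tau / (t F_1)$, falling as $1/t$.

import numpy as np

# Toy model: qubit probe with gap `gap` relaxing toward its thermal excited
# population at rate gamma (a simplification; the true rate can depend on T).
gap, T, gamma, tau = 1.0, 1.0, 1.0, 1.0

def p_exc(T_, tau_):
    """Excited population after time tau_, starting from the ground state."""
    return (1.0 - np.exp(-gamma * tau_)) / (1.0 + np.exp(gap / T_))

def fisher_per_shot(tau_, dT=1e-6):
    """Fisher information of one binary (excited/ground) readout."""
    dp = (p_exc(T + dT, tau_) - p_exc(T - dT, tau_)) / (2 * dT)
    pe = p_exc(T, tau_)
    return dp**2 / (pe * (1.0 - pe))      # Bernoulli Fisher information

for t in (10, 100, 1000):                 # total available time
    shots = t / tau
    print(f"t={t:5d}: delta-T >= {1.0 / np.sqrt(shots * fisher_per_shot(tau)):.4f}")
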
The development of a future, global quantum communication network (or quantum internet) will enable high-rate private communication and entanglement distribution over very long distances. However, the large-scale performance of ground-based quantum networks (which employ photons as information carriers through optical fibres) is fundamentally limited by the fibre quality and link length. While these fundamental limits are well established for arbitrary network architectures, the question of how best to design these global architectures remains open. In this work, we take a step forward in addressing this problem by modelling global quantum networks with weakly regular architectures. Such networks are capable of idealising end-to-end performance whilst remaining sufficiently realistic. In this way, we may investigate the effectiveness of large-scale networks with consistent connective properties, and unveil the global conditions under which end-to-end rates remain analytically computable. Furthermore, by comparing the performance of ideal, ground-based quantum networks with satellite quantum communication protocols, we can establish conditions under which satellites can be used to outperform fibre-based quantum infrastructure.
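
For concreteness (standard results, not specific to this abstract): the fundamental repeaterless limit for a lossy optical channel is the PLOB bound, $C(\eta) = -\log_2(1-\eta)$ secret-key bits per use, with transmissivity $\eta = 10^{-\alpha L/10}$ for fibre attenuation $\alpha$ in dB/km, and an ideal repeater chain is capped by its weakest intermediate link. A quick sketch (function name ours):

import math

# Repeaterless (PLOB) capacity of a lossy channel: C = -log2(1 - eta) bits/use,
# with transmissivity eta = 10^(-alpha*L/10) for fibre attenuation alpha (dB/km).
def plob_rate(L_km, alpha_db_per_km=0.2):
    eta = 10 ** (-alpha_db_per_km * L_km / 10)
    return -math.log2(1 - eta)

# An end-to-end chain of ideal repeater nodes is capped by its weakest link,
# which is why architecture (link lengths, connectivity) dominates performance.
print("direct 500 km fibre:      ", plob_rate(500.0), "bits/use")
print("weakest 50 km link (x10): ", plob_rate(50.0),  "bits/use")
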
We seek the optimal strategy to infer the width $a$ of an infinite potential well by performing measurements on the particle(s) contained in the well. In particular, we address quantum estimation theory as the proper framework to formulate the problem and find the optimal quantum measurement, as well as to evaluate the ultimate bounds on precision. Our results show that in a static framework the best strategy is to measure position on a delocalized particle, corresponding to a width-independent quantum signal-to-noise ratio (QSNR), which increases with delocalisation. Upon considering time evolution inside the well, we find that the QSNR increases as $t^2$. On the other hand, it decreases with $a$, and thus time evolution is a metrological resource only when the width is not too large compared to the available evolution time. Finally, we consider entangled probes placed into the well and observe super-additivity of the QSNR: it is the sum of the single-particle QSNRs, plus a positive-definite term, which depends on their preparation and may increase with the number of entangled particles. Overall, entanglement represents a resource for the precise characterization of potential wells.
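
A quick numerical check of the width-independence claim (our toy, assuming the ground state $\psi(x) = \sqrt{2/a}\,\sin(\pi x/a)$ and defining the QSNR as $a^2 F(a)$, with $F$ the Fisher information of a position measurement):

import numpy as np

# Toy check: QSNR for estimating the well width a from a position measurement
# on the ground state, p(x|a) = (2/a) sin^2(pi x / a) on (0, a).  Because
# p(x|a) = f(x/a)/a, the Fisher information scales as 1/a^2, so a^2 F(a)
# (the QSNR for the bound delta-a^2 >= 1/F) is independent of the width a.
def qsnr(a, n=200_000, da=1e-6):
    x = (np.arange(n) + 0.5) * (a / n)             # midpoint grid on (0, a)
    p = lambda w: (2.0 / w) * np.sin(np.pi * x / w) ** 2
    dp = (p(a + da) - p(a - da)) / (2 * da)        # numerical dp/da
    return a**2 * np.sum(dp**2 / p(a)) * (a / n)   # a^2 * classical Fisher info

print(qsnr(1.0), qsnr(5.0))                        # ~equal: width-independent
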
