
Fundamental limits in Bayesian thermometry and attainability via adaptive strategies

Posted by Mohammad Mehboudi
Publication date: 2021
Research field: Physics
Paper language: English





We investigate the limits of thermometry using quantum probes at thermal equilibrium within the Bayesian approach. We consider the possibility of engineering interactions between the probes in order to enhance their sensitivity, as well as feedback during the measurement process, i.e., adaptive protocols. On the one hand, we obtain an ultimate bound on thermometry precision in the Bayesian setting, valid for arbitrary interactions and measurement schemes, which lower bounds the error with a quadratic (Heisenberg-like) scaling with the number of probes. We develop a simple adaptive strategy that can saturate this limit. On the other hand, we derive a no-go theorem for non-adaptive protocols that does not allow for better than linear (shot-noise-like) scaling even if one has unlimited control over the probes, namely access to arbitrary many-body interactions.
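As a toy illustration of the shot-noise-like scaling that the no-go theorem refers to, one can compute the thermal Fisher information of a single non-interacting qubit probe (a standard textbook quantity, not the paper's derivation) and apply the Cramér-Rao bound for N independent probes; the gap ω = 1 and temperature are arbitrary illustrative choices:

```python
import numpy as np

def thermal_fisher_qubit(T, omega=1.0):
    """Fisher information about T carried by a thermal qubit with energy gap omega.

    Excited-state population: p = 1 / (1 + exp(omega / T)); for a binary
    measurement outcome, F(T) = (dp/dT)^2 / (p (1 - p)).
    """
    p = 1.0 / (1.0 + np.exp(omega / T))
    dp_dT = (omega / T**2) * np.exp(omega / T) * p**2  # d p / d T
    return dp_dT**2 / (p * (1.0 - p))

T = 0.5
F1 = thermal_fisher_qubit(T)
# Cramér-Rao bound for N independent, non-interacting probes:
# delta T >= 1 / sqrt(N * F1), so the error shrinks only as 1/sqrt(N) --
# the linear (shot-noise-like) scaling of the Fisher information.
for N in (1, 100, 10000):
    print(N, 1.0 / np.sqrt(N * F1))
```

Saturating the quadratic (Heisenberg-like) bound instead requires engineered interactions or the adaptive feedback discussed in the abstract, which this single-probe sketch deliberately omits.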




Read also

The maximum possible throughput (or the rate of job completion) of a multi-server system is typically the sum of the service rates of individual servers. Recent work shows that launching multiple replicas of a job and canceling them as soon as one copy finishes can boost the throughput, especially when the service time distribution has high variability. This means that redundancy can, in fact, create synergy among servers such that their overall throughput is greater than the sum of individual servers. This work seeks to find the fundamental limit of the throughput boost achieved by job replication and the optimal replication policy to achieve it. While most previous works consider upfront replication policies, we expand the set of possible policies to delayed launch of replicas. The search for the optimal adaptive replication policy can be formulated as a Markov Decision Process, from which we propose two myopic replication policies, MaxRate and AdaRep, to adaptively replicate jobs. In order to quantify the optimality gap of these and other policies, we derive upper bounds on the service capacity, which provide fundamental limits on the throughput of queueing systems with redundancy.
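The synergy effect described above can be checked with a small Monte Carlo sketch (illustrative only; the hyperexponential parameters are made up, and queueing dynamics are reduced to mean service times): with a high-variability service time, running each job redundantly on two servers and cancelling the loser gives E[min(S1, S2)] < E[S]/2, so replicated throughput 1/E[min] beats the unreplicated 2/E[S]:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def hyperexp(size):
    """Hyperexponential service times: mostly fast, occasionally very slow."""
    fast = rng.exponential(0.1, size)    # mean 0.1, drawn with prob 0.9
    slow = rng.exponential(10.0, size)   # mean 10,  drawn with prob 0.1
    return np.where(rng.random(size) < 0.9, fast, slow)

s1, s2 = hyperexp(n), hyperexp(n)

thr_no_rep = 2.0 / s1.mean()               # two servers serving distinct jobs
thr_rep = 1.0 / np.minimum(s1, s2).mean()  # both servers run each job; first finisher wins

print(f"no replication: {thr_no_rep:.2f} jobs per unit time")
print(f"replication:    {thr_rep:.2f} jobs per unit time")
```

For exponential service times the two policies tie (E[min] = E[S]/2 exactly), which is why high variability is the key ingredient in the abstract's claim.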
We introduce a general framework for thermometry based on collisional models, where ancillas probe the temperature of the environment through an intermediary system. This allows for the generation of correlated ancillas even if they are initially independent. Using tools from parameter estimation theory, we show through a minimal qubit model that individual ancillas can already outperform the thermal Cramér-Rao bound. In addition, due to the steady-state nature of our model, when measured collectively the ancillas always exhibit superlinear scalings of the Fisher information. This means that even collective measurements on pairs of ancillas will already lead to an advantage. As we find in our qubit model, such a feature may be particularly valuable for weak system-ancilla interactions. Our approach sets forth the notion of metrology in a sequential-interaction setting, and may inspire further advances in quantum thermometry.
Controlling and measuring the temperature in different devices and platforms that operate in the quantum regime is, without any doubt, essential for any potential application. In this review, we report the most recent theoretical developments dealing with accurate estimation of very low temperatures in quantum systems. Together with the emerging experimental techniques and developments of measurement protocols, the theory of quantum thermometry will decisively impinge and shape the forthcoming quantum technologies. While current quantum thermometric methods differ greatly depending on the experimental platform, the achievable precision, and the temperature range of interest, the theory of quantum thermometry is built under a unifying framework at the crossroads of quantum metrology, open quantum systems, and quantum many-body physics. At a fundamental level, theoretical quantum thermometry is concerned with finding the ultimate bounds and scaling laws that limit the precision of temperature estimation for systems in and out-of-thermal equilibrium. At a more practical level, it provides tools to formulate precise, yet feasible, thermometric protocols for relevant experimental architectures. Last but not least, the theory of quantum thermometry examines genuine quantum features, like entanglement and coherence, for their exploitation in enhanced-resolution thermometry.
As the miniaturization of temperature-sensitive electronic devices grows apace, sensing temperature with ever smaller probes is more important than ever. Genuinely quantum mechanical schemes of thermometry are thus expected to be crucial to future technological progress. We propose a new method to measure the temperature of a bath using the weak measurement scheme with a finite-dimensional probe. The precision offered by the present scheme not only shows similar qualitative features as the usual quantum-Fisher-information-based thermometric protocols, but also allows for flexibility in setting the optimal thermometric window through judicious choice of post-selection measurements.
Both experimental and computational methods for the exploration of structure, functionality, and properties of materials often necessitate the search across broad parameter spaces to discover optimal experimental conditions and regions of interest in the image space or parameter space of computational models. The direct grid search of the parameter space tends to be extremely time-consuming, leading to the development of strategies balancing exploration of unknown parameter spaces and exploitation towards required performance metrics. However, classical Bayesian optimization strategies based on the Gaussian process (GP) do not readily allow for the incorporation of known physical behaviors or past knowledge. Here we explore a hybrid optimization/exploration algorithm created by augmenting the standard GP with a structured probabilistic model of the expected system's behavior. This approach balances the flexibility of the non-parametric GP approach with the rigid structure of physical knowledge encoded into the parametric model. The fully Bayesian treatment of the latter allows additional control over the optimization via the selection of priors for the model parameters. The method is demonstrated for a noisy version of the classical objective function used to evaluate optimization algorithms and further extended to physical lattice models. This methodology is expected to be universally suitable for injecting prior knowledge in the form of physical models and past data in the Bayesian optimization framework.
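A minimal sketch of the hybrid idea, assuming (for illustration only) a 1-D noisy objective, a quadratic parametric "physics" model fitted by least squares, a zero-mean GP on the residuals, and a lower-confidence-bound acquisition rule; none of these specific choices come from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    """Noisy toy objective with a known minimum at x = 2."""
    return (x - 2.0) ** 2 + 0.05 * rng.standard_normal(np.shape(x))

def rbf(a, b, ell=0.5):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

grid = np.linspace(0.0, 5.0, 201)
X = list(rng.uniform(0.0, 5.0, 4))           # initial random evaluations
y = [float(objective(x)) for x in X]

for _ in range(15):
    Xa, ya = np.array(X), np.array(y)
    # 1. structured parametric part: least-squares quadratic fit
    A = np.vander(Xa, 3)
    coef, *_ = np.linalg.lstsq(A, ya, rcond=None)
    mean_param = np.vander(grid, 3) @ coef
    # 2. non-parametric GP on the residuals captures what the model misses
    resid = ya - A @ coef
    K = rbf(Xa, Xa) + 1e-4 * np.eye(len(Xa))
    Ks = rbf(grid, Xa)
    mu = mean_param + Ks @ np.linalg.solve(K, resid)
    var = 1.0 - np.einsum("ij,ij->i", Ks, np.linalg.solve(K, Ks.T).T)
    # 3. lower-confidence-bound acquisition (we are minimizing)
    x_next = grid[np.argmin(mu - 2.0 * np.sqrt(np.clip(var, 0.0, None)))]
    X.append(x_next)
    y.append(float(objective(x_next)))

best = X[int(np.argmin(y))]
print(f"best point found: {best:.2f} (true optimum at 2.0)")
```

The fully Bayesian treatment described in the abstract would additionally place priors on the parametric coefficients instead of the point least-squares fit used here.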