
Systematic errors in direct state measurements with quantum controlled measurements

Posted by: Le Ho Bin
Publication date: 2020
Research field: Physics
Paper language: English
Authors: Le Bin Ho





The von Neumann measurement framework describes a dynamic interaction between a target system and a probe. In contrast, a quantum controlled measurement framework uses a qubit probe to control the actions of different operators on the target system, which is convenient for establishing universal quantum computation. In this work, we use a quantum controlled measurement framework to measure quantum states directly. We introduce two types of quantum controlled measurement framework and investigate the systematic error (the bias between the true value and the estimated value) caused by each type. We numerically evaluate the systematic errors and the confidence regions, and we examine the effect of experimental noise arising from imperfect detection. Our analysis has important applications in direct quantum state tomography.
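To make the setting concrete, here is a minimal numerical sketch of one controlled-measurement scheme (our own construction for illustration; the paper's two framework types may differ in detail). A probe qubit controls the projector Pi_n = |n><n| through the coupling U = exp(-i*theta*Pi_n ⊗ sigma_y); after postselecting the target on a uniform superposition, the probe's <sigma_x> + i<sigma_y> is proportional to the amplitude psi_n only to first order in theta, so the reconstruction carries exactly the kind of coupling-dependent systematic error the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
psi = rng.normal(size=d) + 1j * rng.normal(size=d)
psi /= np.linalg.norm(psi)                      # true state to reconstruct

p0 = np.ones(d) / np.sqrt(d)                    # postselection state |p0>
ket0 = np.array([1.0, 0.0])                     # probe qubit starts in |0>

def dsm_estimate(theta):
    """Reconstruct psi from probe readouts at coupling strength theta."""
    # exp(-i*theta*sigma_y): rotates the probe only when Pi_n fires
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    est = np.zeros(d, dtype=complex)
    for n in range(d):
        Pi = np.zeros((d, d)); Pi[n, n] = 1.0
        # controlled coupling: identity on the target outside |n><n|
        U = np.kron(np.eye(d) - Pi, np.eye(2)) + np.kron(Pi, R)
        joint = (U @ np.kron(psi, ket0)).reshape(d, 2)
        v = p0.conj() @ joint                   # unnormalized postselected probe
        est[n] = 2 * np.conj(v[0]) * v[1]       # = <sigma_x> + i<sigma_y>
    est /= np.linalg.norm(est)
    est *= np.exp(-1j * np.angle(np.vdot(psi, est)))  # fix the global phase
    return est

for theta in (0.5, 0.05):
    bias = np.linalg.norm(dsm_estimate(theta) - psi)
    print(f"theta = {theta:4.2f}: systematic error |est - psi| = {bias:.4f}")
```

Shrinking theta reduces the bias, but the estimator's magnitude also scales with sin(theta), so the postselection signal weakens; balancing this trade-off is the practical question such schemes face.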




Read also

Direct state measurement (DSM) is a tomography method that retrieves a quantum state's wave function directly. However, a shortcoming of current studies on DSM is that they do not address noisy quantum systems. Here, we attempt to fill the gap by investigating the precision of DSM in the presence of state-preparation-and-measurement (SPAM) errors. We employ a quantum controlled measurement framework with various configurations and compare their efficiency. Under such SPAM errors, the state to be measured deviates slightly from the true state, and the measurement error in the postselection process reduces the accuracy of the tomography. Our study could provide a reliable tool for tomography under SPAM errors and contribute to understanding and resolving an urgent demand of current quantum technologies.
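As a toy illustration of how SPAM errors enter (our own model choices, not the configurations compared in the paper), the sketch below repeats the controlled-measurement reconstruction from the previous example in density-matrix form, with the prepared state mixed with a little white noise and the postselection performed on a slightly tilted state:

```python
import numpy as np

rng = np.random.default_rng(2)
d, theta = 4, 0.05
psi = rng.normal(size=d) + 1j * rng.normal(size=d)
psi /= np.linalg.norm(psi)

p0 = np.ones(d) / np.sqrt(d)
tilt = rng.normal(size=d)                        # fixed direction of the tilt
ket0 = np.array([1.0, 0.0])
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def reconstruct(eps, delta):
    """DSM error with preparation noise eps and postselector tilt delta."""
    rho = (1 - eps) * np.outer(psi, psi.conj()) + eps * np.eye(d) / d
    p = p0 + delta * tilt; p /= np.linalg.norm(p)
    est = np.zeros(d, dtype=complex)
    for n in range(d):
        Pi = np.zeros((d, d)); Pi[n, n] = 1.0
        U = np.kron(np.eye(d) - Pi, np.eye(2)) + np.kron(Pi, R)
        rho_joint = U @ np.kron(rho, np.outer(ket0, ket0)) @ U.conj().T
        M = np.kron(p.conj()[None, :], np.eye(2))   # postselect target on |p>
        rho_probe = M @ rho_joint @ M.conj().T
        est[n] = 2 * rho_probe[1, 0]                # <sigma_x> + i<sigma_y>
    est /= np.linalg.norm(est)
    est *= np.exp(-1j * np.angle(np.vdot(psi, est)))
    return np.linalg.norm(est - psi)

print(f"no SPAM errors  : {reconstruct(0.00, 0.0):.4f}")
print(f"with SPAM errors: {reconstruct(0.05, 0.1):.4f}")
```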
Quantum magnetic field sensing is an important technology for material science and biology. Although experimental imperfections affect the sensitivity, repeating the measurement decreases the estimation uncertainty by the square root of the total number of measurements when only statistical errors are present. However, it is difficult to precisely characterize the coherence time of the system, because it fluctuates in time under realistic conditions, and this induces systematic errors. In this case, owing to the residual bias of the measured values, the estimation uncertainty cannot be lowered below a finite value even in the limit of infinitely many measurements. Based on the fact that the decoherence dynamics in the so-called Zeno regime are not significant compared with other regimes, we propose a novel but very simple protocol that uses measurements in the Zeno regime to reduce systematic errors. Our scheme allows the estimation uncertainty $\delta^2\omega$ to scale as $L^{-1/4}$, where $L$ denotes the number of measurements, even when we cannot precisely characterize the coherence time.
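The bias mechanism is easy to reproduce numerically. In the toy model below (our construction, not the proposed protocol), a frequency omega is estimated from a Ramsey-type signal P(t) = [1 + e^{-t/T_2} sin(omega*t)]/2 by inverting the model with an assumed T_2 while the true T_2 fluctuates from run to run. At long interrogation times the mismatch leaves an error floor that more shots cannot remove; at short times, deep in the Zeno regime, the decay factor is close to one and the floor nearly vanishes:

```python
import numpy as np

rng = np.random.default_rng(0)
w_true, T2_assumed, shots = 1.0, 10.0, 10**6

def estimate_w(t):
    T2_true = T2_assumed * (1 + 0.2 * rng.normal())   # imperfectly known T2
    p = (1 + np.exp(-t / T2_true) * np.sin(w_true * t)) / 2
    p_hat = rng.binomial(shots, p) / shots            # finite-shot statistics
    s = (2 * p_hat - 1) * np.exp(t / T2_assumed)      # undo the *assumed* decay
    return np.arcsin(np.clip(s, -1.0, 1.0)) / t

for t in (1.4, 0.1):                                  # long time vs Zeno regime
    ests = np.array([estimate_w(t) for _ in range(500)])
    print(f"t = {t:3.1f}: bias = {ests.mean() - w_true:+.4f}, "
          f"spread = {ests.std():.4f}")
```

With the same number of shots per run, the long-time estimate is dominated by the T2-induced systematic spread, while the short-time estimate is limited mainly by shot noise, which repetition can still average down.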
Yi-Hsiang Chen (2021)
The quantum Zeno effect is well known for pinning a system to an eigenstate through frequent measurements. It is also known that applying frequent unitary pulses induces a Zeno subspace that can likewise pin the system to an eigenspace. Both approaches have been studied as means of maintaining a system in a certain subspace. Extending the two concepts, we consider making the measurements/pulses dynamical, so that the state can move with the motion of the measurement axis/pulse basis. We show that the system stays in the dynamical eigenbasis when the measurements/pulses change slowly. Explicit bounds on the application rate that guarantee a given success probability are provided. In addition, both methods are inherently resilient against non-Markovian noise. Finally, we discuss the similarities and differences between the two methods and their connection to adiabatic quantum computation.
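A back-of-the-envelope check of the dragging picture (our simplification, ignoring free dynamics and noise): if the measurement axis rotates by a total angle Phi in N equal steps and the qubit is projected onto the instantaneous eigenstate at every step, each projection succeeds with probability cos^2(Phi/2N), so the probability of following the axis all the way is cos^{2N}(Phi/2N), which tends to 1 as N grows:

```python
import numpy as np

def drag_success(N, total_angle=np.pi / 2):
    """Probability that N equally spaced projections drag the qubit
    through total_angle without ever leaving the measurement eigenstate."""
    step = total_angle / N
    return np.cos(step / 2) ** (2 * N)

for N in (5, 50, 500):
    print(f"N = {N:3d} measurements: success probability = {drag_success(N):.4f}")
```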
Estimation of quantum states and measurements is crucial for the implementation of quantum information protocols. The standard method for each is quantum tomography. However, quantum tomography suffers from systematic errors caused by imperfect knowledge of the system. We present a procedure to simultaneously characterize quantum states and measurements that mitigates systematic errors by using a single high-fidelity state preparation and a limited set of high-fidelity unitary operations. Such states and operations are typical of many state-of-the-art systems. For this situation we design a set of experiments and an optimization algorithm that alternates between maximizing the likelihood with respect to the states and the measurements to produce estimates of each. In some cases, the procedure does not enable unique estimation of the states. For these cases, we show how one may identify a set of density matrices compatible with the measurements and use a semidefinite program to place bounds on the states' expectation values. We demonstrate the procedure on data from a simulated experiment with two trapped ions.
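A heavily stripped-down sketch of the calibration idea (ours; it uses plain linear least squares in the Pauli basis rather than the authors' alternating maximum-likelihood optimization): a single trusted preparation |0><0| rotated by trusted unitaries first calibrates an unknown two-outcome qubit measurement, and the calibrated effect then serves to estimate an unknown state:

```python
import numpy as np

rng = np.random.default_rng(3)
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [I2, sx, sy, sz]
shots = 200_000

def pauli_coords(A):
    """Real coefficients of Tr[A P_i] for the Pauli basis P_i."""
    return np.array([np.real(np.trace(A @ P)) for P in paulis])

# trusted preparation |0><0| rotated by trusted single-qubit unitaries
ket0 = np.array([1, 0], dtype=complex)
Us = [np.cos(t / 2) * I2
      - 1j * np.sin(t / 2) * (np.cos(2 * t) * sx + np.sin(2 * t) * sy)
      for t in np.linspace(0.3, 2.8, 8)]
probes = [U @ np.outer(ket0, ket0.conj()) @ U.conj().T for U in Us]

# Step 1: calibrate the unknown POVM effect E from the trusted probes
E_true = 0.45 * I2 + 0.10 * sx + 0.35 * sz          # satisfies 0 <= E <= I
freqs = [rng.binomial(shots, np.real(np.trace(E_true @ r))) / shots
         for r in probes]
A = np.array([pauli_coords(r) for r in probes])     # Tr[E r] = sum_i e_i Tr[P_i r]
e = np.linalg.lstsq(A, freqs, rcond=None)[0]
E_hat = sum(c * P for c, P in zip(e, paulis))

# Step 2: estimate an unknown state with the *calibrated* measurement,
# probing it through the same trusted unitaries
sigma = 0.5 * I2 + 0.30 * sx - 0.20 * sy + 0.10 * sz
freqs2 = [rng.binomial(shots,
                       np.real(np.trace(E_true @ U @ sigma @ U.conj().T))) / shots
          for U in Us]
B = np.array([pauli_coords(U.conj().T @ E_hat @ U) for U in Us])
s = np.linalg.lstsq(B, freqs2, rcond=None)[0]
sigma_hat = sum(c * P for c, P in zip(s, paulis))
sigma_hat /= np.real(np.trace(sigma_hat))           # enforce unit trace

print("measurement error:", np.linalg.norm(E_hat - E_true))
print("state error      :", np.linalg.norm(sigma_hat - sigma))
```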
Lapo Fanciullo (2020)
The thermal emission of dust is one of the most important tracers of the interstellar medium: multi-wavelength photometry in the far-infrared (FIR) and submillimeter (submm) can be fitted with a model, providing estimates of the dust mass. The fit results depend on the assumed value of the FIR/submm opacity, which in most models is extrapolated from shorter wavelengths, owing to the scarcity, until recently, of experimental measurements. Lab measurements of dust analogues, however, show that FIR opacities are usually higher than the values used in models and depend on temperature, which suggests that dust mass estimates may be biased. To test the extent of this bias, we create multi-wavelength synthetic photometry for dusty galaxies at different temperatures and redshifts, using experimental results for the FIR/submm dust opacity, and then fit the synthetic data with standard dust models. We find that the dust masses recovered by typical models are overestimated by a factor of 2 to 20, depending on how the experimental opacities are treated. If the experimental dust samples are accurate analogues of interstellar dust, current dust masses are therefore overestimated by up to a factor of 20. The implications for our understanding of dust, both Galactic and at high redshift, are discussed.
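The bias can be reproduced with a toy fit (our illustrative numbers and a constant opacity offset, rather than the paper's temperature-dependent lab opacities): synthesize fluxes from an optically thin modified blackbody whose opacity is three times the model default, then fit with the standard value; the fitted mass absorbs the mismatch:

```python
import numpy as np
from scipy.optimize import curve_fit

h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def planck(nu, T):
    return 2 * h * nu**3 / c**2 / np.expm1(h * nu / (k * T))

def mbb_flux(nu, M, T, kappa0, beta=1.8, nu0=857e9):
    # optically thin modified blackbody, in arbitrary flux units;
    # the distance is folded into the units of the mass M
    return 1e15 * M * kappa0 * (nu / nu0) ** beta * planck(nu, T)

# synthetic FIR/submm photometry with a lab-like opacity ~3x the model value
bands_um = np.array([100.0, 160.0, 250.0, 350.0, 500.0, 850.0])
nu = c / (bands_um * 1e-6)
M_true, T_true, kappa_lab, kappa_model = 1.0, 25.0, 3.0, 1.0
flux = mbb_flux(nu, M_true, T_true, kappa_lab)

# fit with the standard (lower) opacity: the mass soaks up the mismatch
model = lambda nu, M, T: mbb_flux(nu, M, T, kappa_model)
(M_fit, T_fit), _ = curve_fit(model, nu, flux, p0=[1.0, 20.0])
print(f"true mass {M_true:.1f} -> fitted mass {M_fit:.2f} "
      f"(overestimated x{M_fit / M_true:.1f}), fitted T = {T_fit:.1f} K")
```

Because mass and opacity enter only through the product M*kappa, the fit recovers the temperature but overestimates the mass by exactly the opacity ratio here; the paper's factor of 2 to 20 arises from the richer temperature and wavelength dependence of the measured opacities.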