We propose a machine learning framework for parameter estimation of single-mode Gaussian quantum states. Under a Bayesian framework, our approach estimates the parameters of suitable prior distributions from measured data. For phase-space displacement and squeezing parameter estimation, this is achieved by introducing Expectation-Maximization (EM) based algorithms, while for phase parameter estimation an empirical Bayes method is applied. The estimated prior distribution parameters, together with the observed data, are used to find the optimal Bayesian estimate of the unknown displacement, squeezing and phase parameters. Our simulation results show that the proposed algorithms have estimation performance very close to that of Genie-Aided Bayesian estimators, which assume perfect knowledge of the prior parameters. Our proposed methods can be used by experimentalists to find the optimal Bayesian estimate of the parameters of Gaussian quantum states using only the observed measurements, without requiring any knowledge of the prior distribution parameters.
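As a rough illustration of this kind of EM-plus-empirical-Bayes workflow (a minimal sketch under assumptions of our own, not the exact model of the abstract): suppose each shot carries a displacement drawn from a zero-mean Gaussian prior with unknown variance, and the measurement adds known Gaussian noise. EM learns the prior variance from the data alone, and the learned prior then yields the MMSE (posterior-mean) estimate of each displacement.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model: x_i = d_i + n_i, with displacement d_i ~ N(0, s_p)
# (s_p unknown) and measurement noise n_i ~ N(0, s_n) (s_n known).
s_p_true, s_n = 2.0, 0.5
n_shots = 5000
d = rng.normal(0.0, np.sqrt(s_p_true), n_shots)
x = d + rng.normal(0.0, np.sqrt(s_n), n_shots)

# EM for the unknown prior variance s_p.
s_p = 1.0                                    # initial guess
for _ in range(100):
    # E-step: posterior of d_i given x_i is Gaussian (conjugate model).
    post_var = 1.0 / (1.0 / s_p + 1.0 / s_n)
    post_mean = post_var * x / s_n
    # M-step: update the prior variance with the expected second moment of d_i.
    s_p = np.mean(post_mean**2 + post_var)

# Empirical-Bayes (MMSE) estimate of each displacement using the learned prior.
d_hat = (s_p / (s_p + s_n)) * x
print(f"estimated prior variance: {s_p:.3f} (true {s_p_true})")
print(f"MSE of the MMSE estimates: {np.mean((d_hat - d) ** 2):.3f}")
```

In this conjugate Gaussian model the EM fixed point coincides with the maximum-likelihood estimate of the prior variance from the marginal distribution of the data, which is what makes the learned prior a good stand-in for the genie-aided one.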
We calculate the quantum Cramér-Rao bound for the sensitivity with which one or several parameters, encoded in a general single-mode Gaussian state, can be estimated. This includes in particular the interesting case of mixed Gaussian states. We appl…
Bayesian analysis is a framework for parameter estimation that applies even in uncertainty regimes where the commonly used local (frequentist) analysis based on the Cramér-Rao bound is not well defined. In particular, it applies when no initial infor…
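The following sketch shows the kind of Bayesian workflow this refers to, with a measurement model assumed here for illustration: a flat prior over an interferometric phase (i.e., no initial information), binary outcomes with probability p(+|θ) = (1 + cos θ)/2, and a grid posterior updated shot by shot whose mean serves as the estimate.

```python
import numpy as np

rng = np.random.default_rng(2)

theta_true = 0.8
grid = np.linspace(0.0, np.pi, 1001)      # restrict to [0, pi] where cos is invertible
posterior = np.ones_like(grid)            # flat prior: no initial information

for _ in range(200):                      # 200 simulated shots
    outcome = rng.random() < (1 + np.cos(theta_true)) / 2
    lik = (1 + np.cos(grid)) / 2 if outcome else (1 - np.cos(grid)) / 2
    posterior *= lik                      # Bayes rule, shot by shot
    posterior /= np.trapz(posterior, grid)

theta_hat = np.trapz(grid * posterior, grid)   # posterior-mean estimate
print(f"posterior-mean phase estimate: {theta_hat:.3f} (true {theta_true})")
```

Unlike a local Cramér-Rao analysis, this posterior is well defined from the very first shot and automatically reports its own width as the uncertainty.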
In almost all quantum applications, one of the key steps is to verify that the fidelity of the prepared quantum state meets expectations. In this paper, we propose a new approach to this problem using machine learning techniques. Compared t…
We investigate the quantum Cramér-Rao bounds on the joint multiple-parameter estimation with the Gaussian state as a probe. We derive the explicit right logarithmic derivative and symmetric logarithmic derivative operators in such a situation. We com…
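For concreteness, a small numerical sketch (not the derivation of either abstract) of the standard quantum Fisher information expression for a single-mode Gaussian probe whose parameter enters only through the mean vector d(θ): F_Q(θ) = (∂θ d)ᵀ Σ⁻¹ (∂θ d), using the convention that the vacuum covariance is Σ = I/2 with ħ = 1. The squeezing value r = 1 below is an assumed example.

```python
import numpy as np

def qfi_mean_encoded(dd_dtheta, cov):
    """QFI when only the mean vector depends on the parameter (assumed convention: vacuum cov = I/2)."""
    dd = np.asarray(dd_dtheta, dtype=float)
    return float(dd @ np.linalg.solve(cov, dd))

r = 1.0                                                            # assumed squeezing parameter
cov_squeezed = 0.5 * np.diag([np.exp(-2 * r), np.exp(2 * r)])      # x-squeezed vacuum

dd = [1.0, 0.0]                                                    # displacement encoded along x
print("coherent probe  :", qfi_mean_encoded(dd, 0.5 * np.eye(2)))  # = 2
print("squeezed probe  :", qfi_mean_encoded(dd, cov_squeezed))     # = 2 * exp(2r)
print("QCRB (per shot) :", 1.0 / qfi_mean_encoded(dd, cov_squeezed))
```

The inverse of the quantum Fisher information is the per-shot quantum Cramér-Rao bound on the estimator variance, and the comparison shows the usual exp(2r) advantage of a squeezed probe for displacement sensing.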
Artificial neural networks map input data to output results by approximately encoding the function that relates them. This is achieved by training the network on a collection of known inputs and results, which leads to an adjustment of the neuro…
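A minimal sketch of that idea, on an assumed toy task rather than anything from the abstract: a one-hidden-layer network adjusts its weights by gradient descent on the mean squared error until it approximates y = sin(x) from known (input, result) pairs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Known inputs and results the network is trained on.
x = rng.uniform(-np.pi, np.pi, (256, 1))
y = np.sin(x)

# One hidden layer of 32 tanh units, randomly initialized.
W1 = rng.normal(0, 0.5, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)
lr = 0.05

for _ in range(3000):
    h = np.tanh(x @ W1 + b1)             # hidden activations
    y_hat = h @ W2 + b2                  # network output
    err = y_hat - y                      # prediction error
    # Backpropagate the mean-squared-error gradient through both layers.
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    # Adjust the neuron parameters along the negative gradient.
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print("final MSE:", float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2)))
```

After training, the adjusted weights encode an approximation of the input-output function, which is exactly the property the abstract builds on.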