
Measuring Quantum Entropy

Posted by Nirmal Shende
Publication date: 2017
Paper language: English





The entropy of a quantum system is a measure of its randomness, and has applications in measuring quantum entanglement. We study the problem of measuring the von Neumann entropy, $S(\rho)$, and Rényi entropy, $S_\alpha(\rho)$, of an unknown mixed quantum state $\rho$ in $d$ dimensions, given access to independent copies of $\rho$. We provide an algorithm with copy complexity $O(d^{2/\alpha})$ for estimating $S_\alpha(\rho)$ for $\alpha<1$, and copy complexity $O(d^{2})$ for estimating $S(\rho)$, and $S_\alpha(\rho)$ for non-integral $\alpha>1$. These bounds are at least quadratic in $d$, which is the order dependence on the number of copies required for learning the entire state $\rho$. For integral $\alpha>1$, on the other hand, we provide an algorithm for estimating $S_\alpha(\rho)$ with a sub-quadratic copy complexity of $O(d^{2-2/\alpha})$. We characterize the copy complexity for integral $\alpha>1$ up to constant factors by providing matching lower bounds. For other values of $\alpha$, and the von Neumann entropy, we show lower bounds on the algorithm that achieves the upper bound. This shows that we either need new algorithms for better upper bounds, or better lower bounds to tighten the results. For non-integral $\alpha$, and the von Neumann entropy, we consider the well-known Empirical Young Diagram (EYD) algorithm, which is the analogue of the empirical plug-in estimator in classical distribution estimation. As a corollary, we strengthen a lower bound on the copy complexity of the EYD algorithm for learning the maximally mixed state by showing that the lower bound holds with exponential probability (which was previously known to hold only with constant probability). For integral $\alpha>1$, we provide new concentration results for certain polynomials that arise in the Kerov algebra of Young diagrams.
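For reference, the quantities being estimated admit standard closed forms (textbook definitions, not quoted from the paper): for a density matrix $\rho$ on a $d$-dimensional Hilbert space,

$S(\rho) = -\mathrm{Tr}(\rho \log \rho), \qquad S_\alpha(\rho) = \frac{1}{1-\alpha} \log \mathrm{Tr}(\rho^{\alpha}) \quad (\alpha > 0,\ \alpha \neq 1),$

with $S_\alpha(\rho) \to S(\rho)$ as $\alpha \to 1$; both attain their maximum value $\log d$ on the maximally mixed state $\rho = I/d$.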




Read also

130 - Gilad Gour, Mark M. Wilde 2018
The von Neumann entropy of a quantum state is a central concept in physics and information theory, having a number of compelling physical interpretations. There is a certain perspective that the most fundamental notion in quantum mechanics is that of a quantum channel, as quantum states, unitary evolutions, measurements, and discarding of quantum systems can each be regarded as certain kinds of quantum channels. Thus, an important goal is to define a consistent and meaningful notion of the entropy of a quantum channel. Motivated by the fact that the entropy of a state $\rho$ can be formulated as the difference of the number of physical qubits and the relative entropy distance between $\rho$ and the maximally mixed state, here we define the entropy of a channel $\mathcal{N}$ as the difference of the number of physical qubits of the channel output with the relative entropy distance between $\mathcal{N}$ and the completely depolarizing channel. We prove that this definition satisfies all of the axioms, recently put forward in [Gour, IEEE Trans. Inf. Theory 65, 5880 (2019)], required for a channel entropy function. The task of quantum channel merging, in which the goal is for the receiver to merge his share of the channel with the environment's share, gives a compelling operational interpretation of the entropy of a channel. The entropy of a channel can be negative for certain channels, but this negativity has an operational interpretation in terms of the channel merging protocol. We define Rényi and min-entropies of a channel and prove that they satisfy the axioms required for a channel entropy function. Among other results, we also prove that a smoothed version of the min-entropy of a channel satisfies the asymptotic equipartition property.
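In symbols, the definition sketched above reads as follows (our notation, a sketch of what the abstract states, not quoted from the paper): for a channel $\mathcal{N}_{A \to B}$ with output system $B$ and the completely depolarizing channel $\mathcal{R}_{A \to B}$,

$S(\mathcal{N}) = \log |B| - D(\mathcal{N} \| \mathcal{R}),$

where $D(\cdot\|\cdot)$ is a relative entropy distance between channels, mirroring the state identity $S(\rho) = \log d - D(\rho \| I/d)$ that motivates the definition.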
Over decades, traditional information theory of source and channel coding has advanced toward learning and effective extraction of information from data. We propose to go one step further and offer a theoretical foundation for learning classical patterns from quantum data. However, there are several roadblocks to laying the groundwork for such a generalization. First, classical data must be replaced by a density operator over a Hilbert space; hence, in contrast to problems such as state tomography, our samples are i.i.d. density operators. The second challenge is even more profound, since we must realize that our only interaction with a quantum state is through a measurement, which -- due to the quantum no-cloning postulate -- loses information after measuring it. With this in mind, we present a quantum counterpart of the well-known PAC framework. Based on that, we propose a quantum analogue of the ERM algorithm for learning measurement hypothesis classes. Then, we establish upper bounds on the quantum sample complexity of quantum concept classes.
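For context, the classical baseline that this framework generalizes is the agnostic PAC guarantee for ERM (a standard classical bound, not the paper's quantum result): for a finite hypothesis class $H$,

$m = O\!\left(\frac{\log|H| + \log(1/\delta)}{\epsilon^{2}}\right)$

samples suffice for empirical risk minimization to output a hypothesis whose risk is within $\epsilon$ of the best in $H$ with probability at least $1-\delta$; the quantum sample-complexity bounds above play the analogous role when the samples are i.i.d. copies of a density operator and the hypotheses are measurements.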
82 - Christian Majenz 2018
In this Thesis, several results in quantum information theory are collected, most of which use entropy as the main mathematical tool.
* While a direct generalization of the Shannon entropy to density matrices, the von Neumann entropy behaves differently. A long-standing open question is whether there are quantum analogues of unconstrained non-Shannon type inequalities. Here, a new constrained non-von-Neumann type inequality is proven, a step towards a conjectured unconstrained inequality by Linden and Winter.
* IID quantum state merging can be optimally achieved using the decoupling technique. The one-shot results by Berta et al. and Anshu et al., however, had to bring in additional mathematical machinery. We introduce a natural generalized decoupling paradigm, catalytic decoupling, that can reproduce the aforementioned results when used analogously to the application of standard decoupling in the asymptotic case.
* Port-based teleportation, a variant of the standard quantum teleportation protocol, cannot be implemented perfectly. We prove several lower bounds on the necessary number of output ports N to achieve port-based teleportation for a given error and input dimension, showing that N diverges uniformly in the dimension of the teleported quantum system for vanishing error. As a byproduct, a new lower bound for the size of the program register of an approximate universal programmable quantum processor is derived.
* In the last part, we give a new definition of information-theoretic quantum non-malleability, strengthening the previous definition by Ambainis et al. We show that quantum non-malleability implies secrecy, analogous to quantum authentication. Furthermore, non-malleable encryption schemes can be used as a primitive to build authenticating encryption schemes. We also show that the strong notion of authentication recently proposed by Garg et al. can be fulfilled using 2-designs.
Non-Gaussian component analysis (NGCA) is a problem in multidimensional data analysis which, since its formulation in 2006, has attracted considerable attention in statistics and machine learning. In this problem, we have a random variable $X$ in $n$-dimensional Euclidean space. There is an unknown subspace $\Gamma$ of the $n$-dimensional Euclidean space such that the orthogonal projection of $X$ onto $\Gamma$ is standard multidimensional Gaussian and the orthogonal projection of $X$ onto $\Gamma^{\perp}$, the orthogonal complement of $\Gamma$, is non-Gaussian, in the sense that all its one-dimensional marginals are different from the Gaussian in a certain metric defined in terms of moments. The NGCA problem is to approximate the non-Gaussian subspace $\Gamma^{\perp}$ given samples of $X$. Vectors in $\Gamma^{\perp}$ correspond to "interesting" directions, whereas vectors in $\Gamma$ correspond to directions where the data is very noisy. The most interesting application of the NGCA model is the case when the magnitude of the noise is comparable to that of the true signal, a setting in which traditional noise-reduction techniques such as PCA don't apply directly. NGCA is also related to dimension reduction and to other data analysis problems such as ICA. NGCA-like problems have been studied in statistics for a long time using techniques such as projection pursuit. We give an algorithm that takes polynomial time in the dimension $n$ and has an inverse polynomial dependence on the error parameter measuring the angle distance between the non-Gaussian subspace and the subspace output by the algorithm. Our algorithm is based on relative entropy as the contrast function and fits under the projection pursuit framework. The techniques we develop for analyzing our algorithm may be of use for other related problems.
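As a concrete illustration of the NGCA setting (a minimal moment-based sketch, not the relative-entropy contrast algorithm described above, and one that only detects non-Gaussianity visible in fourth moments), one can whiten the data and look for directions in which the reweighted covariance $E[\|x\|^2 x x^{T}]$ deviates from its Gaussian value $(n+2)I$:

import numpy as np

def ngca_fourth_moment(X, k):
    """Toy NGCA heuristic: return k candidate directions spanning (part of)
    the non-Gaussian subspace. X is an (m, n) sample matrix; the covariance
    of X is assumed to be nonsingular."""
    X = X - X.mean(axis=0)                              # center
    evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    W = evecs @ np.diag(evals ** -0.5) @ evecs.T        # symmetric whitening matrix
    Y = X @ W                                           # whitened samples
    m, n = Y.shape
    # Reweighted covariance E[||y||^2 y y^T]: in directions belonging to the
    # Gaussian subspace it matches (n + 2) I, so large deviations flag
    # candidate non-Gaussian directions.
    M = (Y * (Y ** 2).sum(axis=1, keepdims=True)).T @ Y / m
    M -= (n + 2) * np.eye(n)
    dev_evals, dev_evecs = np.linalg.eigh(M)
    top = np.argsort(-np.abs(dev_evals))[:k]            # largest |deviation| first
    V = dev_evecs[:, top]                               # directions in whitened space
    return W @ V                                        # pulled back to original coordinates

For example, with data whose first coordinate is uniform (rescaled to unit variance) and whose remaining coordinates are standard Gaussian, the single returned direction concentrates near the first coordinate axis as the sample size grows.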
We study quantum coarse-grained entropy and demonstrate that the gap in entropy between local and global coarse-grainings is a natural generalization of entanglement entropy to mixed states and multipartite systems. This quantum correlation entropy $S^{\mathrm{QC}}$ is additive over independent systems, is invariant under local unitary operations, measures total nonclassical correlations (vanishing on states with strictly classical correlation), and reduces to the entanglement entropy for bipartite pure states. It quantifies how well a quantum system can be understood via local measurements, and ties directly to non-equilibrium thermodynamics, including representing a lower bound on the quantum part of thermodynamic entropy production. We discuss two other measures of nonclassical correlation to which this entropy is equivalent, and argue that together they provide a unique thermodynamically distinguished measure.
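The reduction to entanglement entropy mentioned above can be written out explicitly (standard notation, not quoted from the paper): for a bipartite pure state $|\psi\rangle_{AB}$ with reduced state $\rho_A = \mathrm{Tr}_B |\psi\rangle\langle\psi|_{AB}$,

$S^{\mathrm{QC}}(|\psi\rangle\langle\psi|_{AB}) = S(\rho_A) = -\mathrm{Tr}(\rho_A \log \rho_A),$

which vanishes precisely when $|\psi\rangle$ is a product state.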