The entropy of a quantum system is a measure of its randomness, and has applications in measuring quantum entanglement. We study the problem of measuring the von Neumann entropy, $S(\rho)$, and Rényi entropy, $S_\alpha(\rho)$, of an unknown mixed quantum state $\rho$ in $d$ dimensions, given access to independent copies of $\rho$. We provide an algorithm with copy complexity $O(d^{2/\alpha})$ for estimating $S_\alpha(\rho)$ for $\alpha < 1$, and copy complexity $O(d^{2})$ for estimating $S(\rho)$, and $S_\alpha(\rho)$ for non-integral $\alpha > 1$. These bounds are at least quadratic in $d$, which is the order of the dependence on the number of copies required for learning the entire state $\rho$. For integral $\alpha > 1$, on the other hand, we provide an algorithm for estimating $S_\alpha(\rho)$ with a sub-quadratic copy complexity of $O(d^{2-2/\alpha})$. We characterize the copy complexity for integral $\alpha > 1$ up to constant factors by providing matching lower bounds. For other values of $\alpha$, and for the von Neumann entropy, we show lower bounds on the algorithm that achieves the upper bound. This shows that we either need new algorithms for better upper bounds, or better lower bounds to tighten the results. For non-integral $\alpha$, and for the von Neumann entropy, we consider the well-known Empirical Young Diagram (EYD) algorithm, which is the analogue of the empirical plug-in estimator in classical distribution estimation. As a corollary, we strengthen a lower bound on the copy complexity of the EYD algorithm for learning the maximally mixed state, by showing that the lower bound holds with exponential probability (it was previously known to hold with constant probability). For integral $\alpha > 1$, we provide new concentration results for certain polynomials that arise in the Kerov algebra of Young diagrams.
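For reference, the quantities being estimated are the standard von Neumann and Rényi entropies of a density matrix $\rho$ (with the logarithm base fixed by convention, e.g. the natural logarithm):

$$S(\rho) = -\operatorname{Tr}(\rho \log \rho), \qquad S_\alpha(\rho) = \frac{1}{1-\alpha} \log \operatorname{Tr}(\rho^\alpha), \quad \alpha > 0,\ \alpha \neq 1.$$

Both are functions of the eigenvalues of $\rho$ alone, reducing to the Shannon and Rényi entropies of the eigenvalue distribution; this is one reason the classical empirical plug-in estimator has a natural quantum analogue in the EYD algorithm mentioned above.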