
On the quantum Rényi relative entropies and related capacity formulas

Added by Milán Mosonyi
 Publication date 2009
  fields Physics
Language English





We show that the quantum $\alpha$-relative entropies with parameter $\alpha\in (0,1)$ can be represented as generalized cutoff rates in the sense of [I. Csiszár, IEEE Trans. Inf. Theory 41, 26-34, (1995)], which provides a direct operational interpretation of the quantum $\alpha$-relative entropies. We also show that various generalizations of the Holevo capacity, defined in terms of the $\alpha$-relative entropies, coincide for the parameter range $\alpha\in (0,2]$, and give an upper bound on the one-shot $\epsilon$-capacity of a classical-quantum channel in terms of these capacities.
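As a concrete illustration of the quantity discussed above, the following sketch evaluates the traditional quantum $\alpha$-relative entropy, $D_\alpha(\rho\|\sigma)=\frac{1}{\alpha-1}\log\mathrm{Tr}\,\rho^\alpha\sigma^{1-\alpha}$, numerically for two qubit states. The example matrices and function names are illustrative assumptions, not taken from the paper.

import numpy as np

def mpow(A, p):
    # Fractional power of a positive semidefinite Hermitian matrix via eigendecomposition
    w, V = np.linalg.eigh(A)
    w = np.clip(w, 0.0, None)        # guard against tiny negative eigenvalues
    return (V * w**p) @ V.conj().T

def petz_renyi(rho, sigma, alpha):
    # Traditional quantum alpha-relative entropy D_alpha(rho || sigma)
    return np.log(np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)).real) / (alpha - 1)

rho   = np.array([[0.8, 0.1], [0.1, 0.2]])   # example density matrix (illustrative)
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit state

for a in (0.3, 0.5, 0.9):                    # the parameter range alpha in (0,1) from the abstract
    print(a, petz_renyi(rho, sigma, a))

As alpha tends to 1 this quantity recovers the ordinary quantum relative entropy, which is why the family interpolates between familiar distinguishability measures.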



Related research


We provide lower and upper bounds on the information transmission capacity of a single use of a classical-quantum channel. The lower bound is expressed in terms of the Hoeffding capacity, which we define analogously to the Holevo capacity, but with the relative entropy replaced by the Hoeffding distance. Similarly, our upper bound is in terms of a quantity obtained by replacing the relative entropy with the recently introduced max-relative entropy in the definition of the divergence radius of a channel.
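To make the upper-bound quantity above concrete, here is a minimal sketch of the max-relative entropy, $D_{\max}(\rho\|\sigma)=\log\min\{\lambda:\rho\le\lambda\sigma\}$, computed for full-rank states as the logarithm of the largest eigenvalue of $\sigma^{-1/2}\rho\,\sigma^{-1/2}$. The states and function names are illustrative assumptions, not from the paper.

import numpy as np

def mpow(A, p):
    # Matrix power of a positive definite Hermitian matrix via eigendecomposition
    w, V = np.linalg.eigh(A)
    return (V * w**p) @ V.conj().T

def d_max(rho, sigma):
    # Max-relative entropy: log of the largest eigenvalue of sigma^{-1/2} rho sigma^{-1/2}
    # (sigma is assumed full rank here)
    M = mpow(sigma, -0.5) @ rho @ mpow(sigma, -0.5)
    return np.log(np.linalg.eigvalsh(M).max())

rho   = np.array([[0.7, 0.2], [0.2, 0.3]])   # example state (illustrative)
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])
print(d_max(rho, sigma))                     # always >= the ordinary relative entropy D(rho||sigma)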
We show that the new quantum extension of the Rényi alpha-relative entropies, introduced recently by Müller-Lennert, Dupuis, Szehr, Fehr and Tomamichel, J. Math. Phys. 54, 122203, (2013), and Wilde, Winter, Yang, Commun. Math. Phys. 331, (2014), has an operational interpretation in the strong converse problem of quantum hypothesis testing. Together with related results for the direct part of quantum hypothesis testing, known as the quantum Hoeffding bound, our result suggests that the operationally relevant definition of the quantum Rényi relative entropies depends on the parameter alpha: for alpha<1, the right choice seems to be the traditional definition, whereas for alpha>1 the right choice is the newly introduced version. As a side result, we show that the new Rényi alpha-relative entropies are asymptotically attainable by measurements for alpha>1, and give a new simple proof of their monotonicity under completely positive trace-preserving maps.
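For reference, the newly introduced ("sandwiched") version mentioned above is $\widetilde D_\alpha(\rho\|\sigma)=\frac{1}{\alpha-1}\log\mathrm{Tr}\bigl[\bigl(\sigma^{\frac{1-\alpha}{2\alpha}}\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\bigr)^{\alpha}\bigr]$. A minimal numerical sketch follows, with illustrative full-rank qubit states that are not taken from the cited papers.

import numpy as np

def mpow(A, p):
    # Matrix power of a positive definite Hermitian matrix via eigendecomposition
    w, V = np.linalg.eigh(A)
    return (V * w**p) @ V.conj().T

def sandwiched_renyi(rho, sigma, alpha):
    # Sandwiched Renyi relative entropy:
    # (1/(alpha-1)) * log Tr[(sigma^{(1-alpha)/(2 alpha)} rho sigma^{(1-alpha)/(2 alpha)})^alpha]
    s = mpow(sigma, (1 - alpha) / (2 * alpha))
    return np.log(np.trace(mpow(s @ rho @ s, alpha)).real) / (alpha - 1)

rho   = np.array([[0.8, 0.1], [0.1, 0.2]])   # example states (illustrative)
sigma = np.array([[0.6, 0.0], [0.0, 0.4]])

for a in (1.5, 2.0, 3.0):                    # the regime alpha > 1 discussed in the abstract
    print(a, sandwiched_renyi(rho, sigma, a))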
Many of the traditional results in information theory, such as the channel coding theorem or the source coding theorem, are restricted to scenarios where the underlying resources are independent and identically distributed (i.i.d.) over a large number of uses. To overcome this limitation, two different techniques, the information spectrum method and the smooth entropy framework, have been developed independently. They are based on new entropy measures, called spectral entropy rates and smooth entropies, respectively, that generalize Shannon entropy (in the classical case) and von Neumann entropy (in the more general quantum case). Here, we show that the two techniques are closely related. More precisely, the spectral entropy rate can be seen as the asymptotic limit of the smooth entropy. Our results apply to the quantum setting and thus include the classical setting as a special case.
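The classical special case already shows the flavour of the entropy measures involved: the min-entropy is the alpha-to-infinity Rényi entropy, and an epsilon-smoothed version of it approaches the Shannon entropy per copy in the i.i.d. limit. The sketch below uses a simple clipping heuristic for the smoothing; the distribution, epsilon, and function names are illustrative assumptions, not definitions from the paper.

import numpy as np

def shannon(p):
    # Shannon entropy in bits
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def smooth_min_entropy(p, eps):
    # Crude classical epsilon-smoothing: clip the largest probabilities down to a
    # common threshold t chosen so that exactly eps of probability mass is removed;
    # the smoothed min-entropy is then -log2(t).
    q = np.sort(p)[::-1]
    csum = np.cumsum(q)
    for k in range(1, len(q) + 1):
        t = (csum[k - 1] - eps) / k
        if k == len(q) or t >= q[k]:
            return -np.log2(t)

p, eps = np.array([0.7, 0.3]), 0.01          # a biased coin (illustrative)
pn = np.array([1.0])
for n in range(1, 17):
    pn = np.kron(pn, p)                      # i.i.d. product distribution P^n
    if n in (1, 4, 8, 12, 16):
        print(n, smooth_min_entropy(pn, eps) / n, shannon(p))

The per-copy smoothed min-entropy rises from about -log2(max p) toward the Shannon entropy as n grows, which is the i.i.d. shadow of the asymptotic relation discussed above.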
Quantum Rényi relative entropies provide a one-parameter family of distances between density matrices, which generalizes the relative entropy and the fidelity. We study these measures for renormalization group flows in quantum field theory. We derive explicit expressions in free field theory based on the real-time approach. Using monotonicity properties, we obtain new inequalities that need to be satisfied by consistent renormalization group trajectories in field theory. These inequalities play the role of a second law of thermodynamics in the context of renormalization group flows. Finally, we apply these results to a tractable Kondo model, where we evaluate the Rényi relative entropies explicitly. An outcome of this is that Anderson's orthogonality catastrophe can be avoided by working on a Cauchy surface that approaches the light-cone.
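The monotonicity invoked above is data processing: a quantum Rényi relative entropy cannot increase when the same completely positive trace-preserving map is applied to both states. A small numerical check, using the traditional definition and a depolarizing channel; all matrices and parameter values are illustrative assumptions.

import numpy as np

def mpow(A, p):
    # Matrix power of a positive definite Hermitian matrix via eigendecomposition
    w, V = np.linalg.eigh(A)
    return (V * w**p) @ V.conj().T

def petz_renyi(rho, sigma, alpha):
    # Traditional quantum alpha-relative entropy, monotone under CPTP maps for alpha in (0,2]
    return np.log(np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)).real) / (alpha - 1)

def depolarize(rho, lam):
    # Depolarizing channel, a simple completely positive trace-preserving map
    d = rho.shape[0]
    return lam * rho + (1 - lam) * np.eye(d) / d

rho   = np.array([[0.9, 0.2], [0.2, 0.1]])
sigma = np.array([[0.4, 0.0], [0.0, 0.6]])
alpha, lam = 0.5, 0.6

before = petz_renyi(rho, sigma, alpha)
after  = petz_renyi(depolarize(rho, lam), depolarize(sigma, lam), alpha)
print(before, after, after <= before)        # the divergence does not increase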
Igal Sason, 2015
This paper starts by considering the minimization of the Rényi divergence subject to a constraint on the total variation distance. Based on the solution of this optimization problem, the exact locus of the points $\bigl( D(Q\|P_1), D(Q\|P_2) \bigr)$ is determined when $P_1, P_2, Q$ are arbitrary probability measures which are mutually absolutely continuous, and the total variation distance between $P_1$ and $P_2$ is not below a given value. It is further shown that all the points of this convex region are attained by probability measures which are defined on a binary alphabet. This characterization yields a geometric interpretation of the minimal Chernoff information subject to a constraint on the total variation distance. This paper also derives an exponential upper bound on the performance of binary linear block codes (or code ensembles) under maximum-likelihood decoding. Its derivation relies on the Gallager bounding technique, and it reproduces the Shulman-Feder bound as a special case. The bound is expressed in terms of the Rényi divergence from the normalized distance spectrum of the code (or the average distance spectrum of the ensemble) to the binomially distributed distance spectrum of the capacity-achieving ensemble of random block codes. This exponential bound provides a quantitative measure of the degradation in performance of binary linear block codes (or code ensembles) as a function of the deviation of their distance spectra from the binomial distribution. An efficient use of this bound is considered.
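The two classical quantities at the heart of the optimization above are easy to evaluate directly: the Rényi divergence $D_\alpha(P\|Q)=\frac{1}{\alpha-1}\log\sum_x P(x)^\alpha Q(x)^{1-\alpha}$ and the total variation distance. A short sketch on a binary alphabet, with illustrative distributions that are not taken from the paper:

import numpy as np

def renyi_div(p, q, alpha):
    # Classical Renyi divergence D_alpha(P || Q), alpha != 1
    return np.log(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1)

def total_variation(p, q):
    # Total variation distance between two probability vectors
    return 0.5 * np.abs(p - q).sum()

P1 = np.array([0.8, 0.2])                    # binary-alphabet distributions (illustrative)
P2 = np.array([0.3, 0.7])
Q  = np.array([0.6, 0.4])

print(total_variation(P1, P2))               # the constraint quantity in the optimization
for a in (0.5, 1.5, 2.0):
    print(a, renyi_div(Q, P1, a), renyi_div(Q, P2, a))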
