
How not to Rényi generalize the Quantum Conditional Mutual Information

Added by Paul Erker
Publication date: 2014
Fields: Physics
Language: English
Authors: Paul Erker





We study the relation between the quantum conditional mutual information and the quantum $\alpha$-Rényi divergences. Considering the totally antisymmetric state, we show that it is not possible to attain a proper generalization of the quantum conditional mutual information by optimizing the distance, in terms of quantum $\alpha$-Rényi divergences, over the set of all Markov states. The failure of this approach stems from the observation that a small quantum conditional mutual information does not imply that the state is close to a quantum Markov state.
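For concreteness, the quantity at stake is the quantum conditional mutual information I(A:C|B) = S(AB) + S(BC) - S(B) - S(ABC), which vanishes exactly on quantum Markov states. A minimal numerical sketch (the helper functions and example states below are our own illustration, not code from the paper):

```python
import numpy as np

def entropy(rho):
    """von Neumann entropy in bits: S(rho) = -Tr[rho log2 rho]."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                       # drop numerical zeros
    return float(-(p * np.log2(p)).sum())

def partial_trace(rho, dims, keep):
    """Reduced state on the subsystems listed in `keep` (sorted),
    for a density matrix on subsystems with local dimensions `dims`."""
    n = len(dims)
    t = rho.reshape(list(dims) * 2)
    cur = n
    for i in reversed(range(n)):           # trace out from the last subsystem
        if i not in keep:
            t = np.trace(t, axis1=i, axis2=i + cur)
            cur -= 1
    d = int(np.prod([dims[i] for i in keep]))
    return t.reshape(d, d)

def qcmi(rho, dims):
    """I(A:C|B) = S(AB) + S(BC) - S(B) - S(ABC) for a tripartite state."""
    s_ab = entropy(partial_trace(rho, dims, [0, 1]))
    s_bc = entropy(partial_trace(rho, dims, [1, 2]))
    s_b  = entropy(partial_trace(rho, dims, [1]))
    return s_ab + s_bc - s_b - entropy(rho)

# GHZ state on three qubits: I(A:C|B) = 1 bit
ghz = np.zeros(8); ghz[0] = ghz[7] = 1 / np.sqrt(2)
rho_ghz = np.outer(ghz, ghz)
print(round(qcmi(rho_ghz, [2, 2, 2]), 6))   # -> 1.0

# Fully product state: a (trivial) quantum Markov state with I(A:C|B) = 0
rho_prod = np.kron(np.kron(np.eye(2) / 2, np.eye(2) / 2), np.eye(2) / 2)
print(round(qcmi(rho_prod, [2, 2, 2]), 6))  # -> 0.0
```

The paper's point is that small values of this quantity do not guarantee proximity to the set of states on which it vanishes.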



Related research

We prove decomposition rules for quantum Rényi mutual information, generalising the relation $I(A:B) = H(A) - H(A|B)$ to inequalities between Rényi mutual information and Rényi entropy of different orders. The proof uses Beigi's generalisation of Riesz-Thorin interpolation to operator norms, and a variation of the argument employed by Dupuis that was used to show chain rules for conditional Rényi entropies. The resulting decomposition rule is then applied to establish an information exclusion relation for Rényi mutual information, generalising the original relation by Hall.
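At order α = 1 the decomposition reduces to the familiar Shannon/von Neumann identity I(A:B) = H(A) - H(A|B). A quick classical sanity check of that base case (the joint distribution is an illustrative choice of ours):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Joint distribution p(a, b): B is a noisy copy of A (flipped with prob 0.1)
p_ab = np.array([[0.45, 0.05],
                 [0.05, 0.45]])

H_A  = H(p_ab.sum(axis=1))    # marginal entropy H(A)
H_B  = H(p_ab.sum(axis=0))    # marginal entropy H(B)
H_AB = H(p_ab)                # joint entropy H(A,B)

H_A_given_B = H_AB - H_B      # chain rule H(A|B) = H(A,B) - H(B)
I_AB = H_A - H_A_given_B      # decomposition I(A:B) = H(A) - H(A|B)

# agrees with the symmetric form H(A) + H(B) - H(A,B)
print(abs(I_AB - (H_A + H_B - H_AB)) < 1e-12)   # -> True
print(round(I_AB, 3))                            # ≈ 0.531 bits
```

The paper's contribution is that for Rényi orders other than 1 this equality fails, but can be replaced by inequalities between quantities of different orders.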
In this paper, we study measures of quantum non-Markovianity based on the conditional mutual information. We obtain such measures by considering multiple parts of the total environment, so that the conditional mutual information can be defined in this multipartite setup. The benefit of this approach is that the conditional mutual information is closely related to recovery maps and Markov chains; we also point out its relation to changes in distinguishability. Along the way we study the properties of the leaked information, namely the conditional mutual information that can flow back, and we use this leaked information to show that a correlated environment is necessary for nonlocal memory effects.
One way to diagnose chaos in bipartite unitary channels is via the tripartite information of the corresponding Choi state, which for certain choices of the subsystems reduces to the negative conditional mutual information (CMI). We study this quantity from a quantum information-theoretic perspective to clarify its role in diagnosing scrambling. When the CMI is zero, we find that the channel has a special normal form consisting of local channels between individual inputs and outputs. However, we find that arbitrarily low CMI does not imply arbitrary proximity to a channel of this form, although it does imply a type of approximate recoverability of one of the inputs. When the CMI is maximal, we find that the residual channel from an individual input to an individual output is completely depolarizing when the other input is maximally mixed. However, we again find that this result is not robust. We also extend some of these results to the multipartite case and to the case of Haar-random pure input states. Finally, we look at the relationship between tripartite information and its Rényi-2 version, which is directly related to out-of-time-order correlation functions. In particular, we demonstrate an arbitrarily large gap between the two quantities.
Recently a new quantum generalization of the Rényi divergence and the corresponding conditional Rényi entropies was proposed. Here we report on a surprising relation between conditional Rényi entropies based on this new generalization and conditional Rényi entropies based on the quantum relative Rényi entropy used in previous literature. Our result generalizes the well-known duality relation $H(A|B) + H(A|C) = 0$ of the conditional von Neumann entropy for tripartite pure states to Rényi entropies of two different kinds. As a direct application, we prove a collection of inequalities that relate different conditional Rényi entropies and derive a new entropic uncertainty relation.
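The von Neumann duality H(A|B) + H(A|C) = 0 that this result generalizes can be verified numerically on a random tripartite pure state. The reduced-state construction below via `einsum` is our own sketch, not code from the paper:

```python
import numpy as np

def S(rho):
    """von Neumann entropy in bits."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

# Random pure state on three qubits A, B, C
rng = np.random.default_rng(0)
psi = rng.normal(size=8) + 1j * rng.normal(size=8)
psi /= np.linalg.norm(psi)
t = psi.reshape(2, 2, 2)                      # tensor indices a, b, c

# Reduced density matrices via index contraction
rho_AB = np.einsum('abc,dec->abde', t, t.conj()).reshape(4, 4)
rho_AC = np.einsum('abc,dbe->acde', t, t.conj()).reshape(4, 4)
rho_B  = np.einsum('abc,adc->bd',  t, t.conj())
rho_C  = np.einsum('abc,abd->cd',  t, t.conj())

H_A_given_B = S(rho_AB) - S(rho_B)            # H(A|B) = S(AB) - S(B)
H_A_given_C = S(rho_AC) - S(rho_C)            # H(A|C) = S(AC) - S(C)

# For a pure state, S(AB) = S(C) and S(AC) = S(B), so the sum vanishes
print(abs(H_A_given_B + H_A_given_C) < 1e-9)  # -> True
```

For mixed states, and for Rényi orders other than 1, the corresponding duality relates entropies of two different Rényi kinds, which is the content of the result above.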
Many of the traditional results in information theory, such as the channel coding theorem or the source coding theorem, are restricted to scenarios where the underlying resources are independent and identically distributed (i.i.d.) over a large number of uses. To overcome this limitation, two different techniques, the information spectrum method and the smooth entropy framework, have been developed independently. They are based on new entropy measures, called spectral entropy rates and smooth entropies, respectively, that generalize Shannon entropy (in the classical case) and von Neumann entropy (in the more general quantum case). Here, we show that the two techniques are closely related. More precisely, the spectral entropy rate can be seen as the asymptotic limit of the smooth entropy. Our results apply to the quantum setting and thus include the classical setting as a special case.
