
Optimized quantum f-divergences

Added by: Mark Wilde
Publication date: 2021
Language: English
Authors: Mark M. Wilde





The quantum relative entropy is a measure of the distinguishability of two quantum states, and it is a unifying concept in quantum information theory: many information measures, such as entropy, conditional entropy, mutual information, and entanglement measures, can be realized from it. As such, there has been broad interest in generalizing the notion to further understand its most basic properties, one of which is the data processing inequality. The quantum f-divergence of Petz is one generalization of the quantum relative entropy, and it also leads to other relative entropies, such as the Petz–Renyi relative entropies. In this contribution, I introduce the optimized quantum f-divergence as a related generalization of the quantum relative entropy. I prove that it satisfies the data processing inequality, and the method of proof relies upon the operator Jensen inequality, similar to Petz's original approach. Interestingly, the sandwiched Renyi relative entropies are particular examples of the optimized f-divergence. Thus, one benefit of this approach is that there is now a single, unified method for establishing the data processing inequality for both the Petz–Renyi and sandwiched Renyi relative entropies, for the full range of parameters for which it is known to hold.
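For reference, the divergences named in the abstract have the following standard definitions for density operators $\rho$ and $\sigma$ (the notation here is mine, not taken from the paper):

```latex
% Quantum relative entropy
D(\rho\|\sigma) = \operatorname{Tr}\!\left[\rho\left(\log\rho - \log\sigma\right)\right]

% Petz--Renyi relative entropy (data processing holds for \alpha \in (0,1)\cup(1,2])
D_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\log\operatorname{Tr}\!\left[\rho^\alpha \sigma^{1-\alpha}\right]

% Sandwiched Renyi relative entropy (data processing holds for \alpha \in [1/2,1)\cup(1,\infty))
\widetilde{D}_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\log\operatorname{Tr}\!\left[\left(\sigma^{\frac{1-\alpha}{2\alpha}}\,\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\right)^{\!\alpha}\right]
```

Both Renyi families recover the quantum relative entropy in the limit $\alpha \to 1$.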



Related research

F. Hiai, M. Mosonyi, D. Petz (2010)
Quantum f-divergences are a quantum generalization of the classical notion of f-divergences, and are a special case of Petz's quasi-entropies. Many well-known distinguishability measures of quantum states are given by, or derived from, f-divergences; special examples include the quantum relative entropy, the Renyi relative entropies, and the Chernoff and Hoeffding measures. Here we show that the quantum f-divergences are monotonic under the dual of Schwarz maps whenever the defining function is operator convex. This extends and unifies all previously known monotonicity results. We also analyze the case where the monotonicity inequality holds with equality, and extend Petz's reversibility theorem to a large class of f-divergences and other distinguishability measures. We apply our findings to the problem of quantum error correction, and show that if a stochastic map preserves the pairwise distinguishability on a set of states, as measured by a suitable f-divergence, then its action can be reversed on that set by another stochastic map that can be constructed from the original one in a canonical way. We also provide an integral representation for operator convex functions on the positive half-line, which is the main ingredient in extending previously known results on the monotonicity inequality and the case of equality. We also consider some special cases where the convexity of f is sufficient for the monotonicity, and obtain the inverse Hölder inequality for operators as an application. The presentation is completely self-contained and requires only standard knowledge of matrix analysis.
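The classical counterpart of the monotonicity result above is easy to verify numerically: for a convex f, applying the same stochastic map to two probability distributions can never increase the f-divergence between them. A minimal sketch using f(t) = t log t (the KL divergence); the distributions and the stochastic matrix here are illustrative choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)

def kl(p, q):
    """Classical f-divergence with f(t) = t*log(t), i.e. the KL divergence."""
    return float(np.sum(p * np.log(p / q)))

# Two full-support distributions on four outcomes.
p = np.array([0.4, 0.3, 0.2, 0.1])
q = np.array([0.25, 0.25, 0.25, 0.25])

# A stochastic map: each column of T is a conditional distribution T[y|x],
# so T @ p is the output distribution after processing.
T = rng.random((3, 4))
T /= T.sum(axis=0)  # make the columns sum to one (column-stochastic)

# Data processing inequality: KL(Tp || Tq) <= KL(p || q).
print(kl(p, q), kl(T @ p, T @ q))
assert kl(T @ p, T @ q) <= kl(p, q) + 1e-12
```

The quantum statement in the abstract strengthens this: monotonicity holds under duals of Schwarz maps provided f is operator convex, a strictly stronger requirement than ordinary convexity.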
Fumio Hiai (2018)
As a continuation of the paper [20] on standard $f$-divergences, we make a systematic study of maximal $f$-divergences in general von Neumann algebras. For maximal $f$-divergences, apart from their definition based on Haagerup's $L^1$-space, we present the general integral expression and the variational expression in terms of reverse tests. From this definition and these expressions we prove important properties of maximal $f$-divergences, for instance, the monotonicity inequality, the joint convexity, the lower semicontinuity, and the martingale convergence. The inequality between the standard and the maximal $f$-divergences is also given.
Fumio Hiai (2018)
We make a systematic study of standard $f$-divergences in general von Neumann algebras. An important ingredient of our study is to extend Kosaki's variational expression of the relative entropy to an arbitrary standard $f$-divergence, from which most of the important properties of standard $f$-divergences follow immediately. In a similar manner we give a comprehensive exposition of the Renyi divergence in von Neumann algebras. Some results on relative Hamiltonians formerly studied by Araki and Donald are improved as a by-product.
The estimation of an f-divergence between two probability distributions based on samples is a fundamental problem in statistics and machine learning. Most works study this problem under very weak assumptions, in which case it is provably hard. We consider the case of stronger structural assumptions that are commonly satisfied in modern machine learning, including representation learning and generative modelling with autoencoder architectures. Under these assumptions we propose and study an estimator that can be easily implemented, works well in high dimensions, and enjoys faster rates of convergence. We verify the behavior of our estimator empirically in both synthetic and real-data experiments, and discuss its direct implications for total correlation, entropy, and mutual information estimation.
We show that the new quantum extension of Renyi's alpha-relative entropies, introduced recently by Müller-Lennert, Dupuis, Szehr, Fehr and Tomamichel, J. Math. Phys. 54, 122203 (2013), and Wilde, Winter, Yang, Commun. Math. Phys. 331 (2014), has an operational interpretation in the strong converse problem of quantum hypothesis testing. Together with related results for the direct part of quantum hypothesis testing, known as the quantum Hoeffding bound, our result suggests that the operationally relevant definition of the quantum Renyi relative entropies depends on the parameter alpha: for alpha<1, the right choice seems to be the traditional definition, whereas for alpha>1 the right choice is the newly introduced version. As a side result, we show that the new Renyi alpha-relative entropies are asymptotically attainable by measurements for alpha>1, and give a new simple proof of their monotonicity under completely positive trace-preserving maps.
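The monotonicity of the sandwiched Renyi relative entropies under completely positive trace-preserving maps can be checked numerically on small examples. A minimal sketch, assuming the standard sandwiched definition and a qubit depolarizing channel; the specific states and channel parameter are illustrative choices of mine, not from the papers cited above:

```python
import numpy as np

def mpow(A, p):
    """Fractional power of a Hermitian PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    w = np.clip(w, 0.0, None)
    return (V * w**p) @ V.conj().T

def sandwiched_renyi(rho, sigma, alpha):
    """Sandwiched Renyi relative entropy (assumes sigma has full rank)."""
    s = mpow(sigma, (1.0 - alpha) / (2.0 * alpha))
    return np.log(np.trace(mpow(s @ rho @ s, alpha)).real) / (alpha - 1.0)

def depolarize(rho, lam):
    """Qubit depolarizing channel, a simple CPTP map."""
    d = rho.shape[0]
    return (1.0 - lam) * rho + lam * np.trace(rho).real * np.eye(d) / d

# Two qubit density matrices (Hermitian, PSD, unit trace).
rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.5, 0.1], [0.1, 0.5]])

# Data processing: the divergence cannot increase under the channel,
# for every alpha in [1/2, 1) and (1, infinity).
for alpha in (0.5, 2.0, 3.0):
    before = sandwiched_renyi(rho, sigma, alpha)
    after = sandwiched_renyi(depolarize(rho, 0.3), depolarize(sigma, 0.3), alpha)
    print(alpha, before, after)
    assert after <= before + 1e-12
```

At alpha = 1/2 the quantity computed here equals minus twice the log-fidelity, which connects this check to the hypothesis-testing interpretation discussed in the abstract.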