The quantum relative entropy is a measure of the distinguishability of two quantum states, and it is a unifying concept in quantum information theory: many information measures such as entropy, conditional entropy, mutual information, and entanglement measures can be realized from it. As such, there has been broad interest in generalizing the notion to further understand its most basic properties, one of which is the data processing inequality. The quantum f-divergence of Petz is one generalization of the quantum relative entropy, and it also leads to other relative entropies, such as the Petz--Rényi relative entropies. In this contribution, I introduce the optimized quantum f-divergence as a related generalization of the quantum relative entropy. I prove that it satisfies the data processing inequality, and the method of proof relies upon the operator Jensen inequality, similar to Petz's original approach. Interestingly, the sandwiched Rényi relative entropies are particular examples of the optimized f-divergence. Thus, one benefit of this approach is that there is now a single, unified method for establishing the data processing inequality for both the Petz--Rényi and the sandwiched Rényi relative entropies, for the full range of parameters for which it is known to hold.
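Not part of the abstract: a minimal numerical sketch of the data processing claim for the sandwiched Rényi relative entropy, assuming finite-dimensional density matrices, the standard formula $\widetilde{D}_\alpha(\rho\|\sigma)=\frac{1}{\alpha-1}\log\mathrm{Tr}[(\sigma^{(1-\alpha)/2\alpha}\rho\,\sigma^{(1-\alpha)/2\alpha})^\alpha]$, and a partial trace as the channel; all helper names are illustrative.

```python
# Sketch: check data processing for the sandwiched Renyi relative entropy
# under partial trace, using random full-rank two-qubit density matrices.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def random_density_matrix(d, rng):
    """Sample a random full-rank density matrix of dimension d."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def sandwiched_renyi(rho, sigma, alpha):
    """Sandwiched Renyi relative entropy D_alpha(rho || sigma), alpha != 1."""
    s = mpow(sigma, (1 - alpha) / (2 * alpha))
    inner = s @ rho @ s
    return np.log(np.trace(mpow(inner, alpha)).real) / (alpha - 1)

def partial_trace_second_qubit(rho):
    """Trace out the second qubit of a two-qubit state (a CPTP map)."""
    return np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

rng = np.random.default_rng(0)
rho, sigma = random_density_matrix(4, rng), random_density_matrix(4, rng)
for alpha in (0.6, 2.0, 5.0):  # data processing is known for alpha >= 1/2
    before = sandwiched_renyi(rho, sigma, alpha)
    after = sandwiched_renyi(partial_trace_second_qubit(rho),
                             partial_trace_second_qubit(sigma), alpha)
    assert after <= before + 1e-9, (alpha, before, after)
    print(f"alpha={alpha}: D = {before:.4f} -> {after:.4f} (non-increasing)")
```

Since the partial trace is completely positive and trace preserving, the inequality should hold, up to numerical error, for every sampled pair of states.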
Quantum f-divergences are a quantum generalization of the classical notion of f-divergences, and are a special case of Petz quasi-entropies. Many well-known distinguishability measures of quantum states are given by, or derived from, f-divergences; [...]
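For context, and not stated explicitly in this snippet, the definitions being generalized can be sketched as follows, assuming finite dimensions, an invertible second argument, and a convex function $f$ with $f(1)=0$:
\[
  D_f(p \,\|\, q) \;=\; \sum_{x} q(x)\, f\!\left(\frac{p(x)}{q(x)}\right),
  \qquad
  D_f(\rho \,\|\, \sigma) \;=\; \big\langle \sigma^{1/2},\, f\!\big(L_\rho R_\sigma^{-1}\big)\big(\sigma^{1/2}\big) \big\rangle_{\mathrm{HS}},
\]
where $L_\rho(X)=\rho X$, $R_\sigma(X)=X\sigma$, and $\langle A,B\rangle_{\mathrm{HS}}=\mathrm{Tr}[A^\dagger B]$; the second expression is the finite-dimensional form of the Petz quasi-entropy with trivial intertwining operator.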
As a continuation of the paper [20] on standard $f$-divergences, we make a systematic study of maximal $f$-divergences in general von Neumann algebras. For maximal $f$-divergences, apart from their definition based on Haagerup's $L^1$-space, we present [...]
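As a point of reference, not from this abstract: in finite dimensions and for invertible $\sigma$, the maximal $f$-divergence (in the sense of Matsumoto) reduces to the closed form
\[
  \widehat{D}_f(\rho \,\|\, \sigma) \;=\; \mathrm{Tr}\!\left[\sigma^{1/2}\, f\!\left(\sigma^{-1/2}\rho\,\sigma^{-1/2}\right) \sigma^{1/2}\right],
\]
which recovers, e.g., the Belavkin--Staszewski relative entropy for $f(t)=t\log t$; the definition via Haagerup's $L^1$-space extends this beyond finite dimensions.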
We make a systematic study of standard $f$-divergences in general von Neumann algebras. An important ingredient of our study is to extend Kosaki's variational expression of the relative entropy to an arbitrary standard $f$-divergence, from which most of [...]
The estimation of an f-divergence between two probability distributions based on samples is a fundamental problem in statistics and machine learning. Most works study this problem under very weak assumptions, in which case it is provably hard. We con[...]
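A minimal sketch of the naive plug-in approach that such works typically compare against (a generic baseline under assumptions, not the method of this truncated abstract; the function names are illustrative):

```python
# Plug-in estimator of D_f(p||q) = sum_x q(x) f(p(x)/q(x)) over a finite
# alphabet, using empirical frequencies from i.i.d. samples.
import numpy as np

def plugin_f_divergence(samples_p, samples_q, f, alphabet_size):
    """Estimate D_f(p||q) by plugging empirical distributions into D_f."""
    p_hat = np.bincount(samples_p, minlength=alphabet_size) / len(samples_p)
    q_hat = np.bincount(samples_q, minlength=alphabet_size) / len(samples_q)
    mask = q_hat > 0  # avoid division by zero on unseen symbols
    return float(np.sum(q_hat[mask] * f(p_hat[mask] / q_hat[mask])))

# Example: KL divergence corresponds to f(t) = t*log(t), with 0*log(0) = 0.
kl = lambda t: np.where(t > 0, t * np.log(np.where(t > 0, t, 1.0)), 0.0)
rng = np.random.default_rng(1)
p, q = np.array([0.7, 0.2, 0.1]), np.array([0.4, 0.4, 0.2])
xs = rng.choice(3, size=100_000, p=p)
ys = rng.choice(3, size=100_000, p=q)
print("plug-in KL estimate:", plugin_f_divergence(xs, ys, kl, 3))
print("true KL:", float(np.sum(q * kl(p / q))))
```

On a fixed finite alphabet this estimator is consistent; the hardness results alluded to above concern weaker assumptions, e.g., large or continuous alphabets.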
We show that the new quantum extension of Rényi's alpha-relative entropies, introduced recently by Müller-Lennert, Dupuis, Szehr, Fehr, and Tomamichel, J. Math. Phys. 54, 122203 (2013), and Wilde, Winter, and Yang, Commun. Math. Phys. 331 (2014), have an [...]
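For reference, the quantum extension in question is the sandwiched Rényi relative entropy, which for density matrices with $\sigma$ invertible and $\alpha\in(0,1)\cup(1,\infty)$ reads
\[
  \widetilde{D}_\alpha(\rho \,\|\, \sigma) \;=\; \frac{1}{\alpha-1}\,
  \log \mathrm{Tr}\!\left[\left(\sigma^{\frac{1-\alpha}{2\alpha}}\, \rho\,
  \sigma^{\frac{1-\alpha}{2\alpha}}\right)^{\!\alpha}\right],
\]
with the data processing inequality known to hold for $\alpha\in[1/2,1)\cup(1,\infty)$.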