As a continuation of the paper [20] on standard $f$-divergences, we make a systematic study of maximal $f$-divergences in general von Neumann algebras. For maximal $f$-divergences, apart from their definition based on Haagerup's $L^1$-space, we present the general integral expression and the variational expression in terms of reverse tests. From this definition and these expressions we prove important properties of maximal $f$-divergences, for instance, the monotonicity inequality, the joint convexity, the lower semicontinuity, and the martingale convergence. The inequality between the standard and the maximal $f$-divergences is also given.
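For orientation only, a minimal finite-dimensional sketch (not the paper's Haagerup $L^1$-space definition, and with illustrative notation): for density matrices $\rho,\sigma$ with $\sigma$ invertible and $f$ convex on $(0,\infty)$, the maximal $f$-divergence can be written as
$$\widehat{D}_f(\rho\|\sigma)=\operatorname{Tr}\big[\sigma^{1/2}\,f\big(\sigma^{-1/2}\rho\,\sigma^{-1/2}\big)\,\sigma^{1/2}\big],$$
and, under suitable assumptions on $f$, the inequality mentioned above takes the form $S_f(\rho\|\sigma)\le\widehat{D}_f(\rho\|\sigma)$, where $S_f$ denotes the standard $f$-divergence.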
We make a systematic study of standard $f$-divergences in general von Neumann algebras. An important ingredient of our study is to extend Kosaki's variational expression of the relative entropy to an arbitrary standard $f$-divergence, from which most of …
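As a hedged reference point, in the finite-dimensional (matrix) special case the standard $f$-divergence, i.e. Petz's quasi-entropy, can be written with the relative modular operator $\Delta_{\rho,\sigma}(X)=\rho X\sigma^{-1}$ as
$$S_f(\rho\|\sigma)=\operatorname{Tr}\big[\sigma^{1/2}\,f(\Delta_{\rho,\sigma})(\sigma^{1/2})\big],$$
so that $f(t)=t\log t$ recovers the relative entropy $D(\rho\|\sigma)=\operatorname{Tr}\rho(\log\rho-\log\sigma)$; roughly speaking, the general von Neumann algebra definition replaces $\Delta_{\rho,\sigma}$ by the relative modular operator of the corresponding states.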
Quantum $f$-divergences are a quantum generalization of the classical notion of $f$-divergences, and are a special case of Petz's quasi-entropies. Many well-known distinguishability measures of quantum states are given by, or derived from, $f$-divergences; …
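For comparison, the classical notion being generalized is, for probability distributions $p,q$ on a finite set with $q>0$ and $f$ convex with $f(1)=0$,
$$D_f(p\|q)=\sum_x q(x)\,f\!\Big(\frac{p(x)}{q(x)}\Big),$$
which gives the Kullback-Leibler divergence for $f(t)=t\log t$ and the total variation distance for $f(t)=\tfrac12|t-1|$.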
A lemma stated by Ke Li in [arXiv:1208.1400] has been used in, e.g., [arXiv:1510.04682, arXiv:1706.04590, arXiv:1612.01464, arXiv:1308.6503, arXiv:1602.08898] for various tasks in quantum hypothesis testing, data compression with quantum side information, …
The quantum relative entropy is a measure of the distinguishability of two quantum states, and it is a unifying concept in quantum information theory: many information measures such as entropy, conditional entropy, mutual information, and entanglement …
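To illustrate this unifying role, two standard finite-dimensional identities (stated here only as a reminder, with $D(\rho\|\sigma)=\operatorname{Tr}\rho(\log\rho-\log\sigma)$) are
$$I(A;B)_\rho=D\big(\rho_{AB}\,\big\|\,\rho_A\otimes\rho_B\big),\qquad S(A|B)_\rho=-D\big(\rho_{AB}\,\big\|\,\mathbb{1}_A\otimes\rho_B\big),$$
and the relative entropy of entanglement is obtained by minimizing $D(\rho\|\sigma)$ over separable states $\sigma$.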
We develop a rigorous and general framework for constructing information-theoretic divergences that subsume both $f$-divergences and integral probability metrics (IPMs), such as the $1$-Wasserstein distance. We prove under which assumptions these divergences …
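As a reminder of the classical definitions involved (not specific to the framework of this paper): the integral probability metric associated with a function class $\mathcal{G}$ is
$$d_{\mathcal{G}}(\mu,\nu)=\sup_{g\in\mathcal{G}}\Big|\int g\,d\mu-\int g\,d\nu\Big|,$$
and the choice $\mathcal{G}=\{g:\|g\|_{\mathrm{Lip}}\le 1\}$ yields the $1$-Wasserstein distance by Kantorovich-Rubinstein duality.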