
Recovery map stability for the Data Processing Inequality

Added by Eric Carlen
Publication date: 2017
Fields: Physics
Language: English





The Data Processing Inequality (DPI) says that the Umegaki relative entropy $S(\rho\|\sigma) := {\rm Tr}[\rho(\log \rho - \log \sigma)]$ is non-increasing under the action of completely positive trace preserving (CPTP) maps. Let ${\mathcal M}$ be a finite dimensional von Neumann algebra and ${\mathcal N}$ a von Neumann subalgebra of it. Let ${\mathcal E}_\tau$ be the tracial conditional expectation from ${\mathcal M}$ onto ${\mathcal N}$. For density matrices $\rho$ and $\sigma$ in ${\mathcal M}$, let $\rho_{\mathcal N} := {\mathcal E}_\tau \rho$ and $\sigma_{\mathcal N} := {\mathcal E}_\tau \sigma$. Since ${\mathcal E}_\tau$ is CPTP, the DPI says that $S(\rho\|\sigma) \geq S(\rho_{\mathcal N}\|\sigma_{\mathcal N})$, and the general case is readily deduced from this. A theorem of Petz says that there is equality if and only if $\sigma = {\mathcal R}_\rho(\sigma_{\mathcal N})$, where ${\mathcal R}_\rho$ is the Petz recovery map, which is dual to the Accardi-Cecchini coarse graining operator ${\mathcal A}_\rho$ from ${\mathcal M}$ to ${\mathcal N}$. In its simplest form, our bound is $$S(\rho\|\sigma) - S(\rho_{\mathcal N}\|\sigma_{\mathcal N}) \geq \left(\frac{1}{8\pi}\right)^{4} \|\Delta_{\sigma,\rho}\|^{-2} \, \|{\mathcal R}_{\rho}(\sigma_{\mathcal N}) - \sigma\|_1^4,$$ where $\Delta_{\sigma,\rho}$ is the relative modular operator. We also prove related results for various quasi-relative entropies. Explicitly describing the solution set of the Petz equation $\sigma = {\mathcal R}_\rho(\sigma_{\mathcal N})$ amounts to determining the set of fixed points of the Accardi-Cecchini coarse graining map. Building on previous work, we provide a thoroughly detailed description of the set of solutions of the Petz equation, and obtain all of our results in a simple, self-contained manner.
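As a quick numerical illustration of the DPI statement above (a sketch, not from the paper: it uses the pinching map onto the diagonal matrices as a concrete tracial conditional expectation onto an abelian subalgebra, and random full-rank density matrices):

```python
import numpy as np
from scipy.linalg import logm

def rand_density(n, rng):
    """Random full-rank density matrix: A A* plus a small shift, unit trace."""
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    m = a @ a.conj().T + 1e-6 * np.eye(n)  # shift keeps the log well defined
    return m / np.trace(m).real

def rel_entropy(rho, sigma):
    """Umegaki relative entropy S(rho||sigma) = Tr[rho (log rho - log sigma)]."""
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

def pinch(rho):
    """Conditional expectation onto the diagonal subalgebra; it is CPTP."""
    return np.diag(np.diag(rho))

rng = np.random.default_rng(0)
rho, sigma = rand_density(4, rng), rand_density(4, rng)
full = rel_entropy(rho, sigma)
coarse = rel_entropy(pinch(rho), pinch(sigma))
assert full >= coarse - 1e-8  # DPI: entropy is non-increasing under E_tau
```

For the diagonal subalgebra, $\rho_{\mathcal N}$ and $\sigma_{\mathcal N}$ are classical probability vectors, so the right-hand side reduces to the classical Kullback-Leibler divergence.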



Related research

In this work, we provide a strengthening of the data processing inequality for the relative entropy introduced by Belavkin and Staszewski (BS-entropy). This extends previous results by Carlen and Vershynina for the relative entropy and other standard $f$-divergences. To this end, we provide two new equivalent conditions for the equality case of the data processing inequality for the BS-entropy. Subsequently, we extend our result to a larger class of maximal $f$-divergences. Here, we first focus on quantum channels which are conditional expectations onto subalgebras and use the Stinespring dilation to lift our results to arbitrary quantum channels.
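The BS-entropy mentioned here is $S_{BS}(\rho\|\sigma) = {\rm Tr}[\rho \log(\rho^{1/2}\sigma^{-1}\rho^{1/2})]$, and it obeys its own data processing inequality while dominating the Umegaki relative entropy. A minimal numerical sketch of both facts (an illustration, not the authors' method; the pinching onto diagonal matrices again stands in for a conditional expectation onto a subalgebra):

```python
import numpy as np
from scipy.linalg import logm, sqrtm

def rand_density(n, rng):
    """Random full-rank density matrix with unit trace."""
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    m = a @ a.conj().T + 1e-6 * np.eye(n)
    return m / np.trace(m).real

def umegaki(rho, sigma):
    """Standard relative entropy Tr[rho (log rho - log sigma)]."""
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

def bs_entropy(rho, sigma):
    """BS-entropy Tr[rho log(rho^{1/2} sigma^{-1} rho^{1/2})]."""
    r = sqrtm(rho)
    return np.trace(rho @ logm(r @ np.linalg.inv(sigma) @ r)).real

def pinch(rho):
    """CPTP conditional expectation onto the diagonal subalgebra."""
    return np.diag(np.diag(rho))

rng = np.random.default_rng(1)
rho, sigma = rand_density(3, rng), rand_density(3, rng)

# Data processing inequality for the BS-entropy under the pinching map
assert bs_entropy(rho, sigma) >= bs_entropy(pinch(rho), pinch(sigma)) - 1e-8
# The BS-entropy dominates the Umegaki relative entropy
assert bs_entropy(rho, sigma) >= umegaki(rho, sigma) - 1e-8
```

On commuting (e.g. diagonal) pairs the two entropies coincide, which is why equality conditions for the BS-entropy DPI are a genuinely noncommutative question.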
Several works have shown that perturbation stable instances of the MAP inference problem in Potts models can be solved exactly using a natural linear programming (LP) relaxation. However, most of these works give few (or no) guarantees for the LP solutions on instances that do not satisfy the relatively strict perturbation stability definitions. In this work, we go beyond these stability results by showing that the LP approximately recovers the MAP solution of a stable instance even after the instance is corrupted by noise. This noisy stable model realistically fits with practical MAP inference problems: we design an algorithm for finding close stable instances, and show that several real-world instances from computer vision have nearby instances that are perturbation stable. These results suggest a new theoretical explanation for the excellent performance of this LP relaxation in practice.
We study the symmetrized noncommutative arithmetic-geometric mean (AGM) inequality introduced by Recht and Ré: $$\left\|\frac{(n-d)!}{n!}\sum_{j_1,\ldots,j_d \text{ different}} A_{j_1}^* A_{j_2}^* \cdots A_{j_d}^* A_{j_d} \cdots A_{j_2} A_{j_1}\right\| \leq C(d,n)\, \left\|\frac{1}{n}\sum_{j=1}^n A_j^* A_j\right\|^d.$$ Complementing the results of Recht and Ré, we find upper bounds for $C(d,n)$ under additional assumptions. Moreover, using free probability, we show that $C(d,n) > 1$, thereby disproving the most optimistic conjecture of Recht and Ré. We also prove a deviation result for the symmetrized AGM inequality which shows that the symmetric inequality almost holds for many classes of random matrices. Finally, we apply our results to the incremental gradient method (IGM).
In a recent work, Moslehian and Rajić have shown that the Grüss inequality holds for unital $n$-positive linear maps $\phi:\mathcal{A} \rightarrow B(H)$, where $\mathcal{A}$ is a unital C*-algebra and $H$ is a Hilbert space, if $n \geq 3$. They also demonstrate that the inequality fails to hold, in general, if $n = 1$, and ask whether the inequality holds if $n = 2$. In this article, we provide an affirmative answer to this question.
Two new proofs of the Fisher information inequality (FII) using data processing inequalities for mutual information and conditional variance are presented.
