Dual to the usual noisy channel coding problem, where a noisy (classical or quantum) channel is used to simulate a noiseless one, reverse Shannon theorems concern the use of noiseless channels to simulate noisy ones, and more generally the use of one noisy channel to simulate another. For channels of nonzero capacity, this simulation is always possible, but for it to be efficient, auxiliary resources of the proper kind and amount are generally required. In the classical case, shared randomness between sender and receiver is a sufficient auxiliary resource, regardless of the nature of the source, but in the quantum case the requisite auxiliary resources for efficient simulation depend on both the channel being simulated and the source from which the channel inputs are coming. For tensor power sources (the quantum generalization of classical IID sources), entanglement in the form of standard ebits (maximally entangled pairs of qubits) is sufficient, but for general sources, which may be arbitrarily correlated or entangled across channel inputs, additional resources, such as entanglement-embezzling states or backward communication, are generally needed. Combining existing and new results, we establish the amounts of communication and auxiliary resources needed in both the classical and quantum cases, the tradeoffs among them, and the loss of simulation efficiency when auxiliary resources are absent or insufficient. In particular, we find a new single-letter expression for the excess forward communication cost of coherent feedback simulations of quantum channels (i.e., simulations in which the sender retains what would escape into the environment in an ordinary simulation) on non-tensor-power sources in the presence of unlimited ebits but no other auxiliary resource. Our results on tensor power sources establish a strong converse to the entanglement-assisted capacity theorem.
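For orientation (this formula is standard background, not drawn from the abstract itself), the entanglement-assisted classical capacity to which the strong converse applies has the single-letter form

\[ C_E(\mathcal{N}) \;=\; \max_{\rho} \Big[ S(\rho) + S(\mathcal{N}(\rho)) - S\big((\mathcal{N}\otimes I)(\phi_\rho)\big) \Big], \]

where \phi_\rho is a purification of the input state \rho and S is the von Neumann entropy; the bracketed quantity is the quantum mutual information between the channel output and the retained purifying system. A strong converse sharpens the capacity theorem: at any rate above C_E(\mathcal{N}), the success probability of communication tends to zero, rather than merely staying bounded away from one.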
Donoho and Stark have shown that precise deterministic recovery of missing information contained in a time interval shorter than the time-frequency uncertainty limit is possible. We analyze this signal recovery mechanism from a physics point of view.
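The uncertainty limit in question is, in its standard discrete Donoho–Stark form (stated here for context, not quoted from the abstract), the bound

\[ N_t \, N_\omega \;\ge\; N \]

for a nonzero signal of length N with N_t nonzero samples in time and N_\omega nonzero samples in frequency; recovery of missing samples is possible precisely because a sufficiently band-limited signal cannot also be too concentrated in time.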
This is the 10th and final chapter of my book on Quantum Information, based on the course I have been teaching at Caltech since 1997. An earlier version of this chapter (originally Chapter 5) has been available on the course website since 1998.
The general idea of information entropy provided by C.E. Shannon hangs over everything we do and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found.
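The quantity being applied is the Shannon entropy of a distribution p (the standard definition, restated here for concreteness):

\[ H(p) \;=\; -\sum_{x} p(x) \log_2 p(x), \]

measured in bits; the modeling work in any given application lies in identifying which distribution p connects to the quantities of interest.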
A rigorous general definition of quantum probability is given, which is valid for elementary events and for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables.
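As a point of reference only (the paper's own construction may differ), the generalized-measurement formalism assigns outcome probabilities through a positive operator-valued measure \{E_a\}:

\[ p(a) \;=\; \mathrm{Tr}(\rho\, E_a), \qquad E_a \ge 0, \qquad \sum_a E_a = \mathbb{1}, \]

which already accommodates inconclusive outcomes (an effect reserved for "no result") and places no commutativity requirement on the E_a.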
Feature selection, in the context of machine learning, is the process of separating highly predictive features from those that might be irrelevant or redundant. Information theory has been recognized as a useful concept for this task.
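A minimal sketch of the information-theoretic ranking idea, assuming discrete (or discretized) features; the helper names and toy data below are illustrative, not taken from the paper:

import numpy as np

def entropy(labels):
    # Empirical Shannon entropy of a discrete array, in bits.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples.
    joint = np.array([f"{a}|{b}" for a, b in zip(x, y)])
    return entropy(x) + entropy(y) - entropy(joint)

# Toy data: 4 samples, 3 candidate features, binary target.
X = np.array([[0, 1, 1],
              [1, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
y = np.array([0, 1, 0, 1])

# Score each feature by the information it carries about the target,
# then rank with the most predictive feature first.
scores = [mutual_information(X[:, j], y) for j in range(X.shape[1])]
ranking = np.argsort(scores)[::-1]
print(scores, ranking)

Relevance to the target is only half the problem; redundancy between features is typically handled in this literature by also penalizing feature-feature mutual information, as in criteria such as mRMR.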