The minimum rate needed to accurately approximate a product distribution based on an unnormalized informational divergence is shown to be a mutual information. This result subsumes results of Wyner on common information and Han-Verdú on resolvability. The result also extends to cases where the source distribution is unknown but the entropy is known.
Rényi divergence is related to Rényi entropy much as Kullback-Leibler divergence is related to Shannon's entropy, and it arises in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence.
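As an illustrative sketch (not from the paper itself), the Rényi divergence of order α for discrete distributions can be computed directly from its definition, with the α → 1 limit recovering the Kullback-Leibler divergence; the function name `renyi_divergence` is our own choice.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) in nats, for discrete P, Q with q > 0.

    For alpha != 1: D_alpha = (1/(alpha-1)) * log sum_x p(x)^alpha q(x)^(1-alpha).
    At alpha = 1 it reduces to the Kullback-Leibler divergence.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        # alpha -> 1 limit: the Kullback-Leibler divergence sum p log(p/q)
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))
```

A quick sanity check of the known monotonicity in α: for `p = [0.5, 0.5]` and `q = [0.25, 0.75]`, the values at α = 0.5, 1, and 2 are nondecreasing, and the value near α = 1 approaches the KL divergence.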
This paper develops systematic approaches to obtain $f$-divergence inequalities, dealing with pairs of probability measures defined on arbitrary alphabets. Functional domination is one such approach, where special emphasis is placed on finding the be
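As a minimal illustration of the objects involved (an assumption-labeled sketch, not the paper's method), an f-divergence for discrete distributions is D_f(P||Q) = Σ_x q(x) f(p(x)/q(x)) for convex f with f(1) = 0; different choices of f yield the Kullback-Leibler divergence and the total variation distance, and an inequality such as Pinsker's relates the two. The function name `f_divergence` is illustrative.

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)), for convex f with f(1) = 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# Two standard generators:
kl = lambda t: t * np.log(t)          # yields the Kullback-Leibler divergence
tv = lambda t: 0.5 * np.abs(t - 1.0)  # yields the total variation distance
```

For example, with `p = [0.5, 0.5]` and `q = [0.25, 0.75]`, the resulting KL and total-variation values satisfy the Pinsker-type domination D(P||Q) ≥ 2·TV(P,Q)², one instance of the kind of f-divergence inequality the abstract refers to.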
We study the impact of delayed channel state information at the transmitters (CSIT) in two-unicast wireless networks with a layered topology and arbitrary connectivity. We introduce a technique to obtain outer bounds to the degrees-of-freedom (DoF) r
For gambling on horses, a one-parameter family of utility functions is proposed, which contains Kelly's logarithmic criterion and the expected-return criterion as special cases. The strategies that maximize the utility function are derived, and the co
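As a sketch of the logarithmic (Kelly) criterion mentioned above, assuming the full wealth is reinvested each race and there is no track take: the expected log-wealth growth rate for a betting fraction b on each horse is Σ_i p_i log(b_i · o_i), and it is maximized by proportional betting b = p, independently of the odds. The function name `kelly_growth_rate` is our own.

```python
import numpy as np

def kelly_growth_rate(p, odds, b):
    """Expected log-wealth growth rate per race.

    p[i]    -- win probability of horse i
    odds[i] -- payoff odds (o-for-1) on horse i
    b[i]    -- fraction of wealth bet on horse i (fractions sum to 1)
    """
    p = np.asarray(p, dtype=float)
    odds = np.asarray(odds, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.sum(p * np.log(b * odds)))
```

For instance, with `p = [0.6, 0.4]` and odds `[2, 3]`, betting `b = p` achieves a higher growth rate than nearby alternatives such as `[0.7, 0.3]` or `[0.5, 0.5]`, illustrating that the log-optimal strategy bets in proportion to the win probabilities.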
This paper proposes highly accurate closed-form approximations to the channel distributions of two different reconfigurable intelligent surface (RIS)-based wireless system setups, namely, the dual-hop RIS-aided (RIS-DH) scheme and the RIS-aided transmit (RIS-T)