
Explicit Bounds for Entropy Concentration under Linear Constraints

Posted by: Kostas N. Oikonomou
Publication date: 2011
Research field: Informatics Engineering
Paper language: English





Consider the set of all sequences of $n$ outcomes, each taking one of $m$ values, that satisfy a number of linear constraints. If $m$ is fixed while $n$ increases, most sequences that satisfy the constraints result in frequency vectors whose entropy approaches that of the maximum entropy vector satisfying the constraints. This well-known entropy concentration phenomenon underlies the maximum entropy method. Existing proofs of the concentration phenomenon are based on limits or asymptotics and unrealistically assume that constraints hold precisely, supporting maximum entropy inference more in principle than in practice. We present, for the first time, non-asymptotic, explicit lower bounds on $n$ for a number of variants of the concentration result to hold to any prescribed accuracies, with the constraints holding up to any specified tolerance, taking into account the fact that allocations of discrete units can satisfy constraints only approximately. Again unlike earlier results, we measure concentration not by deviation from the maximum entropy value, but by the $\ell_1$ and $\ell_2$ distances from the maximum-entropy-achieving frequency vector. One of our results holds independently of the alphabet size $m$ and is based on a novel proof technique using the multi-dimensional Berry-Esseen theorem. We illustrate and compare our results using various detailed examples.
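To make the statement concrete, the following minimal numerical sketch (an illustration only, not the paper's construction; the alphabet size $m=4$, the mean-value constraint, the tolerance, and $n$ are arbitrary choices) computes the maximum entropy distribution under a single linear constraint and then checks that uniformly drawn length-$n$ sequences satisfying the constraint up to the tolerance have frequency vectors close to it in $\ell_1$ distance. Increasing $n$ in the sketch shrinks the observed distances, which is the qualitative behaviour that the explicit lower bounds on $n$ quantify.

    # Illustration (not from the paper): entropy concentration under a mean constraint.
    # m = 4 outcomes {0,1,2,3}, empirical mean constrained to ~1.4 within a tolerance.
    import numpy as np
    from scipy.optimize import minimize

    m, mu, tol, n = 4, 1.4, 0.05, 500
    vals = np.arange(m)

    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)
        return np.sum(p * np.log(p))

    cons = [{'type': 'eq', 'fun': lambda p: p.sum() - 1.0},
            {'type': 'eq', 'fun': lambda p: p @ vals - mu}]
    res = minimize(neg_entropy, np.full(m, 1.0 / m), method='SLSQP',
                   bounds=[(0.0, 1.0)] * m, constraints=cons)
    p_star = res.x                                # maximum-entropy frequency vector

    rng = np.random.default_rng(0)
    dists = []
    while len(dists) < 200:                       # collect 200 constrained sequences
        seq = rng.integers(0, m, size=n)          # uniform over all m**n sequences
        if abs(seq.mean() - mu) <= tol:           # linear constraint holds up to tol
            freq = np.bincount(seq, minlength=m) / n
            dists.append(np.abs(freq - p_star).sum())

    print("maximum-entropy vector:", np.round(p_star, 3))
    print("mean l1 distance of accepted frequency vectors:", np.mean(dists))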




Read also

We study minimization of a parametric family of relative entropies, termed relative $\alpha$-entropies (denoted $\mathscr{I}_{\alpha}(P,Q)$). These arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, these relative $\alpha$-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. Minimization of $\mathscr{I}_{\alpha}(P,Q)$ over the first argument on a set of probability distributions that constitutes a linear family is studied. Such a minimization generalizes the maximum Rényi or Tsallis entropy principle. The minimizing probability distribution (termed the $\mathscr{I}_{\alpha}$-projection) for a linear family is shown to have a power-law form.
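For orientation, the $\alpha=1$ member of this family is the ordinary relative entropy, whose minimizer over a linear family $\{P : E_P[T]=t\}$ is an exponential tilting of $Q$; the paper's $\mathscr{I}_{\alpha}$-projection instead takes a power-law form. The sketch below (an illustrative reference case only; the distribution $Q$, statistic $T$, and moment target $t$ are arbitrary) computes this $\alpha=1$ projection by bisection on the tilting parameter.

    # Illustration (alpha = 1 reference case, not the paper's I_alpha-projection):
    # the relative-entropy projection of Q onto {P : E_P[T] = t} is the exponential
    # tilting P_lam(i) proportional to Q(i) * exp(lam * T(i)); find lam by bisection.
    import numpy as np

    Q = np.array([0.4, 0.3, 0.2, 0.1])   # reference distribution (arbitrary example)
    T = np.array([0.0, 1.0, 2.0, 3.0])   # statistic defining the linear family
    t = 1.5                              # target moment E_P[T] = t

    def tilted(lam):
        w = Q * np.exp(lam * T)
        p = w / w.sum()
        return p @ T, p

    lo, hi = -20.0, 20.0                 # bracket for the tilting parameter
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        mean, p = tilted(mid)
        if mean < t:                     # tilted mean is increasing in lam
            lo = mid
        else:
            hi = mid

    print("projection onto {E_P[T]=1.5}:", np.round(p, 4))
    print("KL(P*||Q):", float(np.sum(p * np.log(p / Q))))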
We derive a lower bound on the smallest output entropy that can be achieved via vector quantization of a $d$-dimensional source with given expected $r$th-power distortion. Specialized to the one-dimensional case, and in the limit of vanishing distortion, this lower bound converges to the output entropy achieved by a uniform quantizer, thereby recovering the result by Gish and Pierce that uniform quantizers are asymptotically optimal as the allowed distortion tends to zero. Our lower bound holds for all $d$-dimensional memoryless sources having finite differential entropy and whose integer part has finite entropy. In contrast to Gish and Pierce, we do not require any additional constraints on the continuity or decay of the source probability density function. For one-dimensional sources, the derivation of the lower bound reveals a necessary condition for a sequence of quantizers to be asymptotically optimal as the allowed distortion tends to zero. This condition implies that any sequence of asymptotically-optimal almost-regular quantizers must converge to a uniform quantizer as the allowed distortion tends to zero.
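The one-dimensional, vanishing-distortion behaviour recalled above can be checked numerically: for a uniform quantizer with small cell width $\Delta$, the output entropy is approximately $h(X)-\log\Delta$ nats. The sketch below (a high-resolution illustration with a Gaussian source, not the paper's $d$-dimensional bound) compares the empirical output entropy with that approximation.

    # Illustration: for a uniform quantizer with small cell width delta applied to a
    # Gaussian source, the output entropy is close to h(X) - log(delta) in nats,
    # the high-resolution behaviour behind the Gish-Pierce result cited above.
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.standard_normal(1_000_000)           # N(0,1) source samples
    delta = 0.05                                 # quantizer cell width
    _, counts = np.unique(np.floor(x / delta), return_counts=True)
    p = counts / counts.sum()
    H_out = -np.sum(p * np.log(p))               # empirical output entropy (nats)
    h_X = 0.5 * np.log(2 * np.pi * np.e)         # differential entropy of N(0,1)
    print("H(Q(X)) =", round(H_out, 3), " vs  h(X) - log(delta) =", round(h_X - np.log(delta), 3))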
The performance of integer-forcing equalization for communication over the compound multiple-input multiple-output channel is investigated. An upper bound on the resulting outage probability as a function of the gap to capacity has been derived previously, assuming a random precoding matrix drawn from the circular unitary ensemble is applied prior to transmission. In the present work a simple and explicit lower bound on the worst-case outage probability is derived for the case of a system with two transmit antennas and two or more receive antennas, leveraging the properties of the Jacobi ensemble. The derived lower bound is also extended to random space-time precoding, and may serve as a useful benchmark for assessing the relative merits of various algebraic space-time precoding schemes. We further show that the lower bound may be adapted to the case of a $1 \times N_t$ system. As an application of this, we derive closed-form bounds for the symmetric-rate capacity of the Rayleigh fading multiple-access channel where all terminals are equipped with a single antenna. Lastly, we demonstrate that the integer-forcing equalization coupled with distributed space-time coding is able to approach these bounds.
In this paper, we revisit the problem of finding the longest systematic length $k$ for a linear minimum storage regenerating (MSR) code with optimal repair of only the systematic part, for a given per-node storage capacity $l$ and an arbitrary number of parity nodes $r$. We study the problem by following a geometric analysis of linear subspaces and operators. First, a simple quadratic bound is given, which implies that $k=r+2$ is the largest number of systematic nodes in the scalar scenario. Second, an $r$-based-log bound is derived, which is superior to the upper bound on log-base $2$ in the prior work. Finally, an explicit upper bound depending on the value of $\frac{r^2}{l}$ is introduced, which further extends the corresponding result in the literature.
Or Ordentlich, 2016
Recently, Samorodnitsky proved a strengthened version of Mrs. Gerber's Lemma, where the output entropy of a binary symmetric channel is bounded in terms of the average entropy of the input projected on a random subset of coordinates. Here, this result is applied to derive novel lower bounds on the entropy rate of binary hidden Markov processes. For symmetric underlying Markov processes, our bound improves upon the best known bound in the very noisy regime. The nonsymmetric case is also considered, and explicit bounds are derived for Markov processes that satisfy the $(1,\infty)$-RLL constraint.
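For reference, the classical conditional form of Mrs. Gerber's Lemma states that $H(X\oplus Z\mid U)\ge h\big(h^{-1}(H(X\mid U))\star p\big)$ for a BSC($p$), where $\star$ denotes binary convolution, $a\star b = a(1-b)+b(1-a)$, and $h$ is the binary entropy function. The sketch below evaluates both sides for an arbitrary two-component mixture; it illustrates the classical lemma only, not Samorodnitsky's strengthening or the hidden-Markov bounds derived in the paper.

    # Illustration of the classical (conditional) Mrs. Gerber's Lemma for a BSC(p):
    # H(X + Z | U) >= h( h^{-1}(H(X|U)) * p ), with * the binary convolution and
    # h the binary entropy in bits. The mixture below is an arbitrary example.
    import numpy as np
    from scipy.optimize import brentq

    def h(q):                        # binary entropy (bits)
        q = np.clip(q, 1e-12, 1 - 1e-12)
        return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

    def h_inv(v):                    # inverse of h on [0, 1/2]
        return brentq(lambda q: h(q) - v, 1e-12, 0.5) if v < 1.0 else 0.5

    def conv(a, b):                  # binary convolution a * b
        return a * (1 - b) + b * (1 - a)

    p = 0.11                         # BSC crossover probability
    q = np.array([0.05, 0.40])       # X | U=u ~ Bernoulli(q[u]), U uniform on {0,1}
    H_in_cond = np.mean(h(q))                      # H(X|U)
    H_out_cond = np.mean(h(conv(q, p)))            # exact H(X+Z|U)
    mgl_bound = h(conv(h_inv(H_in_cond), p))       # Mrs. Gerber's Lemma lower bound
    print("H(X+Z|U) =", round(H_out_cond, 4), ">= MGL bound =", round(mgl_bound, 4))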