Recently, Samorodnitsky proved a strengthened version of Mrs. Gerber's Lemma, in which the output entropy of a binary symmetric channel is bounded in terms of the average entropy of the input projected onto a random subset of coordinates. Here, this result is applied to derive novel lower bounds on the entropy rate of binary hidden Markov processes. For symmetric underlying Markov processes, our bound improves upon the best known bound in the very noisy regime. The nonsymmetric case is also considered, and explicit bounds are derived for Markov processes that satisfy the $(1,\infty)$-RLL constraint.
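For context, the classical Mrs. Gerber's Lemma of Wyner and Ziv (the statement being strengthened) can be written as follows; this display is included as background and is not taken from the abstract. Here $h$ is the binary entropy function, $h^{-1}$ its inverse on $[0,1/2]$, and $a \ast p = a(1-p) + (1-a)p$ denotes binary convolution:
\[
\frac{1}{n}\, H(X^n \oplus Z^n) \;\ge\; h\!\left( h^{-1}\!\left( \frac{H(X^n)}{n} \right) \ast p \right),
\]
where $X^n$ is an arbitrary binary random vector and $Z^n$ is i.i.d. $\mathrm{Bernoulli}(p)$ noise independent of $X^n$.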
In this paper, an approximating sequence for the entropy rate of certain binary hidden Markov models is shown to be sandwiched between a lower bound sequence and an upper bound sequence. The approximation error is bounded by a geometric sequence.
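To illustrate why such geometric bounds pin down the entropy rate, suppose (hypothetically; the constants $C$ and $\rho$ are not specified in the abstract) the two sequences satisfy $L_n \le H \le U_n$ with $U_n - L_n \le C\rho^n$ for some $\rho \in (0,1)$. Then the midpoint approximation obeys
\[
\left| H - \tfrac{1}{2}\,(L_n + U_n) \right| \;\le\; \tfrac{1}{2}\,(U_n - L_n) \;\le\; \tfrac{C}{2}\,\rho^n,
\]
so the approximation error decays geometrically in $n$.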
While two hidden Markov process (HMP), resp. quantum random walk (QRW), parametrizations can differ from one another, the stochastic processes arising from them can be equivalent. Here, a polynomial-time algorithm is presented which can determine equivalence of two given parametrizations.
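As background (not from the abstract itself), an HMP parametrization is commonly given by an initial distribution $\pi$ together with substochastic matrices $M_a$, one per output symbol $a$, so that word probabilities factor as matrix products:
\[
P(a_1 a_2 \cdots a_k) \;=\; \pi^{\top} M_{a_1} M_{a_2} \cdots M_{a_k} \mathbf{1}.
\]
Two parametrizations $(\pi, \{M_a\})$ and $(\pi', \{M'_a\})$ are equivalent precisely when $P(w) = P'(w)$ for every finite word $w$; the algorithmic question is to decide this equality efficiently.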
In this work, novel upper and lower bounds for the capacity of channels with arbitrary constraints on the support of the channel input symbols are derived. As an immediate practical application, the case of multiple-input multiple-output channels with amplitude-constrained inputs is considered.
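One standard route to capacity upper bounds of this kind (stated here as general background, not as the paper's specific method) is the duality bound: for any auxiliary output distribution $Q$ and any admissible input support $\mathcal{X}$,
\[
C \;\le\; \max_{x \in \mathcal{X}} D\bigl( W(\cdot \mid x) \,\big\|\, Q \bigr),
\]
where $W$ is the channel law and $D(\cdot\,\|\,\cdot)$ is relative entropy. The bound holds for every $Q$ and is tight when $Q$ is the capacity-achieving output distribution.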
Batch codes are a useful notion of locality for error-correcting codes, originally introduced in the context of distributed storage and cryptography. Many constructions of batch codes have been given, but few lower bound (limitation) results are known.
This paper provides fundamental limits on the sample complexity of estimating dictionaries for tensor data. The specific focus of this work is on $K$th-order tensor data and the case where the underlying dictionary can be expressed in terms of $K$ smaller dictionaries.
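A common way to formalize such a structured dictionary (given here as a plausible reading of the abstract, with the symbols $\mathbf{D}_k$, $\mathbf{x}$, $\mathbf{w}$ introduced for illustration) is the Kronecker product model
\[
\mathbf{y} \;=\; \bigl( \mathbf{D}_1 \otimes \mathbf{D}_2 \otimes \cdots \otimes \mathbf{D}_K \bigr)\, \mathbf{x} \;+\; \mathbf{w},
\]
where $\mathbf{y}$ is the vectorized $K$th-order tensor observation, each $\mathbf{D}_k$ is a smaller dictionary acting along one tensor mode, $\mathbf{x}$ is a sparse coefficient vector, and $\mathbf{w}$ is noise.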