While the channel capacity reflects a theoretical upper bound on the achievable information transmission rate in the limit of infinitely many bits, it does not characterise the information transfer of a given encoding routine with finitely many bits. In this note, we characterise the quality of a code (i.e. a given encoding routine) by an upper bound on the expected minimum error probability that can be achieved when using this code. We show that for equientropic channels this upper bound is minimal for codes with maximal marginal entropy. As an instructive example we show for the additive white Gaussian noise (AWGN) channel that random coding, which is also a capacity-achieving code, indeed maximises the marginal entropy in the limit of infinitely many messages.
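As a hedged illustration (our own sketch, not the authors' construction), the following Monte Carlo snippet estimates the block error probability of random coding over the AWGN channel: a Gaussian codebook is drawn, a uniformly chosen codeword is transmitted, and the receiver performs minimum-distance (maximum-likelihood) decoding. All names and parameters (`n`, `M`, `snr`, `trials`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_coding_error_rate(n=64, M=256, snr=2.0, trials=2000):
    """Estimate the block error probability of a random Gaussian
    codebook on the AWGN channel under ML decoding."""
    # Codebook: M codewords of length n, i.i.d. Gaussian with power snr.
    codebook = rng.normal(0.0, np.sqrt(snr), size=(M, n))
    errors = 0
    for _ in range(trials):
        m = rng.integers(M)                        # uniform message
        y = codebook[m] + rng.normal(0.0, 1.0, n)  # unit-variance AWGN
        # For equiprobable messages, ML decoding = nearest codeword.
        m_hat = np.argmin(np.sum((codebook - y) ** 2, axis=1))
        errors += (m_hat != m)
    return errors / trials

print(random_coding_error_rate())
```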
The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information ($\Phi$) in the brain is related to the level of consciousness.
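The full IIT $\Phi$ is combinatorially expensive to evaluate; the toy sketch below (our own simplification, not the paper's measure) computes a common "whole minus sum of parts" proxy: the mutual information between successive states of a two-node binary system, minus the sum of the per-node mutual informations. The cross-coupled dynamics and the noise level `eps` are assumptions for illustration.

```python
import numpy as np
from itertools import product

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint distribution array p(x, y)."""
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])))

def phi_proxy(eps=0.1):
    """'Whole minus sum of parts' proxy for integrated information in a
    two-node binary system where each node copies the OTHER node's
    previous state, flipped with probability eps."""
    # Joint distribution over (x1, x2, y1, y2): past state uniform,
    # y1 a noisy copy of x2, y2 a noisy copy of x1.
    p = np.zeros((2, 2, 2, 2))
    for x1, x2, y1, y2 in product((0, 1), repeat=4):
        p[x1, x2, y1, y2] = 0.25 \
            * (1 - eps if y1 == x2 else eps) \
            * (1 - eps if y2 == x1 else eps)
    whole = mutual_information(p.reshape(4, 4))     # I(X1,X2 ; Y1,Y2)
    part1 = mutual_information(p.sum(axis=(1, 3)))  # I(X1 ; Y1)
    part2 = mutual_information(p.sum(axis=(0, 2)))  # I(X2 ; Y2)
    return whole - (part1 + part2)

print(phi_proxy())  # positive: the coupling is invisible to the parts
```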
Place cells in the hippocampus are active when an animal visits a certain location (referred to as a place field) within an environment. Grid cells in the medial entorhinal cortex (MEC) respond at multiple locations, with firing fields that form a periodic hexagonal lattice spanning the environment.
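A standard idealisation (our illustration, not necessarily the paper's model) writes a grid cell's firing rate as a sum of three plane waves whose wave vectors are 60 degrees apart, which produces the hexagonal lattice of firing fields. The grid `spacing` and `phase` parameters below are assumed for the sketch.

```python
import numpy as np

def grid_rate_map(extent=1.0, n=200, spacing=0.3, phase=(0.0, 0.0)):
    """Idealised grid-cell rate map: sum of three cosines at 60-degree
    offsets, yielding a hexagonal lattice of firing fields."""
    xs = np.linspace(0, extent, n)
    X, Y = np.meshgrid(xs, xs)
    k = 4 * np.pi / (np.sqrt(3) * spacing)  # wavenumber for this spacing
    rate = np.zeros_like(X)
    for theta in (0, np.pi / 3, 2 * np.pi / 3):
        kx, ky = k * np.cos(theta), k * np.sin(theta)
        rate += np.cos(kx * (X - phase[0]) + ky * (Y - phase[1]))
    return np.maximum(rate, 0)  # rectify: firing rates are non-negative

rmap = grid_rate_map()
print(rmap.shape, rmap.max())
```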
The problem of estimating an arbitrary random vector from its observation corrupted by additive white Gaussian noise, where the cost function is taken to be the Minimum Mean $p$-th Error (MMPE), is considered. The classical Minimum Mean Square Error (MMSE) is the special case of $p = 2$.
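For intuition (a sketch under our own assumptions, not the paper's setting), take a scalar binary input $X \in \{-1, +1\}$ observed through AWGN. The MMPE estimate minimises the posterior expectation of $|X - \hat{x}|^p$; for $p = 2$ this recovers the posterior mean (MMSE), while for $p = 1$ it yields the posterior median. The `snr` parameter and the numerical minimisation are illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def mmpe_estimate(y, p=2.0, snr=1.0):
    """Minimum Mean p-th Error estimate of X in {-1, +1} (equiprobable)
    observed as Y = sqrt(snr) * X + N(0, 1)."""
    # Posterior P(X = +1 | y) via Bayes' rule with Gaussian likelihoods.
    s = np.sqrt(snr)
    w_plus = np.exp(-0.5 * (y - s) ** 2)
    w_minus = np.exp(-0.5 * (y + s) ** 2)
    q = w_plus / (w_plus + w_minus)
    # Minimise the posterior p-th error over the candidate estimate.
    cost = lambda x: q * abs(1 - x) ** p + (1 - q) * abs(-1 - x) ** p
    return minimize_scalar(cost, bounds=(-1, 1), method="bounded").x

y = 0.4
print(mmpe_estimate(y, p=2.0))  # p = 2: posterior mean (classical MMSE)
print(mmpe_estimate(y, p=1.0))  # p = 1: posterior median
```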
Coprime arrays enable Direction-of-Arrival (DoA) estimation of an increased number of sources. To that end, the receiver estimates the autocorrelation matrix of a larger virtual uniform linear array (coarray) by applying selection or averaging to the autocorrelation estimates of the physical array.
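As an illustrative sketch (array parameters and the toy single-source autocorrelation are our own assumptions), the snippet below builds a coprime array, forms its difference coarray, and performs the averaging step that maps physical-array correlation entries to virtual-ULA lags.

```python
import numpy as np

def coprime_positions(M=3, N=5):
    """Sensor positions (in half-wavelength units) of a coprime array:
    M sensors at spacing N combined with N sensors at spacing M."""
    return np.unique(np.concatenate([N * np.arange(M), M * np.arange(N)]))

def coarray_autocorrelation(R, pos):
    """Map the physical autocorrelation matrix R to virtual-ULA lags by
    averaging all entries sharing the same position difference (the
    'averaging' coarray construction)."""
    diffs = pos[:, None] - pos[None, :]
    lags = np.arange(diffs.min(), diffs.max() + 1)
    r = np.full(lags.shape, np.nan, dtype=complex)
    for i, lag in enumerate(lags):
        mask = diffs == lag
        if mask.any():
            r[i] = R[mask].mean()  # average redundant estimates of this lag
    return lags, r

pos = coprime_positions()
# Toy autocorrelation: one source at electrical angle mu plus unit noise.
mu, snr = 0.3, 10.0
a = np.exp(1j * np.pi * mu * pos)
R = snr * np.outer(a, a.conj()) + np.eye(len(pos))
lags, r = coarray_autocorrelation(R, pos)
print(pos)                 # physical sensor positions
print(lags[~np.isnan(r)])  # virtual lags actually available
```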
We consider geometrical optimization problems related to optimizing the error probability in the presence of Gaussian noise. One famous question in the field is the weak simplex conjecture. We discuss possible approaches to it, and state related conjectures.
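As a hedged companion to the conjecture (our own Monte Carlo sketch, not a result from the paper), one can numerically estimate the maximum-likelihood error probability of the regular simplex codebook in Gaussian noise; the noise level and trial count below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def simplex_codebook(M):
    """M unit-norm vectors forming a regular simplex
    (pairwise inner products -1/(M-1))."""
    # Centre the standard basis of R^M and renormalise; the result
    # spans an (M-1)-dimensional subspace.
    E = np.eye(M)
    C = E - E.mean(axis=0)
    return C / np.linalg.norm(C, axis=1, keepdims=True)

def error_probability(codebook, sigma=0.5, trials=200_000):
    """Monte Carlo estimate of the ML (nearest-point) error probability
    for equiprobable codewords in i.i.d. Gaussian noise."""
    M, n = codebook.shape
    m = rng.integers(M, size=trials)
    y = codebook[m] + sigma * rng.normal(size=(trials, n))
    # With equal-norm codewords, ML decoding = maximum correlation.
    m_hat = np.argmax(y @ codebook.T, axis=1)
    return np.mean(m_hat != m)

print(error_probability(simplex_codebook(4)))
```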