
Information Theory as a Means of Determining the Main Factors Affecting the Processors Architecture

Added by Anton Rakitsky
Publication date: 2020
Language: English





In this article, we investigate the development of computers over the past decades in order to identify the factors that have influenced it the most. We describe these factors and use them to predict the direction of further development. To do this, we use the concept of Computer Capacity, which allows us to estimate the performance of a computer theoretically, relying only on a description of its architecture.
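The Computer Capacity referred to in the abstract is closely related to Shannon's capacity of a noiseless channel whose symbols have unequal durations: the capacity is $\log_2 x_0$, where $x_0$ is the largest real root of $\sum_i x^{-t_i} = 1$ and the $t_i$ are the symbol (here, instruction) durations. A minimal sketch of that computation, using a purely hypothetical toy instruction set (the durations below are made up for illustration):

```python
from math import log2

def capacity(durations, tol=1e-12):
    """Capacity log2(x0), where x0 is the largest real root of
    sum(x**-t for t in durations) = 1. Since f(x) = sum(x**-t)
    is strictly decreasing for x > 1, bisection suffices."""
    f = lambda x: sum(x ** -t for t in durations)
    lo, hi = 1.0, 2.0
    while f(hi) > 1.0:          # grow the bracket until f(hi) <= 1
        hi *= 2.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 1.0:
            lo = mid
        else:
            hi = mid
    return log2((lo + hi) / 2)

# Hypothetical instruction set: four instructions taking 1, 2, 2, 3 cycles.
print(capacity([1, 2, 2, 3]))
```

As a sanity check, an "instruction set" of $n$ instructions that each take one cycle gives capacity $\log_2 n$, matching the ordinary noiseless-channel formula.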



Related research

Guessing Random Additive Noise Decoding (GRAND) is a recently proposed approximate Maximum Likelihood (ML) decoding technique that can decode any linear error-correcting block code. Ordered Reliability Bits GRAND (ORBGRAND) is a powerful variant of GRAND, which outperforms the original GRAND technique by generating error patterns in a specific order. Moreover, its simplicity at the algorithm level renders the GRAND family a desirable candidate for applications that demand very high throughput. This work reports the first-ever hardware architecture for ORBGRAND, which achieves an average throughput of up to $42.5$ Gbps for a code length of $128$ at an SNR of $10$ dB. Moreover, the proposed hardware can be used to decode any code, provided the length and rate constraints are satisfied. Compared to the state-of-the-art fast dynamic successive cancellation flip decoder (Fast-DSCF) using a 5G polar $(128,105)$ code, the proposed VLSI implementation has $49\times$ higher average throughput while maintaining similar decoding performance.
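To illustrate the guessing principle behind the GRAND family (not the ORBGRAND hardware described above): subtract candidate error patterns from the received word, in order of decreasing likelihood, until the parity checks pass. The sketch below orders patterns by increasing Hamming weight, as in plain GRAND; ORBGRAND would instead order them by the logistic weight of the sorted bit reliabilities. The $(7,4)$ Hamming code used here is only an example, not a code from the paper.

```python
import numpy as np
from itertools import combinations

def grand_decode(y, H, max_weight=3):
    """Plain GRAND over a binary symmetric channel: try error patterns
    in order of increasing Hamming weight and return the first
    candidate whose syndrome is zero (i.e., a valid codeword)."""
    n = H.shape[1]
    for w in range(max_weight + 1):
        for idx in combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(idx)] = 1
            c = (y + e) % 2
            if not (H @ c % 2).any():   # zero syndrome -> codeword found
                return c
    return None                          # abandon guessing

# Parity-check matrix of a (7,4) Hamming code (one common form).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
c = np.zeros(7, dtype=int)     # transmit the all-zero codeword
y = c.copy(); y[2] ^= 1        # channel flips one bit
print(grand_decode(y, H))      # recovers the all-zero codeword
```

Because the loop enumerates patterns from most to least likely, the first codeword it hits is (for a BSC) the ML decision, which is what makes the scheme code-agnostic: only the syndrome check depends on the code.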
Peter D. Grünwald (2008)
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining 'information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are fundamentally different. We indicate how recent developments within the theory allow one to formally distinguish between 'structural' (meaningful) and 'random' information as measured by the Kolmogorov structure function, which leads to a mathematical formalization of Occam's razor in inductive inference. We end by discussing some of the philosophical implications of the theory.
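Kolmogorov complexity itself is uncomputable, but the output length of any lossless compressor gives a computable upper bound on it (up to an additive constant depending on the compressor), which is how the notion is often illustrated in practice. A minimal sketch of that idea:

```python
import os
import zlib

def K_upper(s: bytes) -> int:
    """Length of a zlib-compressed encoding of s: a crude, computable
    upper bound on the Kolmogorov complexity of s (up to a constant)."""
    return len(zlib.compress(s, 9))

regular = b"ab" * 500     # highly structured: a short program generates it
noise = os.urandom(1000)  # incompressible with overwhelming probability
print(K_upper(regular), K_upper(noise))
```

The structured string compresses to a few dozen bytes while the random one does not, mirroring the structural/random distinction the abstract describes.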
Ricky X. F. Chen (2016)
This article serves as a brief introduction to Shannon information theory. The concepts of information, Shannon entropy, and channel capacity are mainly covered. All these concepts are developed in a thoroughly combinatorial flavor. Some issues usually not addressed in the literature are discussed here as well. In particular, we show that channel capacity can seemingly be defined differently, in a way that would allow us to potentially transmit more messages in a fixed, sufficiently long time duration. However, for a channel carrying a finite number of letters, the channel capacity unfortunately remains the same as the Shannon limit.
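The Shannon entropy mentioned here, $H = -\sum_i p_i \log_2 p_i$ bits per symbol, can be estimated directly from symbol frequencies; a minimal sketch (the empirical-frequency estimator below is a standard illustration, not taken from the article):

```python
from collections import Counter
from math import log2

def shannon_entropy(msg: str) -> float:
    """Empirical Shannon entropy of a message, in bits per symbol:
    H = sum over symbols of (c/n) * log2(n/c), where c is the symbol
    count and n the message length."""
    counts = Counter(msg)
    n = len(msg)
    return sum(c / n * log2(n / c) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0: a constant message carries no information
print(shannon_entropy("abab"))  # 1.0 bit per symbol: two equiprobable symbols
print(shannon_entropy("abcd"))  # 2.0 bits per symbol: four equiprobable symbols
```

The uniform case $H = \log_2 k$ for $k$ equiprobable symbols is the combinatorial counting argument the abstract alludes to.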
Constraints on entropies are considered to be the laws of information theory. Even though the pursuit of their discovery has been a central theme of research in information theory, the algorithmic aspects of constraints on entropies remain largely unexplored. Here, we initiate an investigation of decision problems about constraints on entropies by placing several different such problems into levels of the arithmetical hierarchy. We establish the following results on checking the validity over all almost-entropic functions: first, validity of a Boolean information constraint arising from a monotone Boolean formula is co-recursively enumerable; second, validity of tight conditional information constraints is in $\Pi^0_3$. Furthermore, under some restrictions, validity of conditional information constraints with slack is in $\Sigma^0_2$, and validity of information inequality constraints involving max is Turing equivalent to validity of information inequality constraints (with no max involved). We also prove that the classical implication problem for conditional independence statements is co-recursively enumerable.
A systematic study of the central depletion of proton density has been performed in the isotonic chains of nuclei with neutron numbers $N = 20$ and $28$ using different variants of the relativistic mean-field (RMF) models. These models include either the non-linear contributions from the mesons with density-independent coupling constants, or the non-linearity of the mesonic fields realized through density-dependent coupling strengths. The central depletion in deformed nuclei tends to disappear irrespective of the occupancy of the $2s_{1/2}$ state, in contrast to spherical nuclei, in which the unoccupancy of the $2s_{1/2}$ state leads to the central depletion. Due to the differences in the strength of the spin-orbit potentials in these models, the central depletions are found to be model dependent. The influence of the central depletion on the neutron-skin thickness is also investigated. It appears that the effects of the central depletion do not percolate far enough to leave their fingerprints on the trends of the neutron-skin thickness.
