We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining 'information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are fundamentally different. We indicate how recent developments within the theory allow one to formally distinguish between 'structural' (meaningful) and 'random' information as measured by the Kolmogorov structure function, which leads to a mathematical formalization of Occam's razor in inductive inference. We end by discussing some of the philosophical implications of the theory.
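Kolmogorov complexity itself is uncomputable, but any real compressor yields a computable upper bound, which is enough to illustrate the structural-versus-random distinction named above. A minimal Python sketch, assuming zlib as a stand-in compressor (the choice of compressor and the 1000-byte strings are illustrative only):

```python
import os
import zlib

def compressed_length(data: bytes, level: int = 9) -> int:
    """Length of the zlib-compressed data: a computable upper bound
    on the (uncomputable) Kolmogorov complexity K(data)."""
    return len(zlib.compress(data, level))

# A highly regular ("structural") string compresses well ...
regular = b"ab" * 500
# ... while a "random" string of the same length does not.
random_bytes = os.urandom(1000)

print(compressed_length(regular))       # small: the pattern has a short description
print(compressed_length(random_bytes))  # close to 1000: no short description exists
```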
We propose a novel theoretical model, the blocked and hierarchical variational autoencoder (BHiVAE), to obtain better disentangled representations. It is well known that information theory offers strong explanatory power for neural networks, so we start …
The main contribution of this paper is to design an Information Retrieval (IR) technique based on Algorithmic Information Theory (using the Normalized Compression Distance, NCD), statistical techniques (outlier detection), and a novel organization of the database …
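The NCD mentioned above has a standard closed form, NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)), where C(·) is the compressed size under a fixed compressor. A minimal sketch, assuming zlib as the compressor (practical IR systems often use stronger compressors such as bzip2 or LZMA):

```python
import zlib

def C(data: bytes) -> int:
    """Compressed size of data under zlib (a stand-in for an ideal compressor)."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance:
       NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    Close to 0 for very similar objects, close to 1 for unrelated ones."""
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

doc = b"algorithmic information theory and compression"
print(ncd(doc, doc))                                             # near 0: identical
print(ncd(doc, b"completely different topic: cooking recipes"))  # nearer 1
```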
Constraints on entropies are considered to be the laws of information theory. Even though the pursuit of their discovery has been a central theme of research in information theory, the algorithmic aspects of constraints on entropies remain largely unexplored.
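As a concrete instance of such a constraint, the most basic Shannon-type inequality states that H(X) + H(Y) ≥ H(X, Y), i.e. mutual information is nonnegative. The sketch below checks it numerically on random joint distributions; this is an illustration of what a "law of information theory" asserts, not part of the paper's method:

```python
import math
import random

def H(p):
    """Shannon entropy (in bits) of a probability vector p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def random_joint(nx, ny):
    """A random joint distribution p(x, y) over an nx-by-ny alphabet."""
    w = [[random.random() for _ in range(ny)] for _ in range(nx)]
    s = sum(map(sum, w))
    return [[wij / s for wij in row] for row in w]

# Check H(X) + H(Y) >= H(X, Y) (equivalently I(X; Y) >= 0)
# on many random joint distributions.
for _ in range(1000):
    p = random_joint(3, 4)
    px = [sum(row) for row in p]                           # marginal of X
    py = [sum(p[i][j] for i in range(3)) for j in range(4)]  # marginal of Y
    pxy = [p[i][j] for i in range(3) for j in range(4)]      # joint, flattened
    assert H(px) + H(py) >= H(pxy) - 1e-9  # holds for every distribution
```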
This article serves as a brief introduction to Shannon information theory. The concepts of information, Shannon entropy, and channel capacity are the main topics covered. All these concepts are developed in a purely combinatorial flavor. Some issues usually …
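The two central quantities named above have short closed forms: the entropy of a biased coin, H(p) = −p log₂ p − (1 − p) log₂(1 − p), and the capacity of the binary symmetric channel, C = 1 − H(p). A small illustrative sketch:

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p): entropy of a biased coin, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of the binary symmetric channel with crossover probability p:
       C = 1 - H(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

print(binary_entropy(0.5))  # 1.0 bit: a fair coin is maximally uncertain
print(bsc_capacity(0.11))   # ~0.5: this noisy channel carries half a bit per use
```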
We prove that mutual information is in fact the negative of copula entropy, and based on this result we propose a method for mutual information estimation.
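The claimed identity MI(X; Y) = −H_c, where H_c is the entropy of the copula of (X, Y), suggests an estimator: rank-transform the samples to obtain the empirical copula, estimate its entropy, and negate. The sketch below uses a plain 2-D histogram for the entropy step, which is a simplification of this abstract's method (the copula-entropy literature typically uses k-nearest-neighbor estimators); function names are illustrative:

```python
import numpy as np

def empirical_copula(x, y):
    """Map each sample to its normalized rank: an estimate of the copula
    of (X, Y), which strips away the marginal distributions."""
    n = len(x)
    u = np.argsort(np.argsort(x)) / (n - 1)
    v = np.argsort(np.argsort(y)) / (n - 1)
    return u, v

def mi_via_copula_entropy(x, y, bins=16):
    """Estimate MI(X; Y) = -H_c (in nats): fit the copula density with a
    2-D histogram, compute its differential entropy, and negate it."""
    u, v = empirical_copula(x, y)
    hist, _, _ = np.histogram2d(u, v, bins=bins,
                                range=[[0, 1], [0, 1]], density=True)
    cell = (1.0 / bins) ** 2                # area of each histogram cell
    p = hist[hist > 0]
    h_c = -np.sum(p * np.log(p)) * cell     # differential entropy of the copula
    return -h_c                             # MI is the negative copula entropy

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = x + 0.5 * rng.normal(size=5000)              # strongly dependent pair
print(mi_via_copula_entropy(x, y))               # clearly positive
print(mi_via_copula_entropy(x, rng.normal(size=5000)))  # near 0: independent
```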