
An Introductory Review of Information Theory in the Context of Computational Neuroscience

Published by: Dr. Shiro Ikeda
Publication date: 2011
Research field: Information engineering
Language: English





This paper introduces several fundamental concepts in information theory from the perspective of their origins in engineering. Understanding such concepts is important in neuroscience for two reasons. First, simply applying formulae from information theory without understanding the assumptions behind their definitions can lead to erroneous results and conclusions. Second, this century will see a convergence of information theory and neuroscience; information theory will expand its foundations to incorporate biological processes more comprehensively, thereby helping to reveal how neuronal networks achieve their remarkable information-processing abilities.
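A classic example of such a pitfall (a minimal sketch of my own, not taken from the paper) is the small-sample bias of the naive plug-in entropy estimator: applying the textbook formula H = -Σ p log₂ p directly to empirical spike-count frequencies systematically underestimates the entropy when the number of trials is small relative to the alphabet size.

```python
# Minimal sketch (assumptions mine, not from the paper): the plug-in
# entropy estimator applied to empirical frequencies is biased downward
# when the number of trials is small relative to the alphabet size.
import numpy as np

rng = np.random.default_rng(0)

def plugin_entropy_bits(samples, n_symbols):
    """Naive entropy estimate (bits) from empirical symbol frequencies."""
    counts = np.bincount(samples, minlength=n_symbols)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

K = 8                                   # hypothetical response alphabet
true_entropy = np.log2(K)               # uniform source: exactly 3 bits

# Average the naive estimate over many small "experiments" of N trials.
N, repeats = 20, 2000
estimates = [plugin_entropy_bits(rng.integers(0, K, size=N), K)
             for _ in range(repeats)]
print(f"true entropy       : {true_entropy:.3f} bits")
print(f"mean naive estimate: {np.mean(estimates):.3f} bits (biased low)")
```

Bias corrections (e.g., Miller-Madow) and more careful estimators exist precisely because the assumptions behind the plain formula (many trials per symbol, i.i.d. sampling) are often not satisfied by neural data.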




Read also

We develop a theoretical framework for defining and identifying flows of information in computational systems. Here, a computational system is assumed to be a directed graph, with clocked nodes that send transmissions to each other along the edges of the graph at discrete points in time. We are interested in a definition that captures the dynamic flow of information about a specific message, and which guarantees an unbroken information path between appropriately defined inputs and outputs in the directed graph. Prior measures, including those based on Granger Causality and Directed Information, fail to provide clear assumptions and guarantees about when they correctly reflect information flow about a message. We take a systematic approach---iterating through candidate definitions and counterexamples---to arrive at a definition for information flow that is based on conditional mutual information, and which satisfies desirable properties, including the existence of information paths. Finally, we describe how information flow might be detected in a noiseless setting, and provide an algorithm to identify information paths on the time-unrolled graph of a computational system.
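As a rough illustration of the quantity this definition is built on (a plug-in sketch under my own assumptions, not the authors' detection algorithm), the conditional mutual information between a message M and a transmission X, given side information Z, can be estimated from joint samples:

```python
# Minimal sketch (assumptions mine, not the authors' detection algorithm):
# plug-in estimate of the conditional mutual information I(M; X | Z), the
# quantity the proposed definition of information flow is built on.
import random
from collections import Counter
from math import log2

def conditional_mutual_information(triples):
    """I(M; X | Z) in bits, plug-in estimate from (m, x, z) observations."""
    n = len(triples)
    p_mxz = Counter(triples)
    p_mz = Counter((m, z) for m, _, z in triples)
    p_xz = Counter((x, z) for _, x, z in triples)
    p_z = Counter(z for _, _, z in triples)
    cmi = 0.0
    for (m, x, z), c in p_mxz.items():
        cmi += (c / n) * log2((c / n) * (p_z[z] / n)
                              / ((p_mz[(m, z)] / n) * (p_xz[(x, z)] / n)))
    return cmi

# Toy "edge transmission" on one edge of the unrolled graph: X is a copy of
# the one-bit message M, while Z is independent noise, so the edge should
# carry the full 1 bit of information about M even after conditioning on Z.
random.seed(0)
samples = [(m, m, random.randint(0, 1)) for m in
           (random.randint(0, 1) for _ in range(5000))]
print(f"I(M; X | Z) ~ {conditional_mutual_information(samples):.3f} bits")
```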
Neri Merhav, Igal Sason (2019)
We explore a well-known integral representation of the logarithmic function, and demonstrate its usefulness in obtaining compact, easily-computable exact formulas for quantities that involve expectations and higher moments of the logarithm of a positive random variable (or the logarithm of a sum of positive random variables). The integral representation of the logarithm is proved useful in a variety of information-theoretic applications, including universal lossless data compression, entropy and differential entropy evaluations, and the calculation of the ergodic capacity of the single-input, multiple-output (SIMO) Gaussian channel with random parameters (known to both transmitter and receiver). This integral representation and its variants are anticipated to serve as a useful tool in additional applications, as a rigorous alternative to the popular (but non-rigorous) replica method (at least in some situations).
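For context, the representation in question is presumably the standard Frullani-type identity below (a sketch in my own notation, not quoted from the paper); its usefulness comes from exchanging expectation and integration:

```latex
% Frullani-type integral representation of the logarithm, for x > 0:
\[
  \ln x \;=\; \int_{0}^{\infty} \frac{e^{-u} - e^{-u x}}{u}\, \mathrm{d}u .
\]
% For a positive random variable X (assuming Fubini applies), taking
% expectations on both sides turns E[ln X] into an integral of the
% Laplace transform of X, which is often available in closed form:
\[
  \mathbb{E}[\ln X] \;=\; \int_{0}^{\infty}
    \frac{e^{-u} - \mathbb{E}\!\left[e^{-u X}\right]}{u}\, \mathrm{d}u .
\]
```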
A new computational private information retrieval (PIR) scheme based on random linear codes is presented. A matrix of messages from a McEliece scheme is used to query the server with carefully chosen errors. The server responds with the sum of the scalar multiples of the rows of the query matrix and the files. The user recovers the desired file by erasure decoding the response. Contrary to code-based cryptographic systems, the scheme presented here enables the use of truly random codes, not only codes disguised as such. Further, we show the relation to the so-called error subspace search problem and quotient error search problem, which we assume to be difficult, and show that the scheme is secure against attacks based on solving these problems.
In an effort to develop the foundations for a non-stochastic theory of information, the notion of $\delta$-mutual information between uncertain variables is introduced as a generalization of Nair's non-stochastic information functional. Several properties of this new quantity are illustrated, and used to prove a channel coding theorem in a non-stochastic setting. Namely, it is shown that the largest $\delta$-mutual information between received and transmitted codewords over $\epsilon$-noise channels equals the $(\epsilon, \delta)$-capacity. This notion of capacity generalizes the Kolmogorov $\epsilon$-capacity to packing sets of overlap at most $\delta$, and is a variation of a previous definition proposed by one of the authors. Results are then extended to more general noise models, and to non-stochastic, memoryless, stationary channels. Finally, sufficient conditions are established for the factorization of the $\delta$-mutual information and to obtain a single-letter capacity expression. Compared to previous non-stochastic approaches, the presented theory admits the possibility of decoding errors as in Shannon's probabilistic setting, while retaining a worst-case, non-stochastic character.
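In notation of my own choosing (a sketch of the statement above, not the authors' exact formulation), the coding theorem reads:

```latex
% (epsilon, delta)-capacity as the largest delta-mutual information
% between transmitted and received codewords over an epsilon-noise
% channel; the notation here is illustrative, not the paper's.
\[
  C_{\epsilon,\delta} \;=\; \sup_{X} \, I_{\delta}(X ; Y_{\epsilon}),
\]
% where X ranges over uncertain variables modelling the transmitted
% codewords and Y_epsilon is the corresponding received variable.
```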
The 10th Asia-Europe workshop on Concepts in Information Theory and Communications (AEW10) was held in Boppard, Germany, on June 21-23, 2017. It is based on a longstanding cooperation between Asian and European scientists. The first workshop was held in Eindhoven, the Netherlands, in 1989. The idea of the workshop is threefold: 1) to improve communication between scientists in different parts of the world; 2) to exchange knowledge and ideas; and 3) to pay tribute to a well-respected and special scientist.