Transient Dissipation and Structural Costs of Physical Information Transduction

Publication date: 2016
Language: English





A central result that arose in applying information theory to the stochastic thermodynamics of nonlinear dynamical systems is the Information-Processing Second Law (IPSL): the physical entropy of the universe can decrease if compensated by the Shannon-Kolmogorov-Sinai entropy change of appropriate information-carrying degrees of freedom. In particular, the asymptotic-rate IPSL precisely delineates the thermodynamic functioning of autonomous Maxwellian demons and information engines. How do these systems begin to function as engines, Landauer erasers, and error correctors? Here, we identify a minimal, inescapable transient dissipation engendered by physical information processing that is not captured by asymptotic rates, but is critical to adaptive thermodynamic processes such as those found in biological systems. As a component of this transient dissipation, we also identify an implementation-dependent cost that varies from one physical substrate to another for the same information-processing task. Applying these results to producing structured patterns from a structureless information reservoir, we show that retrodictive generators achieve the minimal costs. The results establish the thermodynamic toll imposed by a physical system's structure as it comes to optimally transduce information.
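
For orientation, here is a schematic statement of the asymptotic-rate IPSL in the form commonly used for information ratchets; the notation is assumed here rather than taken from the abstract. The average work extracted per symbol interval is bounded by the change in Shannon entropy rate between input and output sequences:

\[
  \langle W \rangle \;\le\; k_B T \ln 2 \,\bigl( h'_\mu - h_\mu \bigr),
\]

where $h_\mu$ and $h'_\mu$ are the entropy rates of the input and output symbol sequences, $k_B$ is Boltzmann's constant, and $T$ is the reservoir temperature. The transient and implementation-dependent costs identified above are contributions beyond what this asymptotic rate captures.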



Related research

We study dynamical reversibility in stationary stochastic processes from an information-theoretic perspective. Extending earlier work on the reversibility of Markov chains, we focus on finitary processes with arbitrarily long conditional correlations. In particular, we examine stationary processes represented or generated by edge-emitting, finite-state hidden Markov models. Surprisingly, we find pervasive temporal asymmetries in the statistics of such stationary processes, with the consequence that the computational resources necessary to generate a process in the forward and reverse temporal directions are generally not the same. In fact, an exhaustive survey indicates that most stationary processes are irreversible. We study in detail the ensuing relations between model topology in different representations, the process's statistical properties, and its reversibility. A process's temporal asymmetry is efficiently captured using two canonical unifilar representations of the generating model, the forward-time and reverse-time epsilon-machines. We analyze example irreversible processes whose epsilon-machine presentations change size under time reversal, including one that has a finite number of recurrent causal states in one direction but an infinite number in the opposite. From the forward-time and reverse-time epsilon-machines, we are able to construct a symmetrized, but nonunifilar, generator of a process: the bidirectional machine. Using the bidirectional machine, we show how to directly calculate a process's fundamental information properties, many of which are otherwise only poorly approximated via process samples. The tools we introduce and the insights we offer provide a better understanding of the many facets of reversibility and irreversibility in stochastic processes.
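
As an illustration of the temporal asymmetry discussed above, the following minimal Python sketch compares forward and time-reversed word probabilities for a stationary Markov chain that violates detailed balance; the biased three-state chain is a hypothetical example, not one drawn from the paper.

import itertools
import numpy as np

# A three-symbol Markov chain with a cyclic bias (0 -> 1 -> 2 -> 0 preferred).
# Violating detailed balance makes the stationary process temporally irreversible.
T = np.array([[0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8],
              [0.8, 0.1, 0.1]])

# Stationary distribution: left eigenvector of T for eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

def word_prob(word):
    # Probability of observing the symbol sequence `word` in steady state.
    p = pi[word[0]]
    for a, b in zip(word, word[1:]):
        p *= T[a, b]
    return p

# Total-variation distance between the length-3 word distribution and its time reversal.
asym = sum(abs(word_prob(w) - word_prob(w[::-1]))
           for w in itertools.product(range(3), repeat=3))
print(f"forward/reverse total-variation distance: {asym / 2:.4f}")
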
Biological sensory systems react to changes in their surroundings. They are characterized by fast response and slow adaptation to varying environmental cues. Insofar as sensory adaptive systems map environmental changes to changes of their internal degrees of freedom, they can be regarded as computational devices manipulating information. Landauer established that information is ultimately physical, and that its manipulation is subject to the entropic and energetic bounds of thermodynamics. Thus the fundamental costs of biological sensory adaptation can be elucidated by tracking how the information the system has about its environment is altered. These bounds are particularly relevant for small organisms, which, unlike everyday computers, operate at very low energies. In this paper, we establish a general framework for the thermodynamics of information processing in sensing. With it, we quantify how, during sensory adaptation, information about the past is erased while information about the present is gathered. This process produces entropy larger than the amount of old information erased and has an energetic cost bounded by the amount of new information written to memory. We apply these principles to the E. coli chemotaxis pathway during binary ligand concentration changes. In this regime, we quantify the amount of information stored by each methyl group and show that receptors consume energy in the range of the information-theoretic minimum. Our work provides a basis for further inquiries into more complex phenomena, such as gradient sensing and frequency response.
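
Schematically, and only as an illustration of the two bounds stated above (the symbols $I_{\mathrm{erased}}$ and $I_{\mathrm{written}}$ are introduced here for readability, not taken from the paper), with information measured in bits:

\[
  \Delta S_{\mathrm{tot}} \;\ge\; k_B \ln 2 \; I_{\mathrm{erased}},
  \qquad
  W \;\ge\; k_B T \ln 2 \; I_{\mathrm{written}},
\]

where $\Delta S_{\mathrm{tot}}$ is the total entropy produced during adaptation, $W$ is the energetic cost, $I_{\mathrm{erased}}$ is the old information about the environment that is discarded, and $I_{\mathrm{written}}$ is the new information recorded in memory.
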
One of the most fundamental questions one can ask about a pair of random variables X and Y is the value of their mutual information. Unfortunately, this task is often stymied by the extremely large dimension of the variables. We might hope to replace each variable by a lower-dimensional representation that preserves the relationship with the other variable. The theoretically ideal implementation is the use of minimal sufficient statistics, where it is well known that either X or Y can be replaced by its minimal sufficient statistic about the other while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. As an important corollary, we consider the case where one variable is a stochastic process's past and the other its future, with the present viewed as a memoryful channel. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
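
A minimal Python sketch of the core claim, on a toy joint distribution chosen so that some outcomes of each variable induce identical conditional distributions; the distribution and the lumping procedure are illustrative assumptions, not the paper's construction. Replacing each variable by the lumped (sufficient-statistic-like) version leaves the mutual information unchanged.

import numpy as np

# Toy joint distribution p(x, y): rows 0 and 1 share p(y|x), as do rows 2 and 3.
pxy = np.array([[0.10, 0.05, 0.05],
                [0.10, 0.05, 0.05],
                [0.02, 0.08, 0.10],
                [0.04, 0.16, 0.20]])
pxy = pxy / pxy.sum()

def mutual_information(p):
    # I[X;Y] in bits for a joint distribution given as a 2-D array.
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / (px @ py)[mask])))

def lump_rows(p):
    # Merge rows whose conditional distributions p(y|x) coincide: a stand-in
    # for replacing X by its minimal sufficient statistic about Y.
    cond = np.round(p / p.sum(axis=1, keepdims=True), 12)
    labels, groups = {}, []
    for row in cond:
        key = tuple(row)
        labels.setdefault(key, len(labels))
        groups.append(labels[key])
    lumped = np.zeros((len(labels), p.shape[1]))
    for i, g in enumerate(groups):
        lumped[g] += p[i]
    return lumped

# Lump X first, then Y (rows of the transposed joint), and compare.
lumped = lump_rows(lump_rows(pxy).T).T
print(mutual_information(pxy), mutual_information(lumped))  # the two values agree
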
The Fisher-Rao metric from Information Geometry is related to phase transition phenomena in classical statistical mechanics. Several studies propose to extend the use of Information Geometry to study more general phase transitions in complex systems. However, it is unclear whether the Fisher-Rao metric does indeed detect these more general transitions, especially in the absence of a statistical model. In this paper we study the transitions between patterns in the Gray-Scott reaction-diffusion model using Fisher information. We describe the system by a probability density function that represents the size distribution of blobs in the patterns and compute its Fisher information with respect to changing the two rate parameters of the underlying model. We estimate the distribution non-parametrically so that we do not assume any statistical model. The resulting Fisher map can be interpreted as a phase-map of the different patterns. Lines with high Fisher information can be considered as boundaries between regions of parameter space where patterns with similar characteristics appear. These lines of high Fisher information can be interpreted as phase transitions between complex patterns.
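
A small, self-contained Python sketch of the finite-difference, non-parametric approach described above; the Gaussian test case and the Hellinger-distance estimator are illustrative assumptions, not the paper's blob-size pipeline for the Gray-Scott model.

import numpy as np

def fisher_information(samples_a, samples_b, d_theta, bins):
    # Estimate the Fisher information of a scalar observable with respect to a
    # parameter theta, from samples drawn at theta and theta + d_theta.
    # Densities are estimated non-parametrically with histograms, and the Fisher
    # information is recovered from the squared Hellinger distance via
    # F(theta) ~= 8 * H^2 / d_theta^2 for small d_theta.
    p_a, edges = np.histogram(samples_a, bins=bins, density=True)
    p_b, _ = np.histogram(samples_b, bins=bins, density=True)
    widths = np.diff(edges)
    hellinger_sq = 0.5 * np.sum((np.sqrt(p_a) - np.sqrt(p_b)) ** 2 * widths)
    return 8.0 * hellinger_sq / d_theta ** 2

# Sanity check against a known answer: for a unit-variance Gaussian, the Fisher
# information with respect to the mean is 1.
rng = np.random.default_rng(0)
bins = np.linspace(-5.0, 5.0, 81)
a = rng.normal(0.0, 1.0, 200_000)
b = rng.normal(0.2, 1.0, 200_000)
print(fisher_information(a, b, d_theta=0.2, bins=bins))  # roughly 1
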
The von Neumann graph entropy is a measure of graph complexity based on the Laplacian spectrum. It has recently found applications in various learning tasks driven by networked data. However, it is computationally demanding and hard to interpret using simple structural patterns. Due to the close relation between the Laplacian spectrum and the degree sequence, we conjecture that the structural information, defined as the Shannon entropy of the normalized degree sequence, might be a good approximation of the von Neumann graph entropy that is both scalable and interpretable. In this work, we therefore study the difference between the structural information and the von Neumann graph entropy, termed the entropy gap. Based on the fact that the degree sequence is majorized by the Laplacian spectrum, we prove for the first time that the entropy gap lies between $0$ and $\log_2 e$ for any undirected unweighted graph. Consequently, we certify that the structural information is a good approximation of the von Neumann graph entropy that achieves provable accuracy, scalability, and interpretability simultaneously. This approximation is further applied to two entropy-related tasks, network design and graph similarity measurement, for which a novel graph similarity measure and fast algorithms are proposed. Our experimental results on graphs of various scales and types show that the entropy gap remains very small across a wide range of graphs, including weighted graphs. As an approximation of the von Neumann graph entropy, the structural information is the only method among the prominent ones that achieves both high efficiency and high accuracy. It is at least two orders of magnitude faster than SLaQ with comparable accuracy. Our structural-information-based methods also exhibit superior performance in the two entropy-related tasks.
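
A small self-contained Python sketch of the two quantities compared above, using the trace-normalized Laplacian spectrum for the von Neumann entropy and the normalized degree sequence for the structural information; the ring-graph example is illustrative, not one of the paper's benchmarks.

import numpy as np

def von_neumann_entropy(adj):
    # Shannon entropy (bits) of the Laplacian spectrum normalized to sum to one.
    degrees = adj.sum(axis=1)
    laplacian = np.diag(degrees) - adj
    eigvals = np.linalg.eigvalsh(laplacian)
    p = eigvals / eigvals.sum()
    p = p[p > 1e-12]          # drop (numerically) zero eigenvalues
    return float(-(p * np.log2(p)).sum())

def structural_information(adj):
    # Shannon entropy (bits) of the normalized degree sequence.
    degrees = adj.sum(axis=1)
    p = degrees / degrees.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Example: a ring graph on 6 nodes.
n = 6
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0

gap = structural_information(adj) - von_neumann_entropy(adj)
print(f"entropy gap: {gap:.4f} bits (claimed to lie in [0, log2 e])")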