Entropy is a classical measure used to quantify the amount of information or complexity in a system. Various entropy-based measures, such as functional and spectral entropies, have been proposed in brain network analysis. However, they are less widely used than traditional graph-theoretic measures such as global and local efficiency, either because they are not well defined on a graph or because their biological meaning is difficult to interpret. In this paper, we propose a new entropy-based graph invariant called volume entropy. It measures the exponential growth rate of the number of paths in a graph, a relevant measure if information flows through the graph indefinitely. We model information propagation on a graph by the generalized Markov system associated with the weighted edge-transition matrix, and we estimate the volume entropy using the stationary equation of this system. A prominent advantage of using the stationary equation is that it assigns a distribution of weights to the edges of the brain graph, which we call the stationary distribution. The stationary distribution reveals the information capacity of edges and the direction of information flow on a brain graph. Simulation results show that volume entropy distinguishes the underlying graph topology and geometry better than existing graph measures. In an application to brain imaging data, the volume entropy of brain graphs was significantly related to healthy normal aging from the 20s to the 60s. In addition, the stationary distribution of information propagation gives new insight into the information flow of functional brain graphs.
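As an illustrative sketch only (the abstract's construction uses the weighted edge-transition matrix and its stationary equation, which are not reproduced here), the "exponential growth rate of the number of paths" idea can be demonstrated on a toy unweighted graph: the number of length-n walks grows like the n-th power of the adjacency matrix's Perron eigenvalue, so the growth-rate entropy is its logarithm.

```python
import numpy as np

# Toy undirected graph (a 4-cycle with one chord), as an adjacency matrix.
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 0, 1],
    [1, 1, 1, 0],
], dtype=float)

# The number of length-n walks grows like lam**n, where lam is the
# largest (Perron) eigenvalue of A, so the growth-rate entropy is log(lam).
lam = np.linalg.eigvals(A).real.max()
entropy = np.log(lam)

# Empirical check: count walks of length n via matrix powers and compare
# the per-step growth rate against log(lam).
n = 30
walks = np.linalg.matrix_power(A, n).sum()
empirical = np.log(walks) / n
print(entropy, empirical)  # the two rates should nearly coincide
```

For weighted brain graphs, the same spectral idea applies to the edge-transition matrix rather than the node adjacency matrix; this sketch only shows the growth-rate principle.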
Recent developments in the graph-theoretic analysis of complex networks have led to a deeper understanding of brain networks. Many complex networks show similar macroscopic behaviors despite differences in their microscopic details. Probably the two most often observed characteristics of complex networks are the scale-free and small-world properties. In this paper, we explore whether brain networks exhibit scale-free and small-world properties, among other graph-theoretic properties.
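A minimal sketch of the small-world test mentioned above, on synthetic graphs rather than the brain networks studied in the paper: a small-world network combines much higher clustering than a density-matched random graph with a comparably short characteristic path length.

```python
import networkx as nx

# Watts-Strogatz graph vs. an Erdos-Renyi graph with the same size and
# edge count. (Illustrative only; the paper analyzes empirical brain graphs.)
n, k, p = 200, 6, 0.1
sw = nx.connected_watts_strogatz_graph(n, k, p, seed=0)
er = nx.gnm_random_graph(n, sw.number_of_edges(), seed=0)
if not nx.is_connected(er):
    # Path lengths are only defined on a connected graph; keep the giant component.
    er = er.subgraph(max(nx.connected_components(er), key=len)).copy()

C_sw, L_sw = nx.average_clustering(sw), nx.average_shortest_path_length(sw)
C_er, L_er = nx.average_clustering(er), nx.average_shortest_path_length(er)

# Small-worldness index: clustering far above random, path length near random.
sigma = (C_sw / C_er) / (L_sw / L_er)
print(f"C={C_sw:.3f} vs {C_er:.3f}, L={L_sw:.2f} vs {L_er:.2f}, sigma={sigma:.1f}")
```

A sigma well above 1 is the usual small-world signature; testing the scale-free property would additionally require fitting the degree distribution.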
Neurodegenerative diseases and traumatic brain injuries (TBI) are among the main causes of cognitive dysfunction in humans. Both conditions exhibit the extensive presence of focal axonal swellings (FAS). FAS compromise the information encoded in spike trains, thus leading to potentially severe functional deficits. Complicating our understanding of the impact of FAS are our inability to access small-scale injuries with non-invasive methods, the overall complexity of neuronal pathologies, and our limited knowledge of how networks process biological signals. Building on Hopfield's pioneering work, we extend a model of associative memory to account for FAS and its impact on memory encoding. We calibrate all FAS parameters from biophysical observations of their statistical distribution and size, providing a framework to simulate the effects of brain disorders on memory recall performance. A face recognition example is used to demonstrate and validate the functionality of the novel model. Our results link memory recall ability to observed FAS statistics, allowing for a description of different stages of brain disorders within neuronal networks. This provides a first theoretical model to bridge experimental observations of FAS in neurodegeneration and TBI with compromised memory recall, thus closing the large gap between theory and experiment on how biological signals are processed in damaged, high-dimensional functional networks. The work further lends new insight into possible diagnostic tools to measure cognitive deficits.
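A minimal sketch of the underlying idea, not the authors' calibrated model: store patterns in a Hopfield network with the standard Hebbian rule, then crudely mimic FAS by silencing a random fraction of connections (the paper instead calibrates the injury model from biophysical FAS statistics) and compare recall from a noisy cue before and after damage.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5

# Store P random binary patterns with the standard Hebbian rule.
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(W, x, steps=20):
    """Synchronous sign-update dynamics, stopping early at a fixed point."""
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Probe with a noisy cue (about 10% flipped bits) before and after "injury".
cue = patterns[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
healthy = recall(W, cue.copy())

# Crude stand-in for focal axonal swellings: zero out a random 40% of
# connections (hypothetical rate; the paper derives this from FAS data).
damage = rng.random(W.shape) < 0.4
injured = recall(np.where(damage, 0.0, W), cue.copy())

overlap = lambda x: abs(x @ patterns[0]) / N  # 1.0 means perfect recall
print(overlap(healthy), overlap(injured))
```

The overlap with the stored pattern serves as the recall-performance readout that the paper relates to FAS severity.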
Structural covariance analysis is a widely used structural MRI analysis method that characterises the correlations of morphology between brain regions over a group of subjects. To our knowledge, little has been investigated in terms of the comparability of results between different data sets, or the reliability of results over the same subjects across rescan sessions, image resolutions, or FreeSurf
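The core computation described above is simple to state: given one morphometric value per region per subject, the structural covariance graph is the region-by-region correlation matrix taken over subjects. A sketch on synthetic data (all names and values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical morphometry table: rows = subjects, columns = brain regions
# (e.g. mean cortical thickness per region; values here are synthetic, with
# a shared subject-level factor inducing inter-regional correlation).
n_subjects, n_regions = 100, 6
base = rng.standard_normal((n_subjects, 1))
thickness = 2.5 + 0.1 * base + 0.05 * rng.standard_normal((n_subjects, n_regions))

# Structural covariance: correlation between regions across subjects.
cov_graph = np.corrcoef(thickness, rowvar=False)
print(cov_graph.shape)  # one entry per region pair
```

Questions of reliability across data sets, rescan sessions, and processing settings then amount to comparing such matrices computed under the different conditions.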
We consider a pair of stochastic integrate-and-fire neurons receiving correlated stochastic inputs. The evolution of this system can be described by the corresponding Fokker-Planck equation with non-trivial boundary conditions resulting from the refractory period and firing threshold. We propose a finite volume method that is orders of magnitude faster than the Monte Carlo methods traditionally used to model such systems. The resulting numerical approximations are proved to be accurate, nonnegative, and to integrate to 1. We also approximate the transient evolution of the system using an Ornstein--Uhlenbeck process, and use the result to examine the properties of the joint output of cell pairs. The results suggest that the joint output of a cell pair is most sensitive to changes in input variance, and less sensitive to changes in input mean and correlation.
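For orientation, here is a sketch of the Monte Carlo baseline that the proposed finite volume method outperforms: two leaky integrate-and-fire neurons driven by Gaussian inputs with a shared component, with the output correlation read off from binned spike counts. All parameter values are illustrative, and the refractory period is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, T = 1e-4, 20.0                 # time step (s) and simulated duration (s)
tau, mu, sigma = 0.02, 1.1, 0.5    # membrane time constant, input mean/std
vth, vreset, c = 1.0, 0.0, 0.3     # threshold, reset, input correlation

steps = int(T / dt)
v = np.zeros(2)
spikes = np.zeros((2, steps), dtype=bool)

for t in range(steps):
    # Correlated inputs: a shared plus an independent Gaussian component.
    shared = rng.standard_normal()
    xi = np.sqrt(c) * shared + np.sqrt(1 - c) * rng.standard_normal(2)
    v += dt / tau * (mu - v) + sigma * np.sqrt(dt / tau) * xi
    fired = v >= vth
    spikes[:, t] = fired
    v[fired] = vreset            # hard reset at threshold crossing

counts = spikes.sum(axis=1)
# Output correlation of spike counts in 50 ms bins.
bins = spikes.reshape(2, -1, int(0.05 / dt)).sum(axis=2)
rho_out = np.corrcoef(bins)[0, 1]
print(counts, rho_out)
```

Estimating how rho_out responds to changes in mu, sigma, or c by rerunning such simulations is exactly the slow step that a Fokker-Planck-based finite volume solver replaces.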
Effective connectivity analysis, in which the flow of information between even remote brain regions is inferred from the parameters of a predictive dynamical model, can greatly improve the insight into brain function that we obtain from fMRI data. As opposed to biologically inspired models, some techniques, such as Granger causality (GC), are purely data-driven and rely on statistical prediction and temporal precedence. While powerful and widely applicable, this approach can suffer from two main limitations when applied to BOLD fMRI data: the confounding effect of the hemodynamic response function (HRF), and conditioning on a large number of variables in the presence of short time series. For task-related fMRI, neural population dynamics can be captured by modeling signal dynamics with explicit exogenous inputs; for resting-state fMRI, on the other hand, the absence of explicit inputs makes this task more difficult, unless one relies on specific prior physiological hypotheses. To overcome these issues and to allow a more general approach, we present a simple and novel blind-deconvolution technique for the BOLD fMRI signal. Regarding the second limitation, fully multivariate conditioning with short and noisy data leads to computational problems due to overfitting; furthermore, conceptual issues arise in the presence of redundancy. We therefore apply partial conditioning to a limited subset of variables within the framework of information theory, as recently proposed. Combining these two improvements, we compare effective networks estimated from BOLD and from deconvolved BOLD signals and draw conclusions about their differences.
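A minimal sketch of the pairwise, lag-1 Granger causality idea invoked above (not the paper's deconvolution or partial-conditioning pipeline): a source signal Granger-causes a target if adding the source's past to an autoregressive model of the target reduces the prediction error.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy pair of signals where x drives y with a one-sample lag.
n = 2000
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

def gc(src, dst):
    """Lag-1 Granger index: log of restricted/full residual variance (>= 0)."""
    past_d, past_s, target = dst[:-1], src[:-1], dst[1:]
    # Restricted model: predict the target from its own past only.
    Xr = np.column_stack([past_d, np.ones_like(past_d)])
    res_r = target - Xr @ np.linalg.lstsq(Xr, target, rcond=None)[0]
    # Full model: additionally include the source's past.
    Xf = np.column_stack([past_d, past_s, np.ones_like(past_d)])
    res_f = target - Xf @ np.linalg.lstsq(Xf, target, rcond=None)[0]
    return np.log(res_r.var() / res_f.var())

print(gc(x, y), gc(y, x))  # the x -> y direction should dominate
```

Applied to raw BOLD signals, such estimates are distorted by regional HRF variability, which is precisely what the blind-deconvolution step is meant to mitigate before the conditioned GC analysis.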