
Is there sufficient evidence for criticality in cortical systems?

Posted by Alain Destexhe
Publication date: 2020
Research field: Biology
Paper language: English





Many studies have found evidence that the brain operates at a critical point, a process known as self-organized criticality. A recent paper found remarkable scalings suggestive of criticality in systems as different as neural cultures and anesthetized or awake brains. We point out here that the diversity of these states would question any claimed role of criticality in information processing. Furthermore, we show that two non-critical systems pass all the tests for criticality, a control that was not provided in the original article. We conclude that such false positives demonstrate that the presence of criticality in the brain is still not proven and that we need better methods than scaling analyses.
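The scaling analyses in question typically fit avalanche-size distributions to a power law and read off the exponent. A minimal Python sketch of the standard maximum-likelihood exponent fit (function names are illustrative, not from the paper); as the abstract argues, passing such a fit is not by itself proof of criticality:

```python
import math
import random

def sample_power_law(alpha, s_min, n, seed=0):
    """Draw n samples from a continuous power law p(s) ~ s**(-alpha), s >= s_min,
    via inverse-CDF sampling."""
    rng = random.Random(seed)
    return [s_min * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(n)]

def mle_exponent(samples, s_min):
    """Maximum-likelihood estimate of the continuous power-law exponent."""
    n = len(samples)
    return 1.0 + n / sum(math.log(s / s_min) for s in samples)

sizes = sample_power_law(alpha=2.0, s_min=1.0, n=50_000)
alpha_hat = mle_exponent(sizes, s_min=1.0)
print(f"estimated exponent: {alpha_hat:.3f}")  # close to the true value 2.0
```

A good-looking exponent estimate like this one is exactly the kind of signature that the abstract reports non-critical control systems can also produce.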


Read also

Complex systems are typically characterized as an intermediate situation between a complete regular structure and a random system. Brain signals can be studied as a striking example of such systems: cortical states can range from highly synchronous and ordered neuronal activity (with higher spiking variability) to desynchronized and disordered regimes (with lower spiking variability). It has been recently shown, by testing independent signatures of criticality, that a phase transition occurs in a cortical state of intermediate spiking variability. Here, we use a symbolic information approach to show that, despite the monotonic increase of the Shannon entropy between ordered and disordered regimes, we can determine an intermediate state of maximum complexity based on the Jensen disequilibrium measure. More specifically, we show that statistical complexity is maximized close to criticality for cortical spiking data of urethane-anesthetized rats, as well as for a network model of excitable elements that presents a critical point of a non-equilibrium phase transition.
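The complexity measure described here combines Shannon entropy with a Jensen-Shannon disequilibrium relative to the uniform distribution. A minimal Python sketch of one common formulation (the López-Ruiz/Mancini/Calbet-style statistical complexity; the paper's exact normalization may differ), which vanishes for both fully ordered and fully disordered distributions and is positive in between:

```python
import math

def shannon_entropy(p):
    """Shannon entropy normalized to [0, 1] (logarithm base = alphabet size)."""
    h = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return h / math.log(len(p))

def jensen_disequilibrium(p):
    """Jensen-Shannon divergence between p and the uniform distribution,
    normalized by its maximum (attained at a delta distribution)."""
    n = len(p)
    u = [1.0 / n] * n
    m = [(pi + ui) / 2.0 for pi, ui in zip(p, u)]
    h = lambda q: -sum(qi * math.log(qi) for qi in q if qi > 0)
    js = h(m) - 0.5 * h(p) - 0.5 * h(u)
    # maximum JS divergence to uniform is reached by a delta distribution
    js_max = h([(1.0 + 1.0 / n) / 2.0] + [1.0 / (2.0 * n)] * (n - 1)) - 0.5 * math.log(n)
    return js / js_max

def statistical_complexity(p):
    """Statistical complexity: normalized entropy times disequilibrium."""
    return shannon_entropy(p) * jensen_disequilibrium(p)

for p in ([0.25] * 4, [1.0, 0.0, 0.0, 0.0], [0.7, 0.1, 0.1, 0.1]):
    print(p, round(statistical_complexity(p), 4))
```

The uniform distribution has maximal entropy but zero disequilibrium, while a delta distribution has maximal disequilibrium but zero entropy; only intermediate distributions score a nonzero complexity, which is what lets the measure peak near the transition.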
In low-level sensory systems, it is still unclear how the noisy information collected locally by neurons may give rise to a coherent global percept. This is well demonstrated for the detection of motion in the aperture problem: as luminance of an elongated line is symmetrical along its axis, tangential velocity is ambiguous when measured locally. Here, we develop the hypothesis that motion-based predictive coding is sufficient to infer global motion. Our implementation is based on a context-dependent diffusion of a probabilistic representation of motion. We observe in simulations a progressive solution to the aperture problem similar to physiology and behavior. We demonstrate that this solution is the result of two underlying mechanisms. First, we demonstrate the formation of a tracking behavior favoring temporally coherent features independent of their texture. Second, we observe that incoherent features are explained away, while coherent information diffuses progressively to the global scale. Most previous models included ad hoc mechanisms such as end-stopped cells or a selection layer to track specific luminance-based features as necessary conditions to solve the aperture problem. Here, we have proved that motion-based predictive coding, as it is implemented in this functional model, is sufficient to solve the aperture problem. This solution may give insights into the role of prediction underlying a large class of sensory computations.
The measurement of the present-day temperature of the Cosmic Microwave Background (CMB), $T_0 = 2.72548 \pm 0.00057$ K ($1\sigma$), made by the Far-InfraRed Absolute Spectrophotometer (FIRAS), is one of the most precise measurements ever made in Cosmology. On the other hand, estimates of the Hubble Constant, $H_0$, obtained from measurements of the CMB temperature fluctuations assuming the standard $\Lambda$CDM model exhibit a large ($4.1\sigma$) tension when compared with low-redshift, model-independent observations. Recently, some authors argued that a slight change in $T_0$ could alleviate or solve the $H_0$-tension problem. Here, we investigate evidence for a hotter or colder universe by performing an independent analysis from currently available temperature-redshift $T(z)$ measurements. Our analysis (parametric and non-parametric) shows a good agreement with the FIRAS measurement and a discrepancy of $\gtrsim 1.9\sigma$ from the $T_0$ values required to solve the $H_0$ tension. This result reinforces the idea that a solution of the $H_0$-tension problem in fact requires either a better understanding of the systematic errors on the $H_0$ measurements or new physics.
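The temperature-redshift test compares $T(z)$ data against the standard adiabatic prediction $T(z) = T_0 (1 + z)$. A minimal Python sketch of the closed-form least-squares estimate of $T_0$, using hypothetical noiseless mock data (not the paper's measurements) and ignoring per-point error weighting:

```python
def fit_T0(z, T):
    """Least-squares estimate of T0 in the standard scaling T(z) = T0 * (1 + z)."""
    num = sum(Ti * (1.0 + zi) for zi, Ti in zip(z, T))
    den = sum((1.0 + zi) ** 2 for zi in z)
    return num / den

# hypothetical noiseless mock data following the standard scaling
T0_true = 2.72548  # K, the FIRAS value
z = [0.0, 0.5, 1.0, 2.0, 3.0]
T = [T0_true * (1.0 + zi) for zi in z]
print(f"fitted T0 = {fit_T0(z, T):.5f} K")
```

A real analysis would weight each point by its measurement uncertainty and also consider departures from the linear scaling, which is what the paper's parametric and non-parametric approaches address.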
We reconstruct the equation of state $w(z)$ of dark energy (DE) using a recently released data set containing 172 type Ia supernovae without assuming the prior $w(z) \geq -1$ (in contrast to previous studies). We find that dark energy evolves rapidly and metamorphoses from dust-like behaviour at high $z$ ($w \simeq 0$ at $z \sim 1$) to a strongly negative equation of state at present ($w \lesssim -1$ at $z \simeq 0$). Dark energy metamorphosis appears to be a robust phenomenon which manifests for a large variety of SNe data samples, provided one does not invoke the weak energy prior $\rho + p \geq 0$. Invoking this prior considerably weakens the rate of growth of $w(z)$. These results demonstrate that dark energy with an evolving equation of state provides a compelling alternative to a cosmological constant if data are analysed in a prior-free manner and the weak energy condition is not imposed by hand.
We provide evidence that cumulative distributions of absolute normalized returns for the 100 American companies with the highest market capitalization uncover a critical behavior for different time scales $\Delta t$. Such cumulative distributions, in accordance with a variety of complex (and financial) systems, can be modeled by the cumulative distribution functions of $q$-Gaussians, the distribution function that, in the context of nonextensive statistical mechanics, maximizes a non-Boltzmannian entropy. These $q$-Gaussians are characterized by two parameters, namely $(q, \beta)$, that are uniquely defined by $\Delta t$. From these dependencies, we find a monotonic relationship between $q$ and $\beta$, which can be seen as evidence of criticality. We numerically determine the various exponents which characterize this criticality.
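A $q$-Gaussian is built from the Tsallis $q$-exponential, $e_q(x) = [1 + (1-q)x]_+^{1/(1-q)}$, which reduces to the ordinary exponential as $q \to 1$. A minimal, unnormalized Python sketch (parameter values are illustrative, not fitted to any data):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential; recovers exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    # for q < 1 the support is compact: the function is zero where base <= 0
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_gaussian(x, q, beta):
    """Unnormalized q-Gaussian, the maximizer of the Tsallis entropy."""
    return q_exp(-beta * x * x, q)

# q -> 1 recovers the ordinary Gaussian shape exp(-beta * x**2)
print(q_gaussian(1.0, 1.000001, 0.5), math.exp(-0.5))
```

For $q > 1$ the tails decay as a power law (which is why $q$-Gaussians fit heavy-tailed return distributions), while for $q < 1$ the support is compact; $\beta$ sets the width in both cases.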