Neural recordings are nonstationary time series, i.e. their properties typically change over time. Identifying specific changes, e.g. those induced by a learning task, can shed light on the underlying neural processes. However, such changes of interest are often masked by strong unrelated changes, which can be of physiological origin or due to measurement artifacts. We propose a novel algorithm for disentangling such different causes of nonstationarity, thereby enabling better neurophysiological interpretation for a wider set of experimental paradigms. A key ingredient is the repeated application of Stationary Subspace Analysis (SSA) at different temporal scales. The usefulness of our explorative approach is demonstrated in simulations, theory and EEG experiments with 80 Brain-Computer-Interfacing (BCI) subjects.
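To make the idea of applying SSA at several temporal scales concrete, here is a minimal sketch, assuming multichannel recordings in a NumPy array; it uses a simplified stationarity criterion (how little each whitened direction's variance fluctuates across epochs) as a stand-in for the full SSA optimization, and all epoch lengths, dimensions, and function names are illustrative rather than the authors' implementation.

```python
import numpy as np

def stationary_directions(x, epoch_len, n_stationary):
    """Crude stand-in for SSA: rank projection directions by how little
    their variance fluctuates across epochs of `epoch_len` samples.
    x has shape (n_samples, n_channels)."""
    n, c = x.shape
    epochs = [x[i:i + epoch_len] for i in range(0, n - epoch_len + 1, epoch_len)]
    covs = np.array([np.cov(e, rowvar=False) for e in epochs])
    # Whiten by the average covariance so overall scale differences vanish.
    evals, evecs = np.linalg.eigh(covs.mean(axis=0))
    w = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
    epoch_vars = np.array([np.diag(w @ cvar @ w.T) for cvar in covs])
    scores = epoch_vars.var(axis=0)          # non-stationarity per direction
    order = np.argsort(scores)               # most stationary first
    return w[order[:n_stationary]]           # rows project onto the stationary part

# Repeated application at different temporal scales, as in the abstract:
rng = np.random.default_rng(0)
recording = rng.standard_normal((10000, 16))     # placeholder for real EEG
for epoch_len in (250, 1000, 4000):              # short vs. long time scales
    p = stationary_directions(recording, epoch_len, n_stationary=4)
    stationary_part = recording @ p.T            # (n_samples, n_stationary)
    print(epoch_len, stationary_part.shape)
```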
Neural population activity is theorized to reflect an underlying dynamical structure. This structure can be accurately captured using state space models with explicit dynamics, such as those based on recurrent neural networks (RNNs). However, using recurrence to explicitly model dynamics necessitates sequential processing of data, slowing real-time applications such as brain-computer interfaces. Here we introduce the Neural Data Transformer (NDT), a non-recurrent alternative. We test the NDT's ability to capture autonomous dynamical systems by applying it to synthetic datasets with known dynamics and to data from monkey motor cortex during a reaching task that is well modeled by RNNs. The NDT models these datasets as well as state-of-the-art recurrent models. Further, its non-recurrence enables 3.9 ms inference, well within the loop time of real-time applications and more than 6 times faster than recurrent baselines on the monkey reaching dataset. These results suggest that an explicit dynamics model is not necessary to model autonomous neural population dynamics. Code: https://github.com/snel-repo/neural-data-transformers
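The following is a minimal sketch of the non-recurrent modelling idea, assuming binned spike counts and a masked-reconstruction objective with a Poisson likelihood; layer sizes, bin counts, and the toy data are assumptions, and the sketch is not the released NDT code (see the linked repository for that).

```python
import torch
import torch.nn as nn

class TinySpikeTransformer(nn.Module):
    """Non-recurrent model of binned spike counts: every time bin attends
    to every other bin, so a whole trial is processed in one pass."""
    def __init__(self, n_neurons, d_model=64, n_heads=4, n_layers=2, n_bins=50):
        super().__init__()
        self.embed = nn.Linear(n_neurons, d_model)
        self.pos = nn.Parameter(torch.zeros(1, n_bins, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.readout = nn.Linear(d_model, n_neurons)

    def forward(self, spikes):                 # spikes: (batch, bins, neurons)
        h = self.embed(spikes.float()) + self.pos
        h = self.encoder(h)
        return self.readout(h)                 # log firing rates per bin

model = TinySpikeTransformer(n_neurons=30)
spikes = torch.poisson(torch.full((8, 50, 30), 2.0))   # fake binned counts
# Mask a random subset of bins and train the model to reconstruct them.
mask = torch.rand(8, 50, 1) < 0.25
log_rates = model(spikes.masked_fill(mask, 0.0))
loss = nn.PoissonNLLLoss(log_input=True)(log_rates[mask.expand_as(spikes)],
                                          spikes[mask.expand_as(spikes)])
loss.backward()
print(float(loss))
```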
ANDy (Activity Networks with Delays) is a discrete-time framework aimed at the qualitative modelling of time-dependent activities. Its modular and concise syntax makes ANDy suitable for an easy and natural modelling of time-dependent biological systems (e.g., regulatory pathways). Activities involve entities playing the role of activators, inhibitors or products of biochemical network operation. Activities may have a given duration, i.e., the time required to obtain results. An entity may represent an object (e.g., an agent, a biochemical species or a family thereof) with a local attribute, a state denoting its level (e.g., concentration, strength). Entity levels may change as a result of an activity or may decay gradually as time passes. The semantics of ANDy is formally given via high-level Petri nets, which ensures a degree of modularity. As main results, we show that ANDy systems have finite-state representations even for potentially infinite processes and that the framework adapts well to the modelling of toxic behaviours. As an illustration, we present a classification of toxicity properties and give some hints on how they can be verified with existing tools on ANDy systems. A small case study on blood glucose regulation is provided to exemplify the ANDy framework and the toxicity properties.
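ANDy's actual syntax and Petri-net semantics are not reproduced here; as a purely hypothetical illustration of the ingredients listed above (entities with levels, activities with activators, inhibitors, products and a duration, and per-step decay), a discrete-time toy simulation could look as follows, with all names and numbers invented.

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    activators: dict      # entity -> minimum level required
    inhibitors: dict      # entity -> level at or above which the activity is blocked
    products: dict        # entity -> level change delivered on completion
    duration: int         # time steps between firing and effect

@dataclass
class Network:
    levels: dict                                  # entity -> current discrete level
    decay: dict                                   # entity -> levels lost per step
    activities: list
    pending: list = field(default_factory=list)   # [time_left, products] pairs

    def step(self):
        # Fire every enabled activity; its products arrive after `duration` steps.
        for a in self.activities:
            enabled = (all(self.levels[e] >= v for e, v in a.activators.items())
                       and all(self.levels[e] < v for e, v in a.inhibitors.items()))
            if enabled:
                self.pending.append([a.duration, a.products])
        # Deliver products whose delay has elapsed, then apply decay.
        for item in self.pending:
            item[0] -= 1
        for _, prods in [p for p in self.pending if p[0] <= 0]:
            for e, v in prods.items():
                self.levels[e] = max(0, self.levels[e] + v)
        self.pending = [p for p in self.pending if p[0] > 0]
        for e, d in self.decay.items():
            self.levels[e] = max(0, self.levels[e] - d)

# Toy glucose/insulin-flavoured example (rules and levels are made up).
net = Network(levels={"glucose": 5, "insulin": 0},
              decay={"glucose": 0, "insulin": 1},
              activities=[Activity({"glucose": 4}, {}, {"insulin": 2}, duration=2),
                          Activity({"insulin": 1}, {}, {"glucose": -1}, duration=1)])
for t in range(5):
    net.step()
    print(t, net.levels)
```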
The technology to generate Spatially Resolved Transcriptomics (SRT) data is rapidly improving and is being applied to investigate a variety of biological tissues. The ability to interrogate spatially localised gene expression is critical for gaining new insight into tissue development, but the appropriate tools to analyse these data are still emerging. This chapter reviews available packages and pipelines for the analysis of different SRT datasets, with a focus on identifying spatially variable genes (SVGs) alongside other aims, while discussing the importance of, and challenges in, establishing a standardised ground truth in the biological data for benchmarking.
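As one concrete example of what "spatially variable" means computationally, the sketch below computes Moran's I, a spatial-autocorrelation statistic used by several SVG methods, with a simple k-nearest-neighbour weighting; the weighting scheme, data shapes, and toy data are illustrative choices, not the recommendation of any particular package discussed in the chapter.

```python
import numpy as np
from scipy.spatial import cKDTree

def morans_i(values, coords, k=6):
    """Moran's I for one gene: values (n_spots,), coords (n_spots, 2).
    Each spot is connected to its k nearest neighbours with equal weight."""
    n = len(values)
    tree = cKDTree(coords)
    _, idx = tree.query(coords, k=k + 1)   # first neighbour is the spot itself
    z = values - values.mean()
    num = sum(z[i] * z[j] for i in range(n) for j in idx[i, 1:])
    w_sum = n * k                          # total weight (all weights equal 1)
    return (n / w_sum) * num / (z @ z)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(500, 2))
flat_gene = rng.poisson(3, 500).astype(float)        # no spatial pattern
patterned = flat_gene + 5 * (coords[:, 0] > 5)       # expression differs by region
print("flat     :", round(morans_i(flat_gene, coords), 3))
print("patterned:", round(morans_i(patterned, coords), 3))
```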
The activity of a sparse network of leaky integrate-and-fire neurons is carefully revisited with reference to a regime of bona fide asynchronous dynamics. The study is preceded by a finite-size scaling analysis, carried out to identify a setup where collective synchronization is negligible. The comparison between quenched and annealed networks reveals the emergence of substantial differences when the coupling strength is increased, via a scenario somewhat reminiscent of a phase transition. For sufficiently strong synaptic coupling, quenched networks exhibit highly bursting neural activity, well reproduced by a self-consistent approach based on the assumption that the input synaptic current is the superposition of independent renewal processes. The distribution of interspike intervals turns out to be relatively long-tailed, a crucial feature required for the self-sustainment of the bursting activity in a regime where neurons operate on average (much) below threshold. A semi-quantitative analogy with Ornstein-Uhlenbeck processes helps validate this interpretation. Finally, an alternative explanation in terms of Poisson processes is offered under the additional assumption of mutual correlations among excitatory and inhibitory spikes.
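To make the setup concrete, here is a minimal sketch of a sparse LIF network with quenched random connectivity and delta-pulse synapses; all parameter values (in-degree, coupling, external drive, E/I ratio) are placeholders and are not meant to reproduce the regimes studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Network and neuron parameters (illustrative, not the paper's values).
N, K = 1000, 20                   # neurons, in-degree of the sparse graph
n_exc = int(0.8 * N)              # excitatory neurons; the rest are inhibitory
tau, v_th, v_reset = 20e-3, 1.0, 0.0
g, dt, T = 0.05, 0.1e-3, 2.0      # coupling, time step (s), duration (s)
i_ext = 1.05 * v_th / tau         # constant suprathreshold external drive

# Quenched connectivity: each neuron draws its K presynaptic partners once.
signs = np.concatenate([np.full(n_exc, 1.0), np.full(N - n_exc, -4.0)])
W = np.zeros((N, N))
for j in range(N):
    pre = rng.choice(N, K, replace=False)
    W[j, pre] = g * signs[pre]

v = rng.uniform(v_reset, v_th, N)
spikes = [[] for _ in range(N)]

for step in range(int(T / dt)):
    fired = np.flatnonzero(v >= v_th)
    v[fired] = v_reset
    for i in fired:
        spikes[i].append(step * dt)
    v += dt * (-v / tau + i_ext)              # leaky integration (Euler)
    if fired.size:
        v += W[:, fired].sum(axis=1)          # delta-pulse synaptic input

isis = np.concatenate([np.diff(t) for t in spikes if len(t) > 1])
print("mean rate (Hz):", sum(len(t) for t in spikes) / (N * T))
print("ISI CV        :", isis.std() / isis.mean())
```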
Cryo-electron tomography (cryo-ET) is an emerging technology for the 3D visualization of the structural organization and interactions of subcellular components at near-native state and sub-molecular resolution. Tomograms captured by cryo-ET contain heterogeneous structures representing the complex and dynamic subcellular environment. Since the structures are neither purified nor fluorescently labeled, the spatial organization of, and interactions between, both known and unknown structures can be studied in their native environment. The rapid advances of cryo-ET have generated abundant 3D cellular imaging data. However, the systematic localization, identification, segmentation, and structural recovery of subcellular components require efficient and accurate large-scale image analysis methods. We introduce AITom, an open-source artificial intelligence platform for cryo-ET researchers. AITom provides many public as well as in-house algorithms for performing cryo-ET data analysis through both traditional template-based or template-free approaches and deep learning approaches. AITom also supports remote interactive analysis. Comprehensive tutorials are provided to guide the user through each analysis module. We welcome researchers and developers to join this collaborative open-source software development project. Availability: https://github.com/xulabs/aitom
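AITom's own API is documented in its tutorials and is not reproduced here; as a concept-only illustration of the template-based idea, the sketch below computes a plain 3D cross-correlation map between a synthetic volume and a template using NumPy FFTs, without the normalization, CTF, or missing-wedge handling a real pipeline needs.

```python
import numpy as np

def cross_correlation_map(volume, template):
    """Circular 3D cross-correlation of `template` against `volume` via FFTs.
    Both arrays must share the same shape (zero-pad the template beforehand)."""
    f_vol = np.fft.fftn(volume)
    f_tmp = np.fft.fftn(template)
    return np.fft.ifftn(f_vol * np.conj(f_tmp)).real

rng = np.random.default_rng(0)
tomo = rng.normal(0, 1, (64, 64, 64))
# Plant a small cubic "particle" and search for it with a matching template.
tomo[30:34, 20:24, 40:44] += 3.0
template = np.zeros_like(tomo)
template[:4, :4, :4] = 1.0
cc = cross_correlation_map(tomo, template)
peak = np.unravel_index(np.argmax(cc), cc.shape)
print("correlation peak at:", peak)   # expected near the planted corner (30, 20, 40)
```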