
Rapidly detecting disorder in rhythmic biological signals: A spectral entropy measure to identify cardiac arrhythmias

Published by Phillip Staniczenko
Publication date: 2009
Research field: Biology, Physics
Paper language: English





We consider the use of a running measure of power spectrum disorder to distinguish between the normal sinus rhythm of the heart and two forms of cardiac arrhythmia: atrial fibrillation and atrial flutter. This spectral entropy measure is motivated by characteristic differences in the spectra of beat timings during the three rhythms. We plot patient data derived from ten-beat windows on a disorder map and identify rhythm-defining ranges in the level and variance of spectral entropy values. Employing the spectral entropy within an automatic arrhythmia detection algorithm enables the classification of periods of atrial fibrillation from the time series of patients' beats. When the algorithm is set to identify abnormal rhythms within 6 s, it agrees with 85.7% of the annotations of professional rhythm assessors; for a response time of 30 s this becomes 89.5%, and with 60 s it is 90.3%. The algorithm provides a rapid way to detect atrial fibrillation, demonstrating usable response times as low as 6 s. Measures of disorder in the frequency domain have practical significance in a range of biological signals: the techniques described in this paper have potential application for the rapid identification of disorder in other rhythmic signals.
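The abstract does not include an implementation, but the core measure is straightforward to sketch. Below is a minimal Python illustration of a running spectral entropy over ten-beat windows, assuming the input is a series of RR (beat-to-beat) intervals and a simple FFT-based power spectrum; the window length, synthetic test signals, and function names are illustrative, not the authors' actual pipeline or thresholds.

```python
import numpy as np

def spectral_entropy(rr_window):
    """Normalised Shannon entropy of the power spectrum of a window of
    RR (beat-to-beat) intervals. Returns a value in [0, 1]: low for an
    ordered (peaked) spectrum, high for a disordered (flat) one."""
    x = np.asarray(rr_window, dtype=float)
    x = x - x.mean()                       # remove the DC component
    psd = np.abs(np.fft.rfft(x)) ** 2      # one-sided power spectrum
    psd = psd[1:]                          # drop the zero-frequency bin
    p = psd / psd.sum()                    # normalise to a probability distribution
    p = p[p > 0]
    h = -(p * np.log(p)).sum()
    return h / np.log(len(psd))            # normalise by the maximum entropy

def running_spectral_entropy(rr_intervals, window=10):
    """Slide a ten-beat window over the RR series and return one
    entropy value per window."""
    rr = np.asarray(rr_intervals, dtype=float)
    return np.array([spectral_entropy(rr[i:i + window])
                     for i in range(len(rr) - window + 1)])

# Illustrative use: regular sinus-like intervals vs. irregular (AF-like) intervals.
rng = np.random.default_rng(0)
sinus = 0.8 + 0.02 * np.sin(0.3 * np.arange(100))   # nearly periodic, in seconds
af_like = 0.8 + 0.2 * rng.standard_normal(100)      # highly irregular
print(running_spectral_entropy(sinus).mean(), running_spectral_entropy(af_like).mean())
```

In this framing, a peaked spectrum (regular rhythm) gives a low entropy and a flat spectrum (fibrillation-like irregularity) gives a value near one; classification would then amount to thresholding the level and variance of these values over the disorder map, as described above.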




Read also

Background: Representing biological networks as graphs is a powerful approach to reveal underlying patterns, signatures, and critical components from high-throughput biomolecular data. However, graphs do not natively capture the multi-way relationships present among genes and proteins in biological systems. Hypergraphs are generalizations of graphs that naturally model multi-way relationships and have shown promise in modeling systems such as protein complexes and metabolic reactions. In this paper we seek to understand how hypergraphs can more faithfully identify, and potentially predict, important genes based on complex relationships inferred from genomic expression data sets. Results: We compiled a novel data set of transcriptional host response to pathogenic viral infections and formulated relationships between genes as a hypergraph where hyperedges represent significantly perturbed genes, and vertices represent individual biological samples with specific experimental conditions. We find that hypergraph betweenness centrality is a superior method for identification of genes important to viral response when compared with graph centrality. Conclusions: Our results demonstrate the utility of using hypergraphs to represent complex biological systems and highlight central important responses in common to a variety of highly pathogenic viruses.
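The paper's exact centrality computation is not given here; the sketch below shows one common way to score hyperedges (genes) by betweenness, assuming an s-line-graph construction in which two hyperedges are linked when they share at least s vertices (samples), with networkx handling the graph betweenness step. The toy gene and sample names are invented for illustration.

```python
import itertools
import networkx as nx

def hyperedge_betweenness(hyperedges, s=1):
    """Betweenness centrality of hyperedges via an s-line graph:
    nodes are hyperedge names; two hyperedges are linked when they
    share at least s vertices (here: s common samples)."""
    line_graph = nx.Graph()
    line_graph.add_nodes_from(hyperedges)
    for (name_a, verts_a), (name_b, verts_b) in itertools.combinations(hyperedges.items(), 2):
        if len(set(verts_a) & set(verts_b)) >= s:
            line_graph.add_edge(name_a, name_b)
    return nx.betweenness_centrality(line_graph)

# Toy example: genes (hyperedges) perturbed in particular samples (vertices).
genes = {
    "IFIT1": {"SARS_24h", "MERS_24h", "H5N1_12h"},
    "ISG15": {"SARS_24h", "H5N1_12h"},
    "IL6":   {"MERS_24h", "H1N1_48h"},
    "ACE2":  {"H1N1_48h"},
}
print(hyperedge_betweenness(genes, s=1))
```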
Fueled by breakthrough technology developments, the biological, biomedical, and behavioral sciences are now collecting more data than ever before. There is a critical need for time- and cost-efficient strategies to analyze and interpret these data to advance human health. The recent rise of machine learning as a powerful technique to integrate multimodality, multifidelity data, and reveal correlations between intertwined phenomena presents a special opportunity in this regard. However, classical machine learning techniques often ignore the fundamental laws of physics and result in ill-posed problems or non-physical solutions. Multiscale modeling is a successful strategy to integrate multiscale, multiphysics data and uncover mechanisms that explain the emergence of function. However, multiscale modeling alone often fails to efficiently combine large data sets from different sources and different levels of resolution. We show how machine learning and multiscale modeling can complement each other to create robust predictive models that integrate the underlying physics to manage ill-posed problems and explore massive design spaces. We critically review the current literature, highlight applications and opportunities, address open questions, and discuss potential challenges and limitations in four overarching topical areas: ordinary differential equations, partial differential equations, data-driven approaches, and theory-driven approaches. Towards these goals, we leverage expertise in applied mathematics, computer science, computational biology, biophysics, biomechanics, engineering mechanics, experimentation, and medicine. Our multidisciplinary perspective suggests that integrating machine learning and multiscale modeling can provide new insights into disease mechanisms, help identify new targets and treatment strategies, and inform decision making for the benefit of human health.
The eye lens is the most characteristic example of mammalian tissues exhibiting complex colloidal behaviour. In this paper we briefly describe how dynamics in colloidal suspensions can help address selected aspects of lens cataract, which is ultimately related to protein self-assembly under pathological conditions. Results from dynamic light scattering of eye lens homogenates over a wide range of protein concentrations were analyzed, and the various relaxation modes were identified in terms of collective and self-diffusion processes. Using this information as an input, the complex relaxation pattern of the intact lens nucleus was rationalized. The model of cold cataract - a phase separation effect of the lens cytoplasm upon cooling - was used to simulate lens cataract under in vitro conditions in an effort to determine the parameters of the correlation functions that can serve as reliable indicators of cataract onset. The applicability of dynamic light scattering as a non-invasive, early-diagnostic tool for ocular diseases is also demonstrated in the light of the findings of the present paper.
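As a rough illustration of how relaxation modes can be read off a correlation function, the sketch below fits a double exponential (one fast, collective-like mode and one slow, self-diffusion-like mode) to synthetic dynamic light scattering data with scipy; the model form, rates, and amplitudes are assumptions for illustration, not the analysis used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_mode_correlation(t, a1, gamma1, a2, gamma2, baseline):
    """Double-exponential model for the field correlation function g1(t):
    a fast (collective) mode and a slow (self-diffusion) mode."""
    return a1 * np.exp(-gamma1 * t) + a2 * np.exp(-gamma2 * t) + baseline

# Synthetic correlation data with two relaxation rates (arbitrary units).
t = np.logspace(-6, 0, 200)                     # lag times, seconds
true = two_mode_correlation(t, 0.6, 5e4, 0.4, 2e2, 0.0)
rng = np.random.default_rng(1)
data = true + 0.01 * rng.standard_normal(t.size)

p0 = [0.5, 1e4, 0.5, 1e2, 0.0]                  # rough initial guess
params, _ = curve_fit(two_mode_correlation, t, data, p0=p0, maxfev=10000)
a1, gamma1, a2, gamma2, baseline = params
print(f"fast mode: amplitude {a1:.2f}, rate {gamma1:.3g} 1/s")
print(f"slow mode: amplitude {a2:.2f}, rate {gamma2:.3g} 1/s")
```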
Mark C Leake, 2021
Here, we discuss a collection of cutting-edge techniques and applications in use today by some of the leading experts in the field of correlative approaches in single-molecule biophysics. A key difference, compared with traditional single-molecule biophysics approaches detailed previously, is the emphasis on the development and use of complex methods which explicitly combine multiple approaches to increase biological insight at the single-molecule level. These so-called correlative single-molecule biophysics methods rely on multiple, orthogonal tools and analyses, as opposed to any one single driving technique. Importantly, they span both in vivo and in vitro biological systems, as well as the interfaces between theory and experiment, in often highly integrated ways, very different to earlier, non-integrative approaches. The first applications of correlative single-molecule methods involved adapting a range of different experimental technologies to the same biological sample, with synchronised measurements. However, a greater variety of integrated methods is now emerging, including approaches applied to different samples at different times that still permit useful molecular-scale correlations to be performed. The resultant findings often enable far greater precision in the length and time scales of measurements, and a deeper understanding of the interplay between different processes in the same cell. Many new correlative single-molecule biophysics techniques also include more complex, physiologically relevant approaches, as well as an increasing number that combine advanced computational methods and mathematical analysis with experimental tools. Here we review the motivation behind the development of correlative single-molecule microscopy methods, their history, and recent progress in the field.
Pascal R Buenzli, 2014
The formation of new bone involves both the deposition of bone matrix and the formation of a network of cells embedded within the bone matrix, called osteocytes. Osteocytes derive from bone-synthesising cells (osteoblasts) that become buried in bone matrix during bone deposition. The generation of osteocytes is a complex process that remains incompletely understood. Whilst osteoblast burial determines the density of osteocytes, the expanding network of osteocytes in turn regulates osteoblast activity and osteoblast burial. In this paper, a spatiotemporal continuous model is proposed to investigate the osteoblast-to-osteocyte transition. The aims of the model are (i) to link dynamic properties of osteocyte generation with properties of the osteocyte network imprinted in bone, and (ii) to investigate Marotti's hypothesis that osteocytes prompt the burial of osteoblasts when they become covered with sufficient bone matrix. Osteocyte density is assumed in the model to be generated at the moving bone surface by a combination of osteoblast density, matrix secretory rate, rate of entrapment, and curvature of the bone substrate, but is found to be determined solely by the ratio of the instantaneous burial rate and matrix secretory rate. Osteocyte density does not explicitly depend on osteoblast density nor on curvature. Osteocyte apoptosis is also included to distinguish between the density of osteocyte lacunae and the density of live osteocytes. Experimental measurements of osteocyte lacuna densities are used to estimate the rate of burial of osteoblasts in bone matrix. These results suggest that: (i) the burial rate decreases during osteonal infilling, and (ii) the control of osteoblast burial by osteocytes is likely to emanate as a collective signal from a large group of osteocytes, rather than from the osteocytes closest to the bone deposition front.
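The central relation quoted above, that osteocyte density is set by the ratio of the instantaneous burial rate to the matrix secretory rate, can be illustrated numerically. In the sketch below both rate histories are invented purely for illustration and are not taken from the paper; only the ratio itself reflects the abstract's statement.

```python
import numpy as np

def osteocyte_density(burial_rate, secretory_rate):
    """Relative osteocyte density generated at the moving bone surface,
    taken here (following the abstract) as the ratio of the instantaneous
    burial rate to the matrix secretory rate."""
    return burial_rate / secretory_rate

# Illustrative (invented) time courses during osteonal infilling:
# both rates decline, but the burial rate declines faster.
t = np.linspace(0.0, 30.0, 7)                  # days since infilling starts
secretory_rate = 1.0 * np.exp(-t / 20.0)       # matrix volume secreted per unit surface per day
burial_rate = 0.05 * np.exp(-t / 10.0)         # osteoblasts buried per unit surface per day

for day, density in zip(t, osteocyte_density(burial_rate, secretory_rate)):
    print(f"day {day:5.1f}: relative osteocyte density {density:.4f}")
```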