
Detecting Fetal Alcohol Spectrum Disorder in children using Artificial Neural Network

Posted by: Sergio Contreras
Publication date: 2021
Research field: Informatics Engineering
Paper language: English





Fetal alcohol spectrum disorder (FASD) is a syndrome whose only difference from other childhood conditions is the mother's alcohol consumption during pregnancy. An earlier diagnosis of FASD improves the quality of life of children and adolescents. For this reason, this study focuses on evaluating the use of artificial neural networks (ANNs) to classify children with FASD and explores how accurate they are. ANNs have been used to diagnose cancer, diabetes, and other diseases in the medical field, and they are a tool that yields good results. The data come from a battery of tests administered to children aged 5-18 years, including psychometric tests, saccade eye movement tests, and diffusion tensor imaging (DTI). We study different configurations of ANNs with dense layers. The first one predicts 75% of the outcomes correctly for psychometric data. The other models include a feature layer, which we used to predict FASD from each test individually. These models correctly predict over 70% of the cases, and the psychometric and memory-guided tests reach over 88% accuracy. The results suggest that the ANN approach is a competitive and efficient methodology for detecting FASD. However, care should still be taken before using it as a diagnostic technique.
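As a rough illustration of the kind of model the abstract describes, the sketch below builds a small fully connected (dense-layer) binary classifier for tabular test scores. The feature count, layer sizes, training settings, and the placeholder data are assumptions made for the example, not the configuration or data used in the paper.

```python
# Minimal sketch of a dense ANN binary classifier for tabular test scores.
# Feature count, layer sizes, and training settings are illustrative only.
import numpy as np
import tensorflow as tf

n_features = 20                                            # e.g. psychometric scores (hypothetical)
X = np.random.rand(200, n_features).astype("float32")      # placeholder data
y = np.random.randint(0, 2, size=(200, 1))                 # 1 = FASD, 0 = control (placeholder)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_features,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),        # probability of FASD
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=20, batch_size=16, validation_split=0.2, verbose=0)
print(model.evaluate(X, y, verbose=0))                     # [loss, accuracy]
```

A sigmoid output with binary cross-entropy is the standard choice for a two-class problem like FASD versus control; per-test models of the kind the paper mentions would simply swap in the corresponding feature columns.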


Read also

As one of the most important paradigms of recurrent neural networks, the echo state network (ESN) has been applied to a wide range of fields, from robotics to medicine, finance, and language processing. A key feature of the ESN paradigm is its reservoir: a directed and weighted network of neurons that projects the input time series into a high-dimensional space where linear regression or classification can be applied. Despite extensive studies, the impact of the reservoir network on the ESN performance remains unclear. Combining tools from physics, dynamical systems and network science, we attempt to open the black box of ESN and offer insights to understand the behavior of general artificial neural networks. Through spectral analysis of the reservoir network we reveal a key factor that largely determines the ESN memory capacity and hence affects its performance. Moreover, we find that adding short loops to the reservoir network can tailor ESN for specific tasks and optimize learning. We validate our findings by applying ESN to forecast both synthetic and real benchmark time series. Our results provide a new way to design task-specific ESNs. More importantly, they demonstrate the power of combining tools from physics, dynamical systems and network science to offer new insights into the mechanisms of general artificial neural networks.
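For readers unfamiliar with the paradigm, here is a minimal numpy sketch of an echo state network: a fixed random reservoir, rescaled to a chosen spectral radius, is driven by the input series, and a ridge-regression readout is fit on the reservoir states. The reservoir size, scaling, toy signal, and washout length are illustrative choices, not the setups analysed in the paper.

```python
# Minimal echo state network sketch (numpy only).
import numpy as np

rng = np.random.default_rng(0)
n_res, spectral_radius, ridge = 200, 0.9, 1e-6

W_in = rng.uniform(-0.5, 0.5, size=(n_res, 1))             # input weights
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))            # reservoir weights
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius

u = np.sin(0.2 * np.arange(1000))                          # toy input series

states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t, u_t in enumerate(u):
    x = np.tanh(W_in[:, 0] * u_t + W @ x)                  # reservoir update
    states[t] = x

# Ridge-regression readout for one-step-ahead prediction, after a washout period.
S, y = states[100:-1], u[101:]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)
print("training MSE:", np.mean((S @ W_out - y) ** 2))
```

Only the readout is trained; the reservoir stays fixed, which is exactly why its spectral properties matter so much for memory capacity.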
In this work, a dense recurrent convolutional neural network (DRCNN) was constructed to detect sleep disorders including arousal, apnea and hypopnea using Polysomnography (PSG) measurement channels provided in the 2018 Physionet challenge database. Our model structure is composed of multiple dense convolutional units (DCU) followed by a bidirectional long-short term memory (LSTM) layer followed by a softmax output layer. The sleep events including sleep stages, arousal regions and multiple types of apnea and hypopnea are manually annotated by experts which enables us to train our proposed network using a multi-task learning mechanism. Three binary cross-entropy loss functions corresponding to sleep/wake, target arousal and apnea-hypopnea/normal detection tasks are summed up to generate our overall network loss function that is optimized using the Adam method. Our model performance was evaluated using two metrics: the area under the precision-recall curve (AUPRC) and the area under the receiver operating characteristic curve (AUROC). To measure our model generalization, 4-fold cross-validation was also performed. For training, our model was applied to full night recording data. Finally, the average AUPRC and AUROC values associated with the arousal detection task were 0.505 and 0.922, respectively on our testing dataset. An ensemble of four models trained on different data folds improved the AUPRC and AUROC to 0.543 and 0.931, respectively. Our proposed algorithm achieved the first place in the official stage of the 2018 Physionet challenge for detecting sleep arousals with AUPRC of 0.54 on the blind testing dataset.
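A rough Keras sketch of the multi-task idea follows: a shared convolutional encoder over PSG channels feeds a bidirectional LSTM, and three sigmoid heads (sleep/wake, arousal, apnea-hypopnea) are each trained with binary cross-entropy, with the per-head losses summed by the framework. The channel count, window length, and layer sizes are placeholders, not the challenge-winning architecture.

```python
# Sketch of a shared encoder with three binary heads and summed losses.
import tensorflow as tf

n_channels, seq_len = 12, 4096                      # hypothetical PSG input shape
inputs = tf.keras.Input(shape=(seq_len, n_channels))

x = inputs
for filters in (32, 64):                            # stand-in for dense convolutional units
    x = tf.keras.layers.Conv1D(filters, 7, padding="same", activation="relu")(x)
    x = tf.keras.layers.MaxPooling1D(4)(x)
x = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True))(x)

heads = {name: tf.keras.layers.Dense(1, activation="sigmoid", name=name)(x)
         for name in ("sleep_wake", "arousal", "apnea_hypopnea")}

model = tf.keras.Model(inputs, heads)
model.compile(optimizer="adam",
              loss={name: "binary_crossentropy" for name in heads})  # losses are summed
model.summary()
```

Each head produces a per-timestep probability, so expert annotations for all three tasks can supervise the same shared representation.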
In this paper we investigate the usage of machine learning for interpreting measured sensor values in sensor modules. In particular we analyze the potential of artificial neural networks (ANNs) on low-cost micro-controllers with a few kilobytes of memory to semantically enrich data captured by sensors. The focus is on classifying temporal data series with a high level of reliability. Design and implementation of ANNs are analyzed considering Feed Forward Neural Networks (FFNNs) and Recurrent Neural Networks (RNNs). We validate the developed ANNs in a case study of optical hand gesture recognition on an 8-bit micro-controller. The best reliability was found for an FFNN with two layers and 1493 parameters requiring an execution time of 36 ms. We propose a workflow to develop ANNs for embedded devices.
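The sketch below shows a tiny two-layer feed-forward classifier of the same flavour, together with the kind of 8-bit TensorFlow Lite conversion typically used for micro-controller deployment. The window length, class count, and layer width are invented and only roughly match the ~1.5k-parameter budget reported in the paper.

```python
# Tiny two-layer FFNN over a flattened sensor window, plus TFLite quantization.
import tensorflow as tf

window, n_classes = 40, 4                        # hypothetical samples per gesture, gesture classes
model = tf.keras.Sequential([
    tf.keras.Input(shape=(window,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
print(model.count_params())                      # stays in the low thousands

# For a few-kilobyte target one would typically quantize the weights to 8 bits,
# e.g. with the default TensorFlow Lite optimizations:
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_bytes = converter.convert()
print(len(tflite_bytes), "bytes")
```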
Stochastic Gradient Descent (SGD) has proven to be remarkably effective in optimizing deep neural networks that employ ever-larger numbers of parameters. Yet, improving the efficiency of large-scale optimization remains a vital and highly active area of research. Recent work has shown that deep neural networks can be optimized in randomly-projected subspaces of much smaller dimensionality than their native parameter space. While such training is promising for more efficient and scalable optimization schemes, its practical application is limited by inferior optimization performance. Here, we improve on recent random subspace approaches as follows: Firstly, we show that keeping the random projection fixed throughout training is detrimental to optimization. We propose re-drawing the random subspace at each step, which yields significantly better performance. We realize further improvements by applying independent projections to different parts of the network, making the approximation more efficient as network dimensionality grows. To implement these experiments, we leverage hardware-accelerated pseudo-random number generation to construct the random projections on-demand at every optimization step, allowing us to distribute the computation of independent random directions across multiple workers with shared random seeds. This yields significant reductions in memory and is up to 10 times faster for the workloads in question.
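The core trick of re-drawing the random projection at every step can be shown on a toy quadratic problem: the full gradient is projected onto a fresh low-dimensional random basis each iteration and the update is taken inside that subspace. The dimensions, learning rate, and the loss itself are illustrative assumptions, unrelated to the networks studied in the paper.

```python
# Random-subspace descent with the projection re-drawn every step (toy quadratic).
import numpy as np

rng = np.random.default_rng(0)
D, d, lr = 1000, 20, 0.02                        # native dim, subspace dim, step size

target = rng.standard_normal(D)                  # minimiser of the toy loss
theta = np.zeros(D)

def loss_and_grad(theta):
    diff = theta - target
    return 0.5 * diff @ diff, diff               # 0.5 * ||theta - target||^2 and its gradient

for step in range(501):
    loss, g = loss_and_grad(theta)
    P = rng.standard_normal((D, d)) / np.sqrt(d)     # fresh random projection each step
    theta -= lr * (P @ (P.T @ g))                    # step restricted to the random subspace
    if step % 100 == 0:
        print(step, round(loss, 4))
```

Keeping P fixed would confine the whole trajectory to one d-dimensional slice; re-drawing it lets the low-dimensional steps eventually explore the full parameter space, which is the effect the paper exploits.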
Ruixuan Yan, Agung Julius (2021)
In this paper, we propose a neuro-symbolic framework called weighted Signal Temporal Logic Neural Network (wSTL-NN) that combines the characteristics of neural networks and temporal logics. Weighted Signal Temporal Logic (wSTL) formulas are recursively composed of subformulas that are combined using logical and temporal operators. The quantitative semantics of wSTL is defined such that the quantitative satisfaction of subformulas with higher weights has more influence on the quantitative satisfaction of the overall wSTL formula. In the wSTL-NN, each neuron corresponds to a wSTL subformula, and its output corresponds to the quantitative satisfaction of the formula. We use wSTL-NN to represent wSTL formulas as features to classify time series data. STL features are more explainable than those used in classical methods. The wSTL-NN is end-to-end differentiable, which allows learning of wSTL formulas to be done using back-propagation. To reduce the number of weights, we introduce two techniques to sparsify the wSTL-NN. We apply our framework to an occupancy detection time-series dataset to learn a classifier that predicts the occupancy status of an office room.
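As a toy illustration of why such a logic can be made differentiable, the snippet below approximates a weighted "always" operator with a weighted soft-minimum over per-step robustness values, so that larger weights make the corresponding time steps dominate the aggregate. The operator, weights, and signal are invented for illustration and are not the wSTL semantics defined in the paper.

```python
# Weighted soft-minimum as a smooth stand-in for a weighted "always" operator.
import numpy as np

def soft_min(r, w, beta=10.0):
    """Approaches min_i r_i as beta grows; a larger w_i makes step i
    dominate the aggregate robustness more."""
    w = np.asarray(w) / np.sum(w)
    return -np.log(np.sum(w * np.exp(-beta * np.asarray(r)))) / beta

signal = np.array([0.8, 0.6, 0.4, 0.9])          # e.g. per-step occupancy score (made up)
threshold = 0.5
robustness = signal - threshold                  # per-step robustness of "signal > 0.5"

weights = [1.0, 1.0, 4.0, 1.0]                   # emphasise the third time step
print(soft_min(robustness, weights))             # negative: the weighted "always" is violated
```

Because the aggregation is smooth in both the signal and the weights, gradients can flow through it, which is the property that lets formula weights be learned by back-propagation.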
